Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 21, 2019.
Guess which company's not getting initial stock now.
Sapphire will still, they're probably AMD's best board partner lol.
I really should have bought a 1080 instead of a 1070 dammit!!!
Yup, I would still be waiting too, but was forced into an upgrade when my Fury died... The 2080ti has reached 4k 60fps levels reliably, but is a bit Nope! for me at that price lol
Got a great Black Friday deal on an almost 4k ready 2080 - Good enough for now.
Still want one of AMD's Next Gen post GCN cards when they finally get here with their chiplet design - they will do proper 4k
I hear ya. I could never justify buying a 2080Ti. A 2080 is 4K-ready if you just turn off AA. Depending on the size of your display, you probably wouldn't notice AA being off anyway, though, I'm sure some purists here will vehemently disagree.
I too would hope the post-GCN cards will be what we're looking for but knowing their development rate, that's just way too far away, and that includes mature drivers.
On the note of those chiplet-based GPUs, I'd like to see if that breathes new life into mGPU setups, where you can basically just keep adding as many core clusters as you want and it will scale appropriately. So to start with, you could have AIBs where their performance is basically determined by how many core clusters they have and how much VRAM they have. Then, there could be absurd Crossfire setups where you've got PCIe x1 or M.2 cards with only a single core cluster that act as a mini performance boost. I'm not keeping my hopes up that will happen, but I think it'd be pretty cool to be able to incrementally add performance like that.
Yeaaah, AMD can't afford to punish its partners.
They can ask Sapphire to fire the person who leaked it, which is what I suspect will happen.
I don't think we'll be seeing multi-card setups outside of the industrial/research space anymore.
On the other hand, it does mean we could see maybe 16 chiplets (or more) on a single board (with HBM).
I look forward to our new chiplet overlords
I like pulling for AMD, but if they don't push the GPU market forward then I'm likely getting whatever Nvidia puts out on 7nm. Still holding onto my GTX 1070 and my 2K monitor, which is actually doing pretty well. I did have an ATI 9700 Pro and an ATI 5870, which at the time were better than anything Nvidia offered, albeit the 5870's dominance was very short lived. Hopefully next week we can get some details on big Navi or whatever they will call it.
If it comes out for 500 and performs similarly to a 2070 it's dead in the water day one.
I have been holding onto my rx 580 for a while, just to get a NAVI and if it's so slow and pricey and as we know HOT AS HELL, thank you, but no thank you.
Haha, I'm pretty much like you. The last two ATI/AMD GPUs I owned were a 9800 Pro (flashed to XT with a custom cooler) and a 5850. The 9800 Pro flashed to XT remains one of the best cards I ever owned. Like you said, the 5850/5870 price-to-performance "dominance" was short lived, but it was still a very good card and from what I can remember it was not running hot at all.
I own a 1070 too, and while it's okay at 2K 144Hz most of the time, it's a little bit on the not-powerful-enough side. I wish I could upgrade. The 2070 has come down in price and at the current price I could justify it (the launch price was totally ridiculous and I still have not digested it), but just to make a statement with my wallet: if Navi is not good enough I'll wait for the next Nvidia lineup; I won't touch the RTX cards.
Certainly brings some interesting possibilities, really looking forward to seeing what comes. Certainly more cores and more speed, just like the new Ryzen chips we have coming all so Soon ^^
I don't know, perhaps the heavy demands of ray tracing will see a resurgence of multicard setups?
Though the chiplet design, and the reduced cost of no longer needing a monolithic GPU die, may make multi-card setups defunct from a gaming perspective, if a single board delivers enough performance at decent quality.
What you're doing wrong is comparing apples to oranges. Stock to stock, a 1080 is around a 2060. OC both and not much changes, except maybe the 2060 eating less power.
When someone water-cools a GPU to keep noise down rather than for performance's sake, you know there's an interesting story somewhere. So how do you like the R7 vs the 1080 Ti?
Are you in the future? Specifically May 27th, 2021?
That's the only assumption I can come up with given your "5 years old technology" statement. You must be in the future, May 27th 2021 or later, for a GTX 1080 to be 5 years old.
Aftermarket Navi cards that compete against the RTX 2070 are going to be 350-400 US. This is a water cooled and overpriced edition at 500 US. I am not sure why anyone is surprised by the rip off price. With Nvidia's appalling prices, a water cooled RTX 2070 would cost 650 US.
I expect better than this. AMD will either quickly lower pricing post launch, or these prices are for premium Sapphire cards. Think logically: why release products that are dead on arrival? I'm personally expecting at least a hundred euros less for both cards from typical partners like Gigabyte. Some analyses don't consider the market space at all, assuming maximum profit is what every company is after, when AMD cannot achieve that by overpricing.
Let's just take a moment to think about how they are going to do RT via software, after seeing how well Nvidia can do it via hardware.
Technically very true. As we see from Pascal, software RT can't compare.
But I find that RT has lost most of its interest to me. Shopping right now I'm looking at RTX because compared to a used GTX it's lower power and can be had new with warranty*. Compared to anything Radeon it can be had with higher performance and will hold such a lead for a long time.
RT? I'll run some demos but otherwise that almost doesn't matter.
* A used 1080 TI is still an excellent choice, and I'd go for that in a different use case.
Pascal's FP16 performance is far worse than its FP32, so all those operations run in FP32. On the other hand, Vega has 2:1 FP16 to FP32, which makes its FP16 twice as fast as the FP32 of a GTX 1080 Ti. So, if AMD does it through brute force, you can expect a Vega 56 to do about twice as well as a GTX 1080 Ti. (Still not the best, but that Vega 56 is quite a bit cheaper than the cheapest RTX-enabled 2060.)
Check this, and you potentially double the fps of a GTX 1080 Ti.
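Rough back-of-the-envelope math for the claim above (a Python sketch; the peak-throughput figures are approximate spec-sheet numbers I'm assuming, roughly 11.3 TFLOPS FP32 for the 1080 Ti and 10.5 TFLOPS FP32 for Vega 56, not numbers from this thread):

```python
# Back-of-the-envelope peak-throughput comparison (approximate spec-sheet numbers).
# Consumer Pascal runs FP16 far slower than FP32, so FP16 work effectively
# falls back to the FP32 rate there. Vega runs packed FP16 at 2x the FP32 rate.

GTX_1080TI_FP32_TFLOPS = 11.3   # approx. peak FP32 (assumed figure)
VEGA_56_FP32_TFLOPS = 10.5      # approx. peak FP32 (assumed figure)
VEGA_FP16_RATIO = 2.0           # "double rate" packed FP16 on Vega

vega56_fp16 = VEGA_56_FP32_TFLOPS * VEGA_FP16_RATIO  # usable FP16 peak on Vega 56
pascal_best = GTX_1080TI_FP32_TFLOPS                 # FP16 falls back to FP32 speed

print(f"Vega 56 FP16 peak:   {vega56_fp16:.1f} TFLOPS")   # 21.0 TFLOPS
print(f"1080 Ti usable peak: {pascal_best:.1f} TFLOPS")   # 11.3 TFLOPS
print(f"Ratio: {vega56_fp16 / pascal_best:.2f}x")         # 1.86x, i.e. roughly 2x
```

So "about twice as well" holds as a peak-throughput estimate; real-game scaling would of course depend on how much of the RT workload can actually use packed FP16.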