Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 28, 2020.
Wow, I definitely did not see THIS coming. Well done, AMD!
The RTX 3070 doesn't look so good anymore with its shy 8 GB of RAM. It was worth the wait.
So the people who had been predicting the top card would come close to the 3090 were right. I expected them to match the 3080 at best. Well done, AMD.
Honestly, I didn't think I would see this happen.
Vega FE $999 to 6900 XT $999
Vega 64 LC to 6800 XT
Vega 64 to 6800
Prices are almost the same as ever.
We lost the GPU class from Vega 56 and below (Polaris).
Plus Smart Access Memory, so those slides are mostly marketing.
Even the 6700 XT will have 12 GB. I wonder when they will announce it.
For me it's another rip-off.
They followed the jackals (Intel & Nvidia) with the same practices and prices.
I don't care if they're at the same level or slightly better.
And again, and again: 8 GB is enough for 4K. There is not a single game that needs more. Even 10 GB is enough for many games at 8K; only a few very recent games can push past that at 8K.
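For what it's worth, some back-of-the-envelope math on what raw render targets actually cost at these resolutions. This is only an illustration with assumed parameters (4 bytes per pixel, triple buffering); real games spend most of their VRAM on textures, geometry, and intermediate buffers, so this is a floor, not a budget.

```python
# Back-of-the-envelope framebuffer math (illustration only; the
# bytes-per-pixel and buffer-count values are assumptions, and real
# games use far more VRAM for textures and intermediate targets).

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Size in MiB of a triple-buffered color target at a resolution."""
    return width * height * bytes_per_pixel * buffers / 2**20

print(f"4K: ~{framebuffer_mb(3840, 2160):.0f} MB")  # roughly 95 MB
print(f"8K: ~{framebuffer_mb(7680, 4320):.0f} MB")  # roughly 380 MB
```

So the swapchain itself is a small slice of 8 or 10 GB; it's everything else a game streams in that decides whether the pool is enough.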
Can't wait for the reviews. If AMD's RT performance isn't up to par then that makes this a less compelling offering, especially without a DLSS alternative.
Bit of a blow to all the dicks who used bots to get hold of all those Nvidia cards...must be fun to behold that AMD launch when you're sitting on dozens of 3090s you've got listed on the bay at near $2.5k...or even pushing $4k for a 3090 Strix...with 3080s at over $1.5k...not much respect for the 'entrepreneurial initiative' displayed there...
Of course, there's no indication the same won't happen with AMD as well...
You can't please everyone, haha.
They announced a denoiser, but that's it; they didn't say a word beyond that.
Have you ever thought that AMD has to survive? Should they sell to you at a loss and stop their evolution?
So instead of using expensive GDDR6X VRAM, they used cheaper and slower GDDR6 with an extra VRAM caching system, similar to a CPU's cache structure? There is a difference, though: most CPU code blocks will fit in the small cache. I'm not sure how caching the VRAM helps if a large framebuffer needs to be read continually every frame. It must help a bit, but there would still be far fewer cache hits and many more misses than in the comparable CPU situation.
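The intuition above can be put into a simple weighted model: if some fraction of memory accesses hit a fast on-die cache and the rest fall through to GDDR6, effective bandwidth sits between the two. A minimal sketch, where both bandwidth figures and hit rates are made-up assumptions for illustration, not AMD's published numbers:

```python
# Hypothetical model: blend on-die cache and GDDR6 bandwidth by hit rate.
# All numeric figures below are assumptions for the sketch.

def effective_bandwidth(cache_bw_gbs, dram_bw_gbs, hit_rate):
    """Average bandwidth when hit_rate of accesses are served by the
    cache and the remainder go to DRAM (simple weighted average)."""
    return hit_rate * cache_bw_gbs + (1.0 - hit_rate) * dram_bw_gbs

cache_bw = 1600.0  # GB/s, assumed on-die cache bandwidth
dram_bw = 512.0    # GB/s, assumed GDDR6 bandwidth
for hit in (0.0, 0.5, 0.75):
    avg = effective_bandwidth(cache_bw, dram_bw, hit)
    print(f"hit rate {hit:.0%}: ~{avg:.0f} GB/s")
```

The point being: even a modest hit rate lifts the average well above raw GDDR6, which is presumably the bet; whether real rendering workloads achieve such hit rates is exactly what reviews will have to show.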
Anyway, the 6900 XT looks like a good kick at Nvidia and their $1,500 suckers' card, as I said previously.
Most likely RT performance will be worse... but as we know, future RT console games will be based on AMD hardware, so they won't push much beyond that, and any PC-based AMD GPU will most likely be sufficient.
Looking forward to reviews.
This, and the 3070 isn't even aimed at native 4K.
Interestingly, looking at the 6900 XT slides, AMD shows Battlefield V at 122 fps, above the RTX 3090, but the review here shows the 3090 at 124 fps, while AMD shows it around 110 fps, so it's a little interesting why theirs is so much lower.
Again, for Borderlands 3 AMD shows the 3090 at 78 fps and the 6900 XT at 73 fps, while the results here show otherwise.
This is why we should not jump on the hype from what they show. This isn't a dig at AMD, and I'm no fanboy for Nvidia, btw; I'm just showing that results are not always as they seem and need to be taken with a grain of salt. Third-party results should hopefully show a bigger picture.
Just take a second to appreciate this quote from the presentation:
Wake me up when someone finally presents a sub-US$300 card.....
And no, "go buy a 3yo model" is not an answer, tech industry (used to) work like this: "If u want my money, offer me something better for the same money. Otherwise I'll keep my current hard."
And again, no, "old cards are enough for 1080p gaming" is not an answer either. Resolution is only half of the equation; the other half is graphical detail. The damn consoles have kept that variable static for too long, but next year the bar will rise, a lot (and then stay static for another 3 years....).
Yeah, AMD is so stupid giving that 16 gigs for nothing, haha; obviously their engineers suck at what they are doing and Nvidia is smart.