Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 13, 2021.
Radeon VII with its 16GB HBM2 certainly aged well in mining farms
Except its power efficiency while doing so is terrible.
Certainly. A Radeon VII costs as much as a 6900 XT these days.
For gaming the Radeon VII is pretty much a dead card; even driver support was abandoned. Meanwhile, even a Maxwell GPU can still play all the latest AAA games without problems, albeit at low settings.
So yeah, AMD GPUs and "longevity" don't go together.
The Radeon VII still gets the latest drivers, I don't know what you're talking about, and it still outperforms the GTX 1080 as well as Vega 64.
Indeed, the Radeon VII is a bloody beast of a card; it's got that HBM goodness.
ASRock RX 6600 XT Phantom Gaming D 8GB OC review - DX11: Far Cry New Dawn (guru3d.com)
Not bad for a Vega; it even beats the 1080 Ti in some games.
It's also incredible at mining, but like I mentioned, it's very power hungry while doing so, unlike the newer cards.
What an incredible lifespan it's had, though.
Radeon VII: released Feb 2019, EOL Aug 2019, MSRP $700
GTX 1080 Ti: released March 2017, MSRP $700
GTX 1080: released May 2016, MSRP $600
The Radeon VII has been so buggy since launch that even a die-hard AMD fan like AdoredTV ripped AMD a new one for terrible driver support. Not sure if AMD has even fixed any bugs since then.
Nvidia tried to set a trend where everybody must buy an Nvidia card and pay 150 euros extra for a screen to get adaptive synchronization. That trend failed, and now Nvidia also supports the more hardware-agnostic approach, the one pioneered by AMD.
Huh, Variable Refresh Rate is a VESA standard that came out after G-Sync, so it was definitely not "pioneered" by AMD LMAO. Also, G-Sync monitors are capable of ULMB, which was not possible on a VRR monitor without the G-Sync module until now.
The fact that almost everyone now owns a VRR monitor means that Nvidia is a trend setter LOL, even if their own solution cost a little too much for people's taste. Next up is real-time RT and AI upscaling, which AMD and Intel will also have to copy.
FreeSync was AMD's competitor to G-Sync, and it's much closer to VESA Adaptive-Sync. It doesn't really matter whether G-Sync appeared sooner or not; it lost the race, so Nvidia didn't manage to set a trend. Before Nvidia bowed its head and adopted non-G-Sync adaptive sync, it was funny to read ad brochures: plenty of screens advertised AMD FreeSync and only one or two advertised G-Sync, yet Nvidia's GPU market share was 75-80%. Nvidia hit itself in the knee with an axe by charging an arm and a leg for the G-Sync module/license. Well, you are of course free to have your own opinion on the matter. I won't change mine, though.
The 290X had 4GB; that's what people forget.
When I bought a 2716DG (1440p 144Hz with ULMB) together with a 980 Ti in July 2015, how long did it take AMD to give me the same options?
A card capable of overclocked 980 Ti performance wasn't even available until Vega 56 in 2017, and neither was strobing on the first FreeSync monitors; and when it did become an option, you still couldn't find adaptive v-sync in the drivers. Not to mention the Hz range was a joke: a 144Hz monitor capped at 90Hz in FreeSync mode.
That is the part of the story that often gets conveniently forgotten.
At the time I was playing Dying Light at 120fps 1440p with ULMB on; all AMD owners could do was get frustrated with their Fury X's.
The tables have turned now and no one needs the module, but it took years.
What a world we live in, where copycats get the credit. Chinese companies must be living in joy knowing people appreciate their efforts to steal Western IP.
It was inevitable that AMD would get all the cool features themselves.
So as much as people deny that Nvidia sets trends, and that includes me being sceptical about that statement, it's true they try to push a lot of cool stuff first.
The case of G-Sync/ULMB 1440p 144Hz displays, with a $650 beast of a card to push them, is probably the best example. It took AMD two years to catch up.
Same with RTX: even now, current RX 6000 cards can't provide the same experience, since they lack not only RT performance but also a DLSS 2.0 equivalent that helps specifically in ray-traced games.
GameWorks was hit or miss, but I still enjoy some of those features a lot, and if I don't, I just turn them off. VXAO in RotR was probably the best-looking AO implementation until ray-traced AO became a thing. So was HFTS in WD2; it looked incredible. I'm playing Mirror's Edge Catalyst atm, and Ansel is so much fun that at times I prefer exploring the environment to actually playing.
So as much as people hate to give Nvidia credit, not giving it to them for some of the things they tried (with varying degrees of success) is just denial. I recently watched HUB's review of the 2060S vs the 5700, and they completely shrugged off RT and DLSS, and the fact that one card is missing a lot of modern features entirely. They did, however, cover RT in their 6700 XT review. Pretty ironic, considering that not only are the 2060S and 6700 XT similar in RT performance, the 2060S has DLSS available for almost every ray-traced game there is and will provide an overall smoother experience thanks to it.
So shrug Nvidia's tech off all you want if you don't like them personally, but that has a name: it's called living in a bubble.
Yeah, I bought the Asus PG278Q, which was one of the first 1440p 144Hz screens, back in 2014. I was totally owning in games like Dota 2 and CS:GO (top 1% ranked).
Idk why HUB became so obsessed with the money aspect; they were more enthusiast-like back then, but now all they care about is price/perf, which is funny because they're not interested in the best price/perf GPU anyway (that would be the RX 570 LOL). Looks like a double standard to me.
Also, HUB is blind AF; they were dissing DLSS 2.0 before, saying it's only available in a few games, but everyone else saw the huge potential of DLSS 2.0 (which is pretty much required for 4K 120Hz gaming).
I don't like ray tracing. For me, 1440p raster performance is the main selling factor, and there the 6800 is closer to the 3080. And 8GB of VRAM is a big no-no.
When I got my 6800, the price was 1:1 with the 3070, so for me it was a no-brainer. But I guess it's a matter of preference, too.
Yeah, let's just disable RT and dial Texture Quality to 11, so I can feel good about my 16GB VRAM GPU.
Can I interest you in a blind test between Ultra and High Texture Quality?
Textures don't take a performance hit; you only need enough VRAM. I see you'd rather enjoy the RT slideshow.