Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 13, 2021.
Well, it's certainly a slideshow on RDNA2.
This is without DLSS enabled.
It came with both 8 GB and 4 GB variants, same as the 480 did.
No, the 290X only had 4 GB; the 390X came in 4 GB and 8 GB variants.
Control and Cyberpunk with RT are a slideshow on both, mind you.
Yeah, sure, but DLSS is there, and you get back all the FPS lost to RT, and then some.
If you say 4K DLSS is not 4K, then I can give you some screenshots to prove that 4K DLSS is not merely 4K native: it's better than 4K native (well, with a bit of Sharpen+).
Okay, but not everyone dislikes ray tracing, and it's weird, if you ask me, that you would dislike it. As much as has been said about the performance hit and cost, no one has really questioned that ray tracing looks better, except for this 6800 owner.
I did sell my 5700XT for more than I paid for it. I would even say significantly more, which, yeah, is totally ridiculous.
Obviously the markets are different in different countries. Obviously the 5700XT was not the better card; the 2070 Super was better. But when it comes to value in Canada, the 2070 Super was pretty bad value. I paid $635 after taxes for my GB 5700XT, and looking at the price charts in Canada, an equivalent 2070 Super from the same company was $805 after taxes. That's $170 more, which was ridiculous considering the difference in performance between the two cards. And BTW, I ran exclusively Nvidia for like 15 years.
It's been said a thousand times: a good alternative to the 3070 8 GB would be for AMD to sell the 6800 16 GB without losing 30% ray tracing performance at the same time. Buying a more expensive card with 16 GB of VRAM because Doom with RTX takes over 8 GB of VRAM on a 3070, only to find out the 16 GB card loses to the 3070 in ray tracing, is just idiotic.
The 290 non-X came only with 4 GB; the 290X had both 8 GB and 4 GB versions released.
It does not always look better. The sacrifice is too big; it's a big no for this generation for me. That's just me: maybe most love it, but I don't, simple as that.
Just saying, there are cards in this very generation that can run it pretty well. You just chose one that can't, for that bigger VRAM pool.
I don't hate giving Nvidia credit. They have been the major driving force behind development in the PC video card market for a number of years now. I doubt AMD would have even thought about RT at this point without Nvidia accepting the risk of huge criticism when the RTX 2000 series' RT performance was so weak, especially in the beginning. I still don't like the idea of DLSS (AMD's half-assed response to it even less), but if it makes some people happy, it was a good innovation. What I hate is the American style of dialogue that sees the world in pure ones and zeroes, in 100% contrast black and white. If one company innovates something, it automatically means the other doesn't innovate anything. If one company sets trends, it automatically means the other company can't set even a single trend. That's such an annoying way to express things. In the first place, both Nvidia and AMD would innovate a whole lot less without competition.
You can't patent a trend. Anyone can try to exploit it, after someone has managed to launch it. Of course if the trend is very tightly related to a patent, it can be more difficult for competitors. Apple's phenomenal story in the phone market is a perfect example of it: It created neither the idea of a smartphone nor a touchscreen phone, but it achieved a success greater than any other company (so far) by using those two trends perfectly.
I don't know why it's so hard for you to comprehend that for some, rasterized performance and a bigger memory pool are more important than ray tracing. What's the point of the 3070's superior RT when even next year it will be out of memory even without ray tracing, lol.
How can anyone post something with NO capitals? We're devolving. Come on... don't be 'that' lazy.
P.S. I've a 1060 and REALLY want a 2060. Shame it costs more than my entire PC.
Well, AMD created the whole Mantle (and later DX12) thing, which I thought was pointless; DX11 still runs fine, especially when game engines that were designed for DX11 run like crap on DX12 (UE4, Frostbite). I wouldn't even use the DX12 API in certain games if it weren't required to enable RTX or DLSS.
RT + DLSS + High texture quality still looks better and more immersive than Ultra texture quality without RT.
You would need to zoom in 400% on a still shot to notice Ultra texture quality vs. High.
Testing with Doom Eternal, I couldn't find any difference between the Ultra Nightmare texture pool size and High when pixel peeping.
It's not unimportant to them; they just opted out of it because they can't have both.
IMO 8 GB is not as big a problem as straight-up lacking performance and features.
Just on sheer numbers, there are going to be more games that benefit from higher RT performance than games that will exceed 8 GB while doing so, and by some margin. If you can't have both, IMO take the more reasonable option. The 6800 has 16 GB, but it's marketed as a 1440p card: it has the VRAM for ray tracing but lacks the power. It all points to the 16 GB being an afterthought; if we can't have what Nvidia has, what can we do for specs that could be a selling point? It's going to help, but to what degree is unknown. It's not going to do as much as having features like faster RT performance and DLSS. It's only going to matter once you do exceed that buffer, and when and if that happens... we've had games like CP2077, almost fully ray traced, same for Control and quite a few others, that run RT on 8 GB just fine.
Doom's Ultra Nightmare texture example is an outlier.
It's like those people who said more cores matter most, and then saw their 1st/2nd gen Ryzens and Threadrippers absolutely demolished by the 11400/5600X in games. There has to be a balance.
DLSS is also very divisive. Some people praise it, some hate it. People often use static images to show how good DLSS is, and it's true: it's remarkable, and you'd be hard pressed to tell the difference between native and upscaled. Yet in motion it can fall apart and ruin the experience completely. The smearing and artifacting of details is really noticeable when it happens and can be really distracting.
I still personally do not believe RT is viable yet, particularly if upscaling is still necessary (in before people say DLSS isn't upscaling but image reconstruction... it's still using a lower-resolution base render).
As for Nvidia and their trend-setting, the problem is, their trend would see you locked into a platform, upgrading every generation for a new feature. And this may seem tinfoil-hat to some, but GameWorks, DLSS, and even the driver-based scheduling are planned obsolescence. Take the driver example: with game-ready drivers, Nvidia can easily make a product obsolete by no longer providing game-optimized drivers, without necessarily dropping support altogether (what I like to call the Kepler effect).
Despite this, Nvidia are pushing the envelope of graphics technologies, even if it is in an anti-consumer way.
And then what? Upscales it?
It's just a mocking comment on pedants who can't agree whether it's upscaling or not, and then go into expositions on image reconstruction, interpolation, and every other debate on the details.
I was about to suggest that you copy the DLSS DLL (ver. 2.2.11) into games that support DLSS to reduce the smearing, but you don't have an RTX GPU, so I figured you would just nitpick about DLSS.
"Ruin the experience completely"? LMAO. Yeah, 50% more FPS is really ruining the gaming experience, according to some non-RTX owners.
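For what it's worth, the DLL swap mentioned above is literally just a file copy: back up the game's bundled `nvngx_dlss.dll` and drop the newer one in its place. A minimal sketch, using dummy temp directories and stand-in files since the real game folder and downloaded DLL location differ per setup:

```shell
# Dummy stand-ins made with mktemp; real paths vary per game and install.
GAME_DIR="$(mktemp -d)"   # stand-in for the game's install folder
DL_DIR="$(mktemp -d)"     # stand-in for wherever the newer DLL was downloaded
printf 'dlss 2.1.x\n'  > "$GAME_DIR/nvngx_dlss.dll"   # the game's bundled DLL
printf 'dlss 2.2.11\n' > "$DL_DIR/nvngx_dlss.dll"     # the newer DLL

# Back up the original, then swap in the newer DLL.
cp "$GAME_DIR/nvngx_dlss.dll" "$GAME_DIR/nvngx_dlss.dll.bak"
cp "$DL_DIR/nvngx_dlss.dll" "$GAME_DIR/nvngx_dlss.dll"

cat "$GAME_DIR/nvngx_dlss.dll"
```

Keeping the `.bak` copy means the original DLL can be restored if the game or its anti-cheat objects to the swapped file.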
I would buy an AMD GPU in a heartbeat if they were more reasonably priced for the features that they offer. I bought the R9 290 back then because it was much better than the GTX 780 after I replaced the stock cooler; I still haven't found a comparably good AMD GPU in recent times.
Can you tell which side is 4K DLSS Performance without looking at the OSD? I made some comparisons.