Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 14, 2018.
DICE DOWNGRADED ray tracing. Imagine if they did not.
Those are no ordinary puddles, they are ray traced puddles!!!
Why do we need to imagine it? It would be what they showed at the demo earlier this year lol
Tessellation did the exact same thing when it launched, and it didn't even come with a "this visual effect is amazing" payoff. Nowadays the Extreme Tessellation setting in Unigine is nothing to modern hardware, and even higher levels are used throughout games.
I really fail to see how this is different.
Playing on 3440x1440 here too
Any word on when we get DLSS? That's really the only RTX feature I'm looking forward to.
I'm sure performance will improve over time. It is a BF game after all. Every game since BF3 has run terribly at launch, so hopefully they will work the bugs out.
Back then though, AMD was called out for its inferior tessellation performance compared to Nvidia. And today, you can see the complaints about Hairworks destroying performance, so it's not just raytracing that's getting complaints.
IMO, years of consoles setting the baseline have made gamers expect to always be able to max out a new game on the latest hardware. And gamers who have gotten a 4K monitor don't want to have to drop to 1080p just for a single graphical feature.
Wonder if DLSS could come in and save the day for performance at a later date.
But yeah, this is just the beginning of what will hopefully become standard in years to come. This needs to be a thing even if the performance isn't there right now; standard rasterisation has unfortunately reached its peak.
What would the results be when enabling DXR on GTX 1060 series and above cards?
In some aspects, GeForce RTX resembles AMD's Radeon HD 5000 Series when the first DX11 tessellation benchmarks were published -- an impressive visual gimmick with a hefty performance drag. The difference back then was that the price tag wasn't asking for a kidney.
The 2070 is capable of DXR and doesn't cost a kidney, and more cost-effective DXR solutions can be built by competitor companies. I fail to see how the price plays a role in evaluating the long-term implications of the technology.
Also, the early DX11 tessellation benchmarks were sub-30 fps. In games like AVP, tessellation was limited to a few assets and only minimally improved object quality. This technology is completely revamping entire sections of the lighting model to more correctly simulate the way light behaves.
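For anyone wondering what "simulating the way light behaves" actually means in code: the core operation of any ray tracer, whether hardware DXR or a software path, is testing rays against scene geometry. A minimal illustrative sketch (a toy ray-sphere intersection in Python, not DICE's or Nvidia's actual implementation):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t
    # (a quadratic in t); the smallest positive root is the hit distance.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0 else None  # hit must be in front of the origin

# A ray from the origin along +z hits a sphere at (0,0,5), radius 1, at t=4
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # → 4.0
```

Real engines fire millions of these tests per frame against triangle meshes (via acceleration structures like BVHs), which is exactly the workload the RT cores are built to speed up.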
Edit: I think the difference here is that I'm evaluating DXR as a technology, and everyone else seems to just conflate it with Nvidia's shitty 2000 series launch. DXR and hybrid ray tracing as a whole are part of Microsoft's DX12 standard (and Vulkan), and they're here to stay.
SLI does not work with DX12 in BF5.
Would be nice if one of the most hyped DX12 features would actually be supported by game developers.
Please, gaming and ray tracing in the same sentence? We are spoiled.
I meant for anything at all (denoising is just one of many things you can use it for). That's a very powerful core sitting idle if it's not used.
May we have screenshots of the same area, aiming at the same spot (same viewport), at different settings?
Would be nice to have a screen-space reflections vs. RTX Low/Med/High comparison.
Also, I don't get how the fps are lower - it's a separate core, you can offload tasks to it. If anything, fps should go up from having less to do on the shader cores. I want more information on tensor core utilization as well. Apparently, at this point it's just * code.
It's a powerful core, but it's only useful for low-precision matrix operations. Not very many things related to gaming can take advantage of that. Perhaps that might change in the future as people learn new methods, but as it currently stands, not much is going to get done on them. BF5 doesn't utilize them at all, not even for denoising the raytraced output.
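For context on "low-precision matrix operations": tensor cores multiply small FP16 matrices and accumulate the results in FP32. A rough CPU-side sketch of that numeric pattern with NumPy (the 4x4 tile size matches Turing's tensor core granularity; this is only an illustration of the math, not the hardware path):

```python
import numpy as np

# FP16 inputs, as a tensor core would consume them
a = np.arange(16, dtype=np.float16).reshape(4, 4)
b = np.eye(4, dtype=np.float16)

# Multiply-accumulate at FP32 precision, mimicking the
# tensor core's mixed-precision (FP16 in, FP32 out) design
c = a.astype(np.float32) @ b.astype(np.float32)

print(c.dtype, c.shape)  # float32 (4, 4)
```

The catch for gaming workloads is exactly what the post says: very few rendering tasks reduce to dense low-precision matrix math, which is why denoising and inference-style workloads (like DLSS) are the main candidates.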
wow look at the fps penalties