Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 14, 2018.
BFV running @ Ultra 1080p with RTX on in a live multiplayer match achieving 90+fps with RTX 2080Ti
Think of it from a tech-maturity, hardware-suitability and time-invested point of view.
At this point in time devs have a myriad of rendering tricks up their sleeves.
They have had all the time in the world to learn the ins and outs of traditional rendering and have become very proficient at faking physically correct images.
The hardware itself is tuned to accomplish these tasks.
Good result, but note that his GPU runs deeply overclocked at ~2100 MHz, and his 'ultra' settings are not actually Ultra but custom; he tweaked a thing or two below that fold.
We finally got there in the end.
I completed the first war story with RTX on. It looks fantastic but is very buggy: the grass flickers, and the first light you come upon has a cutoff point that literally cuts off your shadow where it's at.
But really, buggy or not, it looks amazing. I didn't even turn on my fps counter... it was fully playable and very smooth.
It's similar to how PhysX can impose a significant performance penalty. You're 'offloading' processing to a separate processor, but that work needs to stay synced with the rest of the frame information being rendered by the GPU core. The RT effects are synced with the GPU-generated frames, and that synchronization is what slows the frame rate. It's a simplification, but it explains the concept.
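The idea above can be put in numbers with a toy model: even if the RT work runs on separate hardware in parallel, the frame can only be presented once both passes finish, so the slower pass sets the pace. A minimal sketch (the millisecond timings here are made-up illustration values, not measurements from BFV or any real GPU):

```python
# Toy model of why synced hybrid rendering costs frame rate.
# Timings are invented for illustration only.

def frame_time_ms(raster_ms, rt_ms):
    """The frame presents only when BOTH passes are done, so the
    slower pass dictates frame time even though they run in parallel."""
    return max(raster_ms, rt_ms)

def fps(frame_ms):
    """Convert a frame time in milliseconds to frames per second."""
    return 1000.0 / frame_ms

raster_only = fps(frame_time_ms(8.0, 0.0))   # pure raster: 125 fps
hybrid      = fps(frame_time_ms(8.0, 14.0))  # raster + RT pass: ~71 fps

print(f"raster only: {raster_only:.0f} fps, hybrid: {hybrid:.0f} fps")
```

The point is that the raster hardware being fast doesn't help once the presented frame has to wait on the ray-traced effects.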
18fps on a 2080! (watch from 9:45)
The water reflections are impressive, but I'd much rather be able to play at higher resolutions personally.
This game only uses RTX for reflections, not lighting, so it shouldn't be a problem.
There is no global illumination or shadow RTX applied.
Not a bad first start. Still, it looks like 7nm cards will probably make ray tracing a lot more practical, and the third ray-tracing generation should make it mainstream.
I've seen comparison screenshots between ray tracing on Ultra and ray tracing on Low at some other review sites, and there's hardly any difference in perceived quality - I can't see the difference in screenshots - so it makes sense just to turn ray tracing to Low to gain the most fps possible. It's still a massive performance hit, close to a 50% loss, and even with a 2080 Ti at 1080p and ray tracing on Low you're only getting 80-odd fps. So it's a no-go for anyone playing the game competitively on a 144 Hz screen, but it's OK for 60 Hz screens or people with G-Sync who aren't playing competitively. It's good to see the first ray-traced game out there now. It's got its limitations, but it's a start!
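The 144 Hz point above is easy to see in frame-time terms. Taking the rough figure from the post (80-odd fps with RT on Low) and converting both it and the refresh rates into per-frame budgets:

```python
# Frame-time arithmetic behind the "no-go for 144 Hz" claim.
# The 80 fps figure is the rough number quoted in the post, not a benchmark of mine.

def frame_budget_ms(refresh_hz):
    """Milliseconds available per frame to keep up with a given refresh rate."""
    return 1000.0 / refresh_hz

def frame_time_ms(fps):
    """Milliseconds actually spent per frame at a given fps."""
    return 1000.0 / fps

print(f"144 Hz budget: {frame_budget_ms(144):.2f} ms/frame")  # ~6.94 ms
print(f"80 fps actual: {frame_time_ms(80):.2f} ms/frame")     # 12.50 ms
print(f"60 Hz budget:  {frame_budget_ms(60):.2f} ms/frame")   # ~16.67 ms
```

At 80 fps each frame takes nearly twice the 144 Hz budget, but fits comfortably inside the 60 Hz one, which is exactly the split described above.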
You want to know the difference?
GeForce GTX 480 : 480 SP, 384-bit, 250W TDP, US$ 449~499
GeForce GTX 470 : 448 SP, 320-bit, 225W TDP, US$ 299~349
And this is how it has been... expressed over time as 'can it play Crysis?'. Nothing has changed: resolutions increase and rendering power requirements increase. The different rendering techniques are just quibbling around the edges of the basic paradigm. Other than the base metric of raw FPS, image fidelity and effects and all that encompasses are subjective. If I were a twitch shooter player, I'd be fine with lower-res graphics and effects for fewer distractions and higher FPS. If I'm an MMO casual, I have an entirely different view of what is 'best'. There's no one thing that fits all.
The major question is: does 1080p rendering with RTX feel better than 4K without it? I know that's subjective, but I want to hear your answers.
Well I’d personally feel a little ripped off if I had to give up 4K for 1080p. Heck I game at 1440, I don’t even want to go down to 1080.
I have a 1080p display and I rarely use native resolution; I always opt for DSR at 2880×1620.
I'd say that for this specific title since it is only reflections, that the loss of image fidelity for the effects would not be worth it, if you can play at Native 4K. If you're faking it with DSR or other techniques and your native resolution is 1080p, then perhaps it is.
Yes, GPU prices fluctuate. Taking inflation into account, you'll remember (...unless you're too young) that the 8800 Ultra, if released today, would be a $1011 card, and guess what? It didn't offer any revolutionary technology that all games would be moving over to in the future.
Was it fast? Absolutely. Was it worth it? Depends on your opinion. But all it did was the same thing the previous generation of GPUs did, just faster, rather than actually trying to bring to gaming one of the major technologies game developers have been trying to get into their games for the last 10+ years.
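For anyone curious where a "$1011 today" figure comes from, the usual method is a simple CPI ratio. A quick sketch, with the caveat that the CPI values below are approximate US annual averages I'm assuming, and the exact result depends on which index and month you pick (which is likely why it lands near, not exactly on, $1011):

```python
# Sketch of the inflation adjustment behind the "$1011 today" figure.
# CPI values are approximate assumptions (US CPI-U annual averages),
# so the result is a ballpark, not the poster's exact number.

LAUNCH_PRICE_2007 = 829.0   # GeForce 8800 Ultra MSRP at its May 2007 launch
CPI_2007 = 207.3            # approx. US CPI-U annual average, 2007
CPI_2018 = 251.1            # approx. US CPI-U annual average, 2018

adjusted = LAUNCH_PRICE_2007 * (CPI_2018 / CPI_2007)
print(f"${adjusted:.0f} in 2018 dollars")  # lands around $1000, same ballpark as $1011
```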
Want to be mad at Nvidia? Be mad at every time they charged a decently large amount of money for SMALL chips, or be mad at them for the Titan V. But all this hatred and anger because the first cards that can actually do ray tracing, built on pretty gigantic chips, cost more? That doesn't make sense. Learn to pick your battles.
Stop presenting them with facts. You're ruining everyone's desperate attempt to bash these cards.