Open-world games always have some bugs. Part of the problem is that some people do stupid things in-game, and mods can introduce bugs of their own.
Well, it wasn't a game with just "some bugs" at launch lol. I completed my first playthrough with my di*k hanging out of my pants. Got two softlocks, which thankfully resolved themselves with more or less finessing and save-loading, but the game was far from bug-free. The Witcher 3 by comparison (even though it crashed my GPU driver for half a year post-launch) was a rock-solid product lol.
I've never encountered these bugs, or anything similar. But I don't do stupid things in-game: for example, I never ride a horse in the city, because it can create mesh conflicts and locking. Riding a horse through crowded areas makes no sense anyway. I also never call Roach when there are a lot of objects around me, because he can get stuck. An open-world environment will always create some bugs, because you can do so many things in it, with many approaches that produce different outcomes. Because of this, it can't be 100% tested.
To be honest, the latter part of the preview makes no sense, claiming ~130 FPS with DLSS 3 but only 20 with it off. DLSS 3 usually achieves slightly below double the framerate, so I don't get the source of the disproportion unless it involves way more gimmicks than what DLSS 3 already does. I'd expect it to be either ~20 FPS without DLSS and about 36-ish with it enabled, or 70-80 FPS without it to reach the claimed ~130 FPS.
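To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. The ~1.8x frame-generation multiplier is my own assumption based on typical DLSS 3 results, not an official figure:

```python
# Frame generation alone tends to land slightly below a 2x multiplier.
FG_MULTIPLIER = 1.8  # assumed typical frame-gen scaling, not an official number

def with_frame_gen(base_fps: float) -> float:
    """Estimate displayed FPS after enabling frame generation alone."""
    return base_fps * FG_MULTIPLIER

print(with_frame_gen(20))   # ~36 FPS -- nowhere near the claimed ~130
print(130 / FG_MULTIPLIER)  # ~72 FPS base would be needed to hit ~130 with FG alone
```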
I think they mean DLSS 3 as the sum of DLSS 2 + FG versus native resolution without DLSS 2 and FG. In that sense those values might be true, and knowing Nvidia, when they talk about DLSS they usually base it on performance mode.
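If that reading is right, the 20 → ~130 gap stops looking impossible, because the upscaling gain stacks multiplicatively with frame generation. A hedged sketch, assuming performance-mode upscaling alone gives roughly 3x at 4K (my guess; it varies by game) on top of the ~1.8x from frame generation:

```python
# Combined "DLSS 3 package" vs. pure native rendering. Both multipliers
# are illustrative assumptions, not Nvidia-published figures.
UPSCALE_MULTIPLIER = 3.0  # assumed DLSS 2 performance mode at 4K (1080p internal)
FG_MULTIPLIER = 1.8       # assumed frame-generation scaling

native_fps = 20
dlss3_package_fps = native_fps * UPSCALE_MULTIPLIER * FG_MULTIPLIER
print(dlss3_package_fps)  # ~108 FPS -- the same ballpark as the claimed ~130
```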
What Nightwalker said. "RTX ON" is all the bells and whistles activated, including DLSS (though it isn't specified whether quality or performance mode, which is a huge difference in itself), frame generation enabled, and so on.
Beautiful, but it's way more sane to buy a PS5 + PSVR2 and wait until GPUs 5-6x as fast as a 4090 become available for 4K native RTRT... I'll probably be planning my retirement by then...
They should not have called it DLSS 3; it's frame generation, boosting FPS artificially by quickly guessing extra frames and inserting them. DLSS 2 renders at a lower resolution and then does some AA tricks to make it look like a higher resolution, because cards can't do it natively other than the money-burning ones, and not even those in some cases. Truly very odd stuff. I mean, I'm pretty sure people here know these things, but sometimes it seems otherwise. So really, you're going to have to have DLSS 2 and DLSS 3 enabled at the same time to get: 1. high FPS for smoothness on your 120Hz+ screen and appropriate input speed, and 2. the ability to fool your eyes into thinking it's native 2K/4K/etc. (see the toy model below). Weird industry.
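For anyone who wants the frame-generation piece spelled out, here's a toy Python model of what's being described: render fewer real frames, then insert guessed in-between frames. Frames are just numbers standing in for timestamps; nothing here resembles actual DLSS internals:

```python
# Toy model of "insert guessed frames" (frame generation). A frame is a
# plain float timestamp here; this is illustration, not real DLSS code.

def rendered_frames(n: int) -> list[float]:
    """The frames the GPU actually renders (after low-res + upscale)."""
    return [float(t) for t in range(n)]

def with_frame_generation(real: list[float]) -> list[float]:
    """Insert one interpolated ('guessed') frame between each real pair."""
    shown: list[float] = []
    for a, b in zip(real, real[1:]):
        shown.append(a)
        shown.append((a + b) / 2)  # synthesized in-between frame
    shown.append(real[-1])
    return shown

real = rendered_frames(5)
print(with_frame_generation(real))
# 9 displayed frames from 5 real ones: ~2x smoother on screen, but input
# is still only sampled on the real frames, hence the input-speed caveat.
```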
I don't see any reason why it technically should be, given that neither Q2RTX nor Portal RTX was. Now, practically, it will probably run on the 3000 series, while 2000-series and AMD cards will just be in slideshow demo mode.
Actually, since it's a performance-costing feature, I can imagine Nvidia "allowing" it to run on older cards too, as long as they have the tensor cores. With crap, slideshow-like performance there, it's actually a selling point for the still-plentiful Ampere and Lovelace generations compared to Turing. That's exactly why they did not want Turing and Ampere owners to get frame generation: it would help with performance and, in the end, cripple sales of anything other than Lovelace. But that's Nvidia... they've built an ecosystem around their GPUs, and they quite aggressively protect newer hardware sales within it.
That's a good point. Nvidia is no longer just fighting off the competition; they are fighting off their own older GPUs to force everyone to upgrade.