In light of the new technology surrounding ray tracing cards, will the drivers be optimized for all Nvidia cards, or will the older technology be left behind just a bit?
Linus (Linus Sebastian) already talked about this in a YouTube video titled "IS NVIDIA RUINING YOUR PERFORMANCE?" The short answer is no.
The promoted stuff like raytracing and DLAA will obviously be a no-go for pre-Turing. For general performance optimizations, it depends on whether Turing is still similar enough to Maxwell and Pascal to share optimizations. For games, game studios will want to support all cards back to Kepler (and GCN 1.0 for AMD), but Nvidia has an incentive to encourage the use of features that Turing excels at, even if the older archs are bad at them. Nvidia did update Kepler for Shader Model 6.1 last year, and also updated Fermi for WDDM 2.3 last year. I wouldn't be surprised if Kepler soon starts getting only bug fixes, but I wouldn't worry about Maxwell yet.
I think it's highly likely that Paxwell cards will stagnate in performance compared to Turing (and post-Turing) cards over the next couple of years:

a) As it was with Kepler vs Maxwell, the bulk of NV's SW optimization effort (in both drivers and ISV relations) will go into the Turing arch now. There is certainly a lot of untapped performance potential in Turing, and it will be delivered through driver updates over the next couple of years - Paxwell won't benefit from this, of course.

b) Again, similar to how it was with Kepler vs Maxwell, Turing is better suited for complex workloads, even without touching the DXR/RTX stuff. Most of the games NV used to demonstrate Turing's performance gains are actually games which weren't running that well on Pascal (comparatively, of course), which means the performance profile expectation is changing with Turing - it's highly likely that workloads which were bad for Pascal will be good on Turing.

c) DXR/RTX will be used in more and more games over the coming years, and Paxwells will just die in these with DXR features enabled - not much anyone can do about it.
There will be games that support RTX, but I can easily see it being a novelty like PhysX was when it was released. Even more so if AMD lags behind with hardware for DXR. I don't see DXR being the standard unless the next-gen consoles support it with good performance.
Sure, but how many studios will use it in their games if Nvidia is the only vendor that supports it? If the next-gen consoles can't run DXR at playable performance, most studios will stick to rasterization, except for Nvidia-sponsored titles.
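For what it's worth, the fallback path doesn't have to be a studio-level either/or decision: DXR support is queryable at runtime through the standard D3D12 feature-support API, so a game can offer ray-traced effects only where the hardware reports a raytracing tier. A minimal Windows-only sketch (links against d3d12.lib; everything here is from the stock D3D12 headers, not this thread):

```cpp
// Sketch: detect hardware DXR support at startup and fall back to a
// rasterized renderer otherwise. Windows-only; requires a Windows 10
// SDK recent enough to define D3D12_FEATURE_D3D12_OPTIONS5.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Default adapter, minimum feature level required for D3D12.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device; rasterization-only renderer.");
        return 0;
    }

    // DXR capability is reported via the OPTIONS5 feature struct.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::puts("DXR tier 1.0+ reported; ray-traced effects can be enabled.");
    } else {
        std::puts("No hardware DXR; stick to the rasterized path.");
    }
    return 0;
}
```

Pre-Turing cards (and, at the time of this thread, all AMD cards) report no raytracing tier here, which is exactly why DXR ends up as an optional toggle rather than a baseline requirement.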
Most engines won't even support DX12 properly... How many DX12 games out there have significantly gained anything from it?
I dunno, most of them? To answer that question we have to know how hard it will be for them to use it. So far we have reports of RTX effects being added to game engines in a matter of days. The level of support isn't that relevant, and even then you should consider that NV controls the majority of the PC GPU market. As for engines not supporting DX12 properly - you don't know that. That's because writing a proper D3D12 renderer is fcking hard.
By the time games come out that have good RT support, the 2xxx cards will be too slow to play them and you'll need a 3xxx-series GPU.