Most materials in real life have some reflective quality. Going through everything and getting each physically based texture to have the perfect roughness takes a LOT of time. It's just easier for devs to set a small number of roughness levels and apply those to every PBR texture. Which is why, when you compare a game like CP2077 to WDL, you can visually see the difference where the devs were able/allowed to put in the time to get much better reflective properties on most/all objects.
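The shortcut described above can be sketched in a few lines of Python. To be clear, this is a hypothetical illustration, not any engine's actual pipeline; the preset count and values are made up:

```python
# Hypothetical illustration: instead of hand-tuning a unique roughness
# per material, a team picks a handful of preset levels and snaps every
# PBR material's roughness to the nearest one.
PRESET_ROUGHNESS_LEVELS = [0.1, 0.35, 0.6, 0.9]  # made-up values

def snap_roughness(authored: float) -> float:
    """Return the preset level closest to the authored roughness."""
    return min(PRESET_ROUGHNESS_LEVELS, key=lambda lvl: abs(lvl - authored))

# A carefully tuned 0.42 gets flattened to the nearest preset:
print(snap_roughness(0.42))  # → 0.35
```

Cheap to author, but it's exactly why every surface in some games ends up looking like it came off the same four molds.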
We could go back further to when you didn't care about RT at all. Now you're taking a wait-and-see approach, even though you're not going to be able to use it without an upgrade. I've found that many people have gone from not caring to at least being interested in the potential; stances are softening on RT. A few more years and it'll just be a standard option that comes with most triple-A games. Shrug. I'm expecting RT performance on high-end RTX 4000 and RX 7000 series cards to be totally usable up to 4K. I can't wait until it's just another choice instead of a compromise. #StillExcitedForRT
At this stage, if you're an enthusiast gamer, the Radeons are not a good choice, or very future-proof. CP2077 proved that: running it with all the features on is simply not possible with an AMD GPU atm. No fine-wine driver is gonna fix that either.
PBR and normal maps are probably the best way to achieve better, real-life materials. If it takes lots of time, I agree with that as well, because I only do stills and sometimes I literally spend a whole day just tweaking materials/shaders for a simple scene. Thanks, Jura
My opinion has not changed from the start. Even at the very beginning I wrote that raytracing is a bottomless pit into which you can throw any and all performance you desire. It's never enough. What will happen with the next generation, when nVidia doubles raytracing capability? A game released shortly after will add one extra bounce for reflections and one extra shadow sample per pixel, and the performance will be gone again. Gobbling up performance in rasterization requires actual work and more detailed assets (or cheaping out on optimizations). But any current DX-R game can get an update that just changes a numerical value for reflections/refractions/per-pixel rays/... and the performance is gone. And then they'll decide to do something as pointless as accurate raytraced depth of field. Funny times ahead. Things won't get interesting until there's something like 8x the raytracing performance of a 3080 available at $400~500. And at that point, the movie industry will be all over those cards.
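The "one extra bounce eats the new headroom" point above is just back-of-the-envelope arithmetic. Here's a sketch of it; the resolution, sample counts, and the simple linear cost model are assumptions for illustration, not measurements of any real renderer:

```python
# Back-of-the-envelope: rays traced per frame grow roughly linearly with
# samples per pixel and with bounce depth, so a hardware doubling can be
# consumed by nudging either knob.
def rays_per_frame(width, height, samples_per_pixel, bounces):
    # crude model: each camera ray spawns one secondary ray per bounce
    return width * height * samples_per_pixel * (1 + bounces)

today = rays_per_frame(3840, 2160, samples_per_pixel=1, bounces=1)
patched = rays_per_frame(3840, 2160, samples_per_pixel=2, bounces=2)  # post-update

# The patched workload is 3x the old one -- more than a 2x-faster GPU covers.
print(patched / today)  # → 3.0
```

So a driver-day doubling of RT throughput vanishes behind a single quality-slider patch.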
RT in Cyberpunk on a 3060Ti is already interesting, no need to wait for several years (although you might have to wait for several months for the card to actually be in stock at normal prices!).
But scalability is also an advantage of RT that no other effect has. In years to come we could boot up our favourite games from "back in the day" and have them fully realised with RT, or even higher-quality RT. My dream (which I've posted here years ago) is a port of a film to a playable game, all using the same assets. I'm sure my example film was Toy Story. We are closer than ever to that being possible.
Ugly kids in video games were something that got accidentally accomplished years ago, though, and they've been recurring ever since due to shortcuts in development and asset re-use. As to the rest of it, yeah, there's a ways to go, even if the assets in that first movie are showing their age.
This article actually makes me feel OK about dropping the $ on the EVGA RTX 3090 XC3 Ultra Hybrid Gaming card I just got last Thursday. Since I run at 3840x1600, I can see that the 3090 would be the right choice even for non-RT games. Not to mention my LG 38GL950G monitor has a G-Sync module in it, so it makes sense in both areas to go with the 3090.