Discussion in 'Frontpage news' started by Clawedge, Mar 18, 2019.
Just the first paragraph!
"It just works" *shrugs* - who knew? *ynark ynark*
The Nvidia RTX does reverse ray tracing in the first place, so it doesn't match reality. That's already one extra way to do it — a smart way, of course. It's ridiculous to say there can be nothing between Crytek's rudimentary implementation and Nvidia's implementation that cripples a $1,300 card. Of course it's largely up to the game studios and game engine makers, but Nvidia has always been willing to offer plenty of help to those. It's not like even the current few RTX-enabled games would be out with the functionality if Nvidia's folks hadn't been providing info and help to the game makers.
As it is now, Nvidia managed to create an image in people's minds that even the top-of-the-line RTX card can't manage 60 fps at 1080p, because the first game turned out like that initially. Of course the game does manage it now, but it was still a disaster.
I'll have to take it as a compliment if you don't consider it a big deal.
Enabling DXR in Pascal isn't about Pascal - it's about Turing. Nvidia wants to give Pascal owners a taste of ray-tracing so that they will be encouraged to buy Turing for faster ray-tracing performance.
It's a testament to the slow adoption and low sales of RTX.
Every ray tracer reverses real life; it's actually called path tracing rather than ray tracing. But even reversed, the RT cores still work all the same, because the implementation is 100% in the developer's hands. All the RT cores do is calculate intersections between rays and the world (boxes and polygons). So NVIDIA doesn't reverse reality, developers do. The RT cores don't care: in the end, every RT algorithm uses intersections between rays and the world, and that's what the RT cores compute.
Crytek? They are just doing lower-res RT reflections than DICE, that's it. The RT cores will still accelerate them like any other RT method, because the RT cores perform an atomic arithmetic operation, not a full algorithm. It doesn't matter what algorithm the developer uses; the RT cores will help them. It's like saying "do better multiplication" — you can't. The developer writes the algorithm; the RT cores just do the "multiplication". You can also do other things with the RT cores, like collision detection or bouncing sound off surfaces.
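To make the "multiplication" analogy concrete, here is a minimal sketch in Python of the kind of fixed-function check an RT core accelerates: a slab test for whether a ray hits an axis-aligned box, the basic operation used while walking a BVH. This is illustrative only — the function name and layout are hypothetical, not NVIDIA's actual hardware interface — and the surrounding algorithm (reflections, shadows, audio occlusion) stays entirely in the developer's hands:

```python
def ray_aabb_hit(origin, inv_dir, box_min, box_max):
    """Slab test: does a ray (origin + t * dir, t >= 0) hit an AABB?

    inv_dir holds 1/d per axis; degenerate (zero) direction components
    are omitted here for brevity.
    """
    t_near, t_far = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1, t2 = (lo - o) * inv, (hi - o) * inv
        if t1 > t2:
            t1, t2 = t2, t1          # keep entry/exit ordered per axis
        t_near, t_far = max(t_near, t1), min(t_far, t2)
    return t_near <= t_far           # ray overlaps all three slabs

# A camera ray shot "in reverse" (from the eye into the scene):
origin = (0.0, 0.0, 0.0)
direction = (1.0, 1.0, 1.0)
inv_dir = tuple(1.0 / d for d in direction)
print(ray_aabb_hit(origin, inv_dir, (4.0, 4.0, 4.0), (6.0, 6.0, 6.0)))  # → True
```

Whatever the developer builds on top — DICE-style reflections or Crytek-style lower-res ones — ultimately bottoms out in millions of tests like this one.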
There isn't a $1,300 NVIDIA implementation; one is software and one is hardware. NVIDIA just makes a core that accelerates the basics; they don't implement anything. It's not HairWorks. What you mean is that there is the DICE implementation and the Crytek implementation. Both can run on a Vega 56, and both can be accelerated by an RT core.
Yup, you are right. They shouldn't have told partners only two weeks before the announcement that they were making an RT card. Today a 2080 Ti can run RT games 2x faster than at launch, and yes, it was bad PR. But the important part is that developers will get better at RT effects. Look at Control: it has 4 RT effects, while every game until now had only 1. Like shaders with the GeForce 3, hardware will get better at running it and developers will get better at using it.
Yup, it means more games with RT features and a taste for AMD and GTX 1000-series users. If more games have RT effects and AMD/1000-series gamers taste them at 1080p/30 fps, they might shell out the cash for an RTX card to play with better performance. It's a win-win for NVIDIA: they get more content while advertising their RTX series.
Sounds good to me, let the new tech become mainstream. Now AMD and Nvidia are both in the same camp on ray tracing, one way or the other. Hopefully game devs will utilize these techniques for advancement and new eye candy in games. I'm all in for future games fully supporting ray tracing.
Do yourself a favour and look at the column on the far right of the image.
Makes the most sense of anything I've read in here.
Also, that's why Nvidia didn't feature any new GPU at GTC; it simply doesn't help their business plan.
That being said, I'm still not sure why they're releasing RTX on other GPUs... people still won't suddenly jump up from their chairs and run to the nearest retailer for a 2080 Ti if they haven't done so already.
Well, this is a pretty logical move, I guess.
With Crytek also making ray-tracing tech available for all GPUs, Nvidia is now supporting it on older cards simply because they want people using ray tracing more and more.
Eventually the RT cores will deliver a clear enough improvement (either visual or fps) that you can't get without a GPU that has RT cores.
So rather than killing RTX-series sales by supporting older cards, as some people think, Nvidia is creating an opportunity for Turing as more games use ray tracing, because in terms of performance that's what sets it apart from Pascal.
There are a lot of game devs using Pascal GPUs; letting them test ray-tracing code (even if it doesn't run fast enough) is a great way to improve DXR adoption in future games.
Yes, that's true, we shall see if it pulls them in... one thing though, RTX/DXR only runs under DX12 (Win10 1809) AFAIK, so the devs playing around with it is like playing around with DX12... we aren't there yet, again. Sadly.
One other thing I just wondered: if RTX trickles down and makes any sense on the lower-end cards in terms of showing something playable, that could indeed be a feature added to the 1660 Ti... hard to be sure about that, though.
Nvidia's Turing architecture includes a reworked pipeline that allows for concurrent FP (floating-point) and INT (integer) execution.
In the past, a lot of graphics work relied purely on FP calculations, but INT calculations are becoming increasingly common, particularly with ray tracing algorithms.
Depending on the game and the DXR implementation, Nvidia's data shows that the Turing architecture without its RT cores (which is what you'll get in the GTX 1660 Ti and GTX 1660) is still better than Pascal.
And depending on how much ray tracing is being done, Turing RTX 2080 may be anywhere from 1.3x-4.0x faster than a Pascal GTX 1080 Ti (and 1.6x-5.0x faster if DLSS OFF).
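To see why ray tracing leans on INT throughput more than classic shading does, here is a toy sketch of a BVH-style traversal in Python, with comments marking which operations are integer work (node indices, stack bookkeeping) versus floating-point work (the interval tests). The node layout and numbers are hypothetical and purely illustrative; on Turing the two kinds of work can issue concurrently, while on Pascal they contend for the same pipeline:

```python
# Toy tree: (t_min, t_max, left_child, right_child); -1 children = leaf.
nodes = [
    (0.0, 10.0, 1, 2),
    (0.0, 4.0, -1, -1),
    (4.0, 10.0, 3, 4),
    (4.0, 7.0, -1, -1),
    (7.0, 10.0, -1, -1),
]

def traverse(ray_t):
    """Collect leaves whose [t_min, t_max) interval contains ray_t."""
    hits, stack = [], [0]                   # INT: stack of node indices
    while stack:
        i = stack.pop()                     # INT: index bookkeeping
        t_min, t_max, left, right = nodes[i]
        if not (t_min <= ray_t < t_max):    # FP: interval test
            continue
        if left < 0:                        # INT: leaf check
            hits.append(i)
        else:
            stack.extend((left, right))     # INT: push children
    return hits

print(traverse(5.0))  # → [3]
```

Roughly half the operations in the inner loop above are integer arithmetic, which is why an architecture that can run INT alongside FP helps even without dedicated RT cores.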
Well, this is surprising but good news. Definitely looking forward to seeing how the 16 series will benefit from this.
Did you finally go with the GTX 1660 Ventus?
Ray tracing coming soon.
Yep. I'm hoping these new drivers unlock some of Turing's power.
This would be of little use on Pascal, as it still wouldn't be able to do ray tracing even remotely well; it would only really be useful on an RTX card and, to a lesser extent, a 1660-series card.
A dedicated ray-tracing card could actually be quite useful if it were 2-4 times larger and more powerful than what is in a 2080 Ti, but I'm hoping for the technology itself to get better, then to offload it to another card, or even another GPU.
Someone didn't read the article and is drinking the Kool-Aid...
I'm really starting to question your ability to read and comprehend, as the graph he posted explained exactly what you're saying it didn't. Here, I'll post it again, and if you can't understand it... I'm not even sure what to say to that, as that's just... wow.
You state $300 in 2000 is not the same as $300 in 2016, which, yes, is obvious, as it's stated in the image — it clearly shows that. Granted, it shows 2017 inflation rather than 2016, but that isn't really a point: 2016 or 2017 doesn't matter. What matters is that they are all compared to the same year, and not to random years that would give meaningless random jumps in the numbers.
Seriously, the whole graph is about inflation, yet you state it's not taking into account... inflation... what?
Wrong, look at the article slides:
1080 Ti, RT Ultra, 1440p, 16 GB RAM:
18 fps in Metro Exodus
30 fps in Shadow of the Tomb Raider
Just drop the res down to 480p and have everything else on low settings, and bingo, RT on a 1080 working perfectly. Lol
You guys can argue the point of ray tracing till you are blue in the face. All I know is it has been six months since RTX cards came out. Since then, a whopping 11 games use ray tracing, only two of which I would probably buy. But I assure you I am not going to spend $1,100 to play two games with ray tracing. The price relative to the actual use of ray tracing is ridiculous. Unless just about every game becomes ray-tracing enabled, or the pricing of RTX cards tumbles 50%, Nvidia has a white elephant on its hands. Do they double down on RT in the next generation of RTX cards, charge even more for that series, and alienate even more customers? Or do they circle the wagons and go in another direction? And oh yeah, tick tick tick, Intel is getting into the GPU game in 2020. My, how the mighty Nvidia has tumbled.