Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 10, 2019.
It's a gimmick for as long as the performance is poor. Ray tracing will need a good 2-3 generational steps in power before there's enough headroom for it to work on even the low end. Buy a 2060 now and you basically cannot use RT without being stuck below 30 FPS.
AMD can already do ray tracing, albeit with less performance.
It's not the price point; it's the fact that right now only one card can actually achieve somewhat acceptable performance. What's the point for a developer to include ray tracing if realistically only a handful of players are going to use it?
Well said. Luckily, AMD have hinted that they will be doing ray tracing in the future. When that happens, I suspect there will be a change of attitudes. I currently use an AMD GPU, but am so hyped about ray tracing.
Of course he says that. It's what he does best: talking $h1T.
Performance isn't poor; it's extremely good compared to anything else out there. Considering the one main game (for now) can be played at 1440p with ray tracing effects in a first release, that's not bad at all. People need to really figure out what they deem "good FPS": first introductions of new technologies have generally carried far worse performance costs than ray tracing has, and ray tracing should have been the most demanding of all.
As to your "AMD can already do ray tracing": proof? Where? Any CPU or GPU can technically do ray tracing, but you say "less performance". 1 FPS? Where does your statement come from?
If you don't understand my point, let's say two graphics cards with equal performance in "normal" rasterization are pitted against each other in ray tracing. Say card A gets 10 FPS and card B gets 60 FPS; again, both are equal without ray tracing. What about that is "poor performance" for card B?
No, currently, we can't test that out to see if it's true, but what IS true is that ray tracing has NEVER been possible in a game at any reasonable resolution, until now. This is fact. If you want to say that's not fact: Prove it.
Also, you should note these are big words from a man who sold 110,000 of his shares in Nvidia and is being sued for lying to shareholders and lying to his customers for years.
It's not so simple. First the Riva TNT and TNT 2 and then the GeForce were pitted against 3Dfx and beat them. Yes, 3Dfx hastened their own demise, but they were already behind. Ever heard of the Voodoo 5?
ATI came out with the Rage chip back in 1996. Yes, they beat 3Dfx and Nvidia to market with a "3D" chip (and I use that term loosely). Yet their design was terrible, and even though they made a huge amount of money from integration into OEM PCs (with marvelous advertising stickers on the front proclaiming, "Blazing Fast 3D"), it wasn't enough for them to come out with a competitive card until the Radeon in 2000, which was plagued with issues right from the beginning.
Well, the majority of gamers prefer high resolutions and high refresh rates over what RTX brings.
Does RTX improve visuals? Yes... kind of... but at what cost?
"Who cares if no games are published that use ray tracing, as long as an army of creatives finally have a card (cheaper than Volta) to begin working on implementing it."
Seriously, comments like this are something that really pisses me off.
Those are gaming cards and as such are primarily targeted at gamers, unlike Titan cards, which were always promoted as cards for creators.
Plus, how many creators would use xx60-xx70 cards? None.
If Nvidia had released cards with at least the usual performance uplift, there would not be as much anger from the gaming community. But as it stands, they force us to pay double for something that, half a year after release, still can't be used in more than one game. And even there it's mostly pointless, since it's a fast-paced multiplayer shooter where you don't have time to enjoy better visual quality and where higher FPS matters most.
Anyway, I am not spending a single penny on RTX until it can manage high resolutions at high FPS (WQHD@120Hz in my case), and for that we will have to wait at least a few generations.
Hey, just 2, or maybe 3 or 4, new generations of cards from now and it will be totally awesome.
I remember Intel threatening to come out with a real-time ray tracing card some years back (like 13?). Apparently it's a lot tougher than they thought it would be.
Yep. By the time RTX visuals will be worthwhile vs the performance hit, RTX2xxx cards will be virtually obsolete. People will be on RTX4xxx or newer.
Whilst that's true, someone had to start the ball rolling. And even though I use an AMD GPU, I applaud Nvidia for taking the first step.
Absolutely. I salute RTX owners for paving the road for the rest of us.
There must be something to it if he felt the need to make this ranting move.
JHH was probably expecting a quad-Navi on a single interposer...
Well, CEO or janitor, they're both human; it's just a bad day for him... everyone gets moody some days, no?
But if he knows how things spread on the internet nowadays, he should have changed a few words and the impression could have been completely different:
"Radeon VII performance is good, but ours is still better and has cutting-edge tech,"
or he could simply have said, "our RTX is still better."
Regarding the RTX series, from what I see, the only problem is the price.
If Nvidia had kept the price range the same as the previous generation (10**/Ti), people wouldn't meme it like they do now.
A high price coupled with new tech that isn't ready: that's what people have been arguing about.
Well, except for that resolute "No" at the start, I do agree.
The simple answer is that it will come down to AMD's implementation. I do expect the Radeon VII to land anywhere between the 2080 and 2080 Ti in DXR, as it has 34% higher compute power than the 2080 and practically matches the 2080 Ti.
And while we're at it, one should not forget that Vega 64 has some 69% higher compute power than the RTX 2070.
Therefore AMD could deliver a few surprises.
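For what it's worth, those percentages can be sanity-checked from the spec sheets: peak FP32 throughput is shader count × 2 FLOPs per clock (FMA) × boost clock. A quick sketch, with reference boost clocks as assumptions (real boost behavior varies, so the exact percentage shifts a little depending on which clock you plug in):

```python
# Rough peak FP32 throughput: shaders * 2 FLOPs/clock (FMA) * boost clock (GHz).
# Shader counts and boost clocks are reference spec figures -- treat as estimates.
def tflops(shaders, boost_ghz):
    return shaders * 2 * boost_ghz / 1000.0  # GFLOPS -> TFLOPS

cards = {
    "Radeon VII":  tflops(3840, 1.750),
    "RTX 2080":    tflops(2944, 1.710),
    "RTX 2080 Ti": tflops(4352, 1.545),
    "Vega 64":     tflops(4096, 1.546),
    "RTX 2070":    tflops(2304, 1.620),
}

for name, t in cards.items():
    print(f"{name:12s} ~{t:.1f} TFLOPS")

print(f"Radeon VII vs 2080: +{(cards['Radeon VII'] / cards['RTX 2080'] - 1) * 100:.0f}%")
print(f"Vega 64 vs 2070:    +{(cards['Vega 64'] / cards['RTX 2070'] - 1) * 100:.0f}%")
```

With these clocks the Radeon VII lands roughly a third ahead of the 2080 and close to the 2080 Ti, and Vega 64 close to 70% ahead of the 2070, in line with the figures above.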
The performance of the 2080, to me, is trash. What is the old saying? "When you point one finger, there are three fingers pointing back at you."
I don't think compute tells the entire story:
The 2080 Ti has only 3% higher FP32 throughput than the Titan V, and half the FP16, yet the 2080 Ti is ~30% faster (on Ultra).
We know that RT cores are fixed-function units designed to accelerate BVH traversal and ray-triangle intersection, and Nvidia also mentions the split INT32/FP32 pipeline being advantageous for these workloads in its whitepaper. Like I said, it's possible AMD's architecture was inherently better at these things than Nvidia's was prior to Turing, but Nvidia clearly optimized for them and pretty much spelled out that they're necessary for DXR performance. The only benchmark we have testing it basically shows that to be true.
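For anyone wondering what that fixed-function work actually is: after BVH traversal narrows a ray down to candidate triangles, each candidate gets an intersection test, typically something like the Möller-Trumbore algorithm. A minimal CPU sketch for illustration (this is the standard textbook algorithm, not Nvidia's implementation):

```python
# Möller-Trumbore ray/triangle intersection -- the kind of per-ray work an RT
# core accelerates in hardware. Illustrative sketch only, plain-list vectors.
def ray_triangle(orig, dir, v0, v1, v2, eps=1e-8):
    sub = lambda a, b: [a[i] - b[i] for i in range(3)]
    dot = lambda a, b: sum(a[i] * b[i] for i in range(3))
    cross = lambda a, b: [a[1]*b[2] - a[2]*b[1],
                          a[2]*b[0] - a[0]*b[2],
                          a[0]*b[1] - a[1]*b[0]]

    e1, e2 = sub(v1, v0), sub(v2, v0)   # triangle edges from v0
    h = cross(dir, e2)
    det = dot(e1, h)
    if abs(det) < eps:                   # ray parallel to triangle plane
        return None
    f = 1.0 / det
    s = sub(orig, v0)
    u = f * dot(s, h)                    # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = f * dot(dir, q)                  # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(e2, q)                   # hit distance along the ray
    return t if t > eps else None

# Ray from z=-1 straight down +z into a triangle in the z=0 plane: hits at t=1.
print(ray_triangle([0, 0, -1], [0, 0, 1], [-1, -1, 0], [1, -1, 0], [0, 1, 0]))
```

Running millions of these tests per frame in general-purpose shader code is exactly what made real-time ray tracing impractical before dedicated hardware.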
Edit: Another datapoint:
You also know the saying: "One man's trash is another man's treasure."
2080 performance is second to best.
So many armchair CEOs and chip engineers in this thread.