Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 10, 2019.
Of course he says that. It's what he does best: talking $h1T.
Performance isn't poor; performance is extremely good compared to anything else out there, and considering the one main game (for now) can be played at 1440p with ray-tracing effects in a first release, that's not bad at all. People really need to figure out what they deem "good fps": first introductions of new technologies have generally carried a far worse performance cost than ray tracing has, and ray tracing should have been the costliest of them all.
As to your "AMD can already do ray tracing": proof? Where? I mean, any CPU and GPU can technically do ray tracing, but you say with less performance? 1 fps? Where does your statement come from?
If you don't understand my point: let's say two graphics cards with equal performance in "normal" rasterization are pitted against each other in ray tracing. Say card A gets 10 FPS and card B gets 60 FPS. Again, both are equal without ray tracing. What about that is "poor performance" from card B?
No, currently we can't test that to see if it's true, but what IS true is that ray tracing has NEVER been possible in a game at any reasonable resolution, until now. This is fact. If you want to say that's not fact: prove it.
Also, you should note these are big words from the man who sold 110,000 of his shares in Nvidia and is being sued for lying to shareholders and lying to his customers for years.
It's not so simple. First the Riva TNT and TNT 2 and then the GeForce were pitted against 3Dfx and beat them. Yes, 3Dfx hastened their own demise, but they were already behind. Ever heard of the Voodoo 5?
ATI came out with the Rage chip back in 1996. Yes, they beat 3Dfx and Nvidia to market with a "3D" chip (and I use that term loosely). Yet their design was terrible, and even though they made a huge amount of money from integration into OEM PCs (with marvelous advertising stickers on the front proclaiming, "Blazing Fast 3D"), it wasn't enough for them to come out with a competitive card until the Radeon in 2000, which was plagued with issues right from the beginning.
Well, the majority of gamers prefer high resolution and high refresh rates over what RTX brings.
Does RTX improve visuals? Yes... kind of... but at what cost?
"Who cares if no games are published that does not use Ray tracing as long as an army of creatives finally have a card ( cheaper than volta ) to finally begin working on implementing as much."
Seriously, comments like this really piss me off.
Those are gaming cards and as such are primarily targeted for gamers, unlike Titan cards which were always promoted as cards for creators.
Plus, how many creators would use xx60-xx70 cards? None.
If Nvidia had released cards with at least the usual performance uplift, there would not be as much anger from the gaming community. But as it stands, they force us to pay double for something that, half a year after release, still can't be used in more than one game, and even there it's mostly pointless, that game being a fast-paced multiplayer shooter where you don't have time to enjoy the better visual quality and where higher FPS is best.
Anyway, I am not spending a single penny on RTX until it can manage high resolutions at high FPS (WQHD@120Hz in my case), and for that we will have to wait at least a few generations.
Hey, just 2, or maybe 3 or 4, new generations of cards from now and it will be totally awesome.
I remember Intel threatening to come out with a real-time ray tracing card some years back (like 13?). Apparently it's a lot tougher than they thought it would be.
Yep. By the time RTX visuals will be worthwhile vs the performance hit, RTX2xxx cards will be virtually obsolete. People will be on RTX4xxx or newer.
Whilst that's true, someone had to start the ball rolling. And even though I use an AMD GPU, I applaud Nvidia for taking the first step.
Absolutely. I salute RTX owners for paving the road for the rest of us.
There must be something to it if he felt the need to make this rant.
JHH was probably expecting a quad-Navi on a single interposer...
Well, CEO or janitor, both are human; it's just his bad day... everyone gets a moody bad day sometimes, no?
But if he knows how things spread on the internet nowadays, he should have changed a few words and the impression could have been completely different:
"Radeon VII performance is good, but ours is still better and has cutting-edge tech"
Or he could simply have said "our RTX is still better".
Regarding the RTX series, from what I see, the only problem is the price.
If Nvidia had kept the price range the same as the previous generation (10**/Ti), people wouldn't meme it like they do now.
High price coupled with new tech that isn't ready = what people have been arguing about.
Well, except for that resolute "No" at the start, I do agree.
The simple answer is that it will come down to AMD's implementation. And I do expect the Radeon VII to land anywhere between the 2080 and 2080 Ti in DXR, as it has ~34% higher compute power than the 2080 and practically matches the 2080 Ti.
And as this is written, one should not forget that Vega 64 has some 69% higher compute power than the RTX 2070.
Therefore AMD can deliver a few surprises.
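For anyone wanting to sanity-check those percentages: peak FP32 throughput is just 2 ops (FMA) x shader count x clock. A quick sketch, using assumed reference boost clocks and shader counts (not figures from the post; the exact ratio shifts a point or two with the clocks you pick):

```python
# Rough FP32 throughput check behind the "higher compute power" claims.
# Shader counts and boost clocks are assumed reference values.

def fp32_tflops(shaders, boost_ghz):
    """Peak FP32 = 2 ops (FMA) x shader count x clock, in TFLOPS."""
    return 2 * shaders * boost_ghz / 1000.0

vega64     = fp32_tflops(4096, 1.546)   # ~12.7 TFLOPS
radeon_vii = fp32_tflops(3840, 1.750)   # ~13.4 TFLOPS
rtx2070    = fp32_tflops(2304, 1.620)   # ~7.5 TFLOPS
rtx2080    = fp32_tflops(2944, 1.710)   # ~10.1 TFLOPS

print(f"Vega 64 vs 2070:    +{(vega64 / rtx2070 - 1) * 100:.0f}%")
print(f"Radeon VII vs 2080: +{(radeon_vii / rtx2080 - 1) * 100:.0f}%")
```

Of course, peak TFLOPS says nothing about how much of it a given workload can actually extract, which is the whole question with DXR.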
The performance of the 2080, to me, is trash. What is the old saying? "When you point one finger, there are three fingers pointing back to you."
I don't think compute tells the entire story:
The 2080 Ti has only ~3% higher FP32 than the Titan V, and half its FP16 rate, yet the 2080 Ti is ~30% faster (on Ultra).
We know that RT cores are fixed-function units designed for BVH traversal optimization and ray-triangle intersection, and Nvidia also mentions in its whitepaper that the split INT32/FP32 pipeline is advantageous for these workloads. Like I said, it's possible AMD's architecture is inherently better at these things than Nvidia's was prior to Turing, but Nvidia clearly optimized for them and pretty much spelled out that they're necessary for DXR performance. The only benchmark we have testing it basically shows that to be true.
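For reference, "ray-triangle intersect" in practice means something like the standard Moller-Trumbore test below. This is a plain-Python illustration of the math, not Nvidia's implementation; before Turing, shaders had to grind through this on general-purpose ALUs for every candidate triangle, which is part of why dedicated units matter:

```python
# Moller-Trumbore ray-triangle intersection: the kind of fixed-function
# work an RT core does. Illustrative sketch only.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def sub(a, b):
    return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    """Return the hit distance t along the ray, or None on a miss."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:            # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = sub(origin, v0)
    u = dot(s, p) * inv
    if u < 0.0 or u > 1.0:        # outside barycentric range
        return None
    q = cross(s, e1)
    v = dot(direction, q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv
    return t if t > eps else None

# A ray fired down -z from z=5 hits a triangle in the z=0 plane at t=5:
hit = ray_hits_triangle((0.2, 0.2, 5.0), (0.0, 0.0, -1.0),
                        (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
print(hit)  # 5.0
```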
Edit: Another datapoint:
You also know the saying: "One man's trash is another man's treasure."
2080 performance is second to best.
So many armchair CEOs and chip engineers in this thread.
Even if AMD might pull off ray tracing with Navi, I'm pretty sure it won't run in BFV unless the devs implement it for AMD's tech as well.
RTX sounds like the new PhysX to me. Dead as soon as it was out.
Only a few games ever implemented it, and it never ran that well; only in the Batman titles.
Ray tracing in real time sure is revolutionary, but I'll pick 144 Hz Ultra over RTX at barely 60 fps in Full HD.
The Full HD days are kinda over for me.
And if, or should I say maybe in some ridiculous dream, I buy an Nvidia RTX 2080 for 800-900 bucks, my wallet will be broken. HAHAH
And a Radeon VII for 700 bucks is a no-go.
I think it's better to buy a used RX 580 8 GB from a mining rig and change the thermal paste. It will cost me here in Croatia 230 bucks, or 200 euros, with the original box.
Well yeah the DXR implementation is on devs - so they'd definitely have to plug into AMD's "RTX variant" library in order to get acceleration but they already set the game up to do that. It's the reason why they aren't using tensors for denoise. The RTX gameworks library is just an easy method for devs to quickly integrate it into their games. The better studios/engines will implement DXR and use NVAPI/RTX acceleration to accelerate the workload.
I think people are confused about how RTX works. RTX is a branding for Nvidia's acceleration of RT in hardware. DXR is what actually creates the BVH instance and creates the framework for ray traversal calculations. DXR can work cross-vendor: a game dev can implement DXR, then accelerate it with either Nvidia's or AMD's protocol for acceleration (typically some kind of open/proprietary library). RTX Gameworks is a completely different thing: it's an Nvidia-only prepackaging of DXR effects for game devs to quickly implement. BF5 doesn't use this, and I doubt most of the bigger games/engines will. It's for smaller studios to quickly integrate DXR tech into their game and have it run easily/effortlessly.
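Conceptually, the BVH that DXR builds is just a tree of bounding boxes, and traversal means only descending into boxes the ray actually crosses, so most triangles are never tested. A toy sketch of that idea (illustrative names only, nothing here is the DXR API):

```python
# Toy BVH traversal: the structure DXR builds and RT cores walk in hardware.

class Node:
    def __init__(self, lo, hi, children=(), triangles=()):
        self.lo, self.hi = lo, hi      # AABB min/max corners (x, y, z)
        self.children = children       # inner node: child Nodes
        self.triangles = triangles     # leaf: triangle ids

def ray_hits_box(origin, direction, lo, hi):
    """Slab test: does the ray cross the axis-aligned box?"""
    tmin, tmax = 0.0, float("inf")
    for a in range(3):
        if direction[a] == 0.0:                      # parallel to this slab
            if not (lo[a] <= origin[a] <= hi[a]):
                return False
            continue
        t1 = (lo[a] - origin[a]) / direction[a]
        t2 = (hi[a] - origin[a]) / direction[a]
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def traverse(node, origin, direction):
    """Collect ids of triangles whose leaf boxes the ray crosses."""
    stack, hits = [node], []
    while stack:
        n = stack.pop()
        if not ray_hits_box(origin, direction, n.lo, n.hi):
            continue                   # whole subtree culled in one test
        hits.extend(n.triangles)
        stack.extend(n.children)
    return hits

# Two leaves under one root; a ray through the left half only reaches
# the left leaf's triangles:
left  = Node((0, 0, 0), (1, 2, 2), triangles=(0, 1))
right = Node((1, 0, 0), (2, 2, 2), triangles=(2, 3))
root  = Node((0, 0, 0), (2, 2, 2), children=(left, right))
print(traverse(root, (0.5, 1.0, -1.0), (0.0, 0.0, 1.0)))  # [0, 1]
```

The per-box and per-triangle tests are exactly the fixed-function work RT cores take off the shaders; DXR's job is building this structure and scheduling the rays.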
Hardware PhysX never got off the ground because it was always Nvidia-only and CPU physics were always good enough. DXR is different because it's part of DX12, supported by both vendors, and will eventually dramatically increase image quality and reduce artist workload. Eventually all games will use it or some form of RT, and you won't get the choice of not enabling it, so you won't know the performance difference anyway.