NVIDIA Announces Support for lots of RTX ON based games at Gamescom

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 19, 2019.

  1. tsunami231

    tsunami231 Ancient Guru

    Messages:
    10,416
    Likes Received:
    559
    GPU:
    EVGA 1070Ti Black
    I think you hit the nail on the head. Not that there wasn't a reset when 4K became a thing, because even the high-end cards are still chasing that and will be for years. RT is just worse; I think it will be 5 years at minimum before RT is really usable.

    As for AMD and what @Astyanax said about RT: AMD obviously has it, they're just not implementing it for a reason, and I don't expect them to until after the next PS5/Xbox comes out, seeing as it's claimed they both will have it and both are AMD hardware. By the time they are released the hit might not be so massive, maybe.
     
    Last edited: Aug 20, 2019
    Fox2232 likes this.
  2. jwb1

    jwb1 Master Guru

    Messages:
    725
    Likes Received:
    156
    GPU:
    MSI GTX 2080 Ti
    Like he cares.

     
  3. sykozis

    sykozis Ancient Guru

    Messages:
    21,624
    Likes Received:
    927
    GPU:
    MSI RX5700
    3000 series will be an improvement over the 2000 series, but nothing overly dramatic. It's still going to struggle with ray-tracing, but not as much as the 2000 series does. It's going to be a few generations before ray-tracing can be implemented in any truly meaningful way. From a business standpoint, it doesn't make any sense to bring a product to market that can do real-time ray-tracing at a high enough level of performance any sooner than necessary. We will see small gains in ray-tracing implementation every generation, and small performance gains to accommodate those improvements.

    Someone should show him NVidia's product stack... since they sell GPUs that don't support ray-tracing... He basically just said that NVidia's better-selling GPUs make no sense... lol
     
    carnivore and MonstroMart like this.
  4. tsunami231

    tsunami231 Ancient Guru

    Messages:
    10,416
    Likes Received:
    559
    GPU:
    EVGA 1070Ti Black
    You expecting it to make sense? Nothing makes sense these days, not even me o_O
     

  5. sykozis

    sykozis Ancient Guru

    Messages:
    21,624
    Likes Received:
    927
    GPU:
    MSI RX5700
    You've never made sense though... lol. It's ok, we still like you... well, most of us do, anyway.
     
  6. Astyanax

    Astyanax Ancient Guru

    Messages:
    6,597
    Likes Received:
    2,054
    GPU:
    GTX 1080ti
    nvidia stock is up 7%

    I want to see what a 5700XT can do with DXR fallback once the 5900XT is out
     
  7. Fox2232

    Fox2232 Ancient Guru

    Messages:
    10,323
    Likes Received:
    2,458
    GPU:
    5700XT+AW@240Hz
    It will likely do worse than, let's say, a Vega 64. But it depends on where the main load gets processed and how.
    If it is heavy on moving data around and scheduling, RDNA will win even with quite a bit less computational power.
    If it is easily scheduled and does not have to deal with complex data fetching, then Vega will win with its higher compute.

    Then there is concurrency, where RDNA may pull ahead thanks to better scheduling... fewer stalls under hybrid workloads.

    All in all I expect 1.45~1.55 times higher performance in DXR games than a 1660 Ti
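    Putting rough numbers on that estimate (a quick sketch; the 30 fps baseline for the 1660 Ti is purely hypothetical, not a benchmark result):

```python
# Apply the 1.45-1.55x scaling estimate from the post above to a
# hypothetical 1660 Ti baseline framerate. The baseline value is
# illustrative only, not measured data.
def scaled_range(baseline_fps, low=1.45, high=1.55):
    """Return the (low, high) projected framerate range."""
    return (baseline_fps * low, baseline_fps * high)

lo, hi = scaled_range(30.0)  # hypothetical 30 fps baseline
print(f"Projected: {lo:.1f} - {hi:.1f} fps")
```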
     
  8. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    11,536
    Likes Received:
    3,508
    GPU:
    2080Ti @h2o
    With the spitting in their loyal customers' faces... when did Nvidia really do them a service before? I'm not sure I've ever had the feeling that Nvidia was trying to cater to their customers by doing them a favour... ever. In that regard I agree with you, sure, but I'm not surprised and would not have expected otherwise. :D

    Here, again, I really, really doubt your claim that a time would have arrived when devs couldn't bring a GPU to its knees. I honestly think you are just being too optimistic here. We already see how they're trying to sell people internal rendering resolutions above 4K, etc., and even the biggest card can be crippled by piling on more effects, more lights, and so on.
    I am not a game dev, coder, or graphics engineer, but I never get the feeling that they couldn't just do more if they had the time for it. But I guess only a real professional could tell us the truth about this. I can't say you're wrong, but I do have to say I doubt that the GPUs we have, even if they'd made a huge chip without RT, could not be brought to their knees.
    You are right though, 2080 Ti SLI probably does bring 4K 60fps to many conventional rendering scenarios as of now, that I have to admit. But give it a year or two and you'll need more again. It has always been like that. Or am I wrong to think that when 1080p hit, a few years later you needed more GPU power to still drive 1080p60 on a single card in the best-looking games?

    With the sentiment of RT being a premium feature and very costly, I agree with you 100%.
    Only I have to argue: I buy the biggest card, and I don't buy 30 games a year... maybe I'm not the norm here, but I have neither the time nor the interest to play 35-40 games a year. That's almost a whole game a week... and sure, there will be games that don't support it, but those will probably not gain much from the graphics anyway. Nobody needs "realistic" rendering when buying the next sidescroller or tower defense game... I'm not sure what RT should do in games like Minecraft or WoW Classic; the graphics simply don't try to be realistic, so more realistic lighting doesn't make sense.
    BUT again, I agree with you that RT will remain a premium feature until the hardware gets substantially cheaper / better performing, i.e. RT in every low-end / mid-range card. RT will definitely, like you said, remain a premium feature... but how does such a feature NOT become mainstream if the companies don't at some point start to sell the cards? I can't really imagine anybody waiting another 5 years until suddenly a company pops up saying "we can do 4K 30fps ray tracing", when devs haven't even started to implement it by that time. Who'd buy a card like that? Same problem, only later...
    That a premium feature has its price comes with the nature of a premium feature itself... again, you are right, but this is not surprising.

    Well, I kind of see now what you mean. Sure, scaling more and more, again and again, could help. But for what gain? Simple resolution improvement is boring af. It really is. Yes, a 4K screen does look marvellous... until they introduce 8K screens for PCs. Same story all over again, same discussions about why a single GPU can't render four times the pixels on its own. BUT I see what you mean, and I have to agree; only it's not even remotely what I was discussing in the first place.

    Bringing in the 7nm argument is... a little too fast. They might as well do 7nm RT and get better performance / power in general... Your argument remains true, of course, but not only for Pascal / non-RT scenarios. So if anything, your argument works for my intentions too. ;)

    Again, we're discussing whether RTX is a dead-on-arrival feature, which is what I referred to in the post I quoted... if you say so, fine, but I haven't seen any real graphical improvement tech-wise since the introduction of DX11. Or 2013, as of DX11.2. And I'm fed up with DX12 as it is, since it has done nothing for graphical fidelity. It still doesn't feel like it adds much value to anything right now... DXR suddenly pops up, with at least the promise of a future improvement. Hence, I don't see RT as dead on arrival. Which was what I argued against... before being pulled into the 4K discussion. Which is uninteresting to me, since if you want 4K, buy two 2080s, SLI them, and be done with it. Oh right, DX12... doesn't work. At least RT is a feature that devs try to adopt, as opposed to low-level programming and mGPU programming.

    I see your arguments, and they are valid and make sense... only they are only halfway about what I was referring to in my first post, and aren't even my focus. I don't care about 4K since I know no GPU can drive it... and buying a 4K screen with no GPU able to drive it in most games is a silly buy in the first place.


    Then I hope you haven't bought any Nvidia GPUs in the past five to ten years, since they were developing RTX for a long time and you already paid for that with your money. You probably paid for autonomous-driving R&D as well if you bought an Nvidia GPU, and you paid for Ryzen development by buying any recent AMD product since Bulldozer, etc.
    That sentiment is probably the worst argument for not buying a GPU; you're always paying for R&D, no matter what the company does. Lacking RTX quality, adoption, a higher price for no real-world gain: all fair arguments. But skip Nvidia for what really is the issue, the price, not what they do with the money, since you can't influence that.
     
    Fox2232 and Maddness like this.
  9. AlmondMan

    AlmondMan Master Guru

    Messages:
    502
    Likes Received:
    42
    GPU:
    5700 XT Red Dragon
    Good stuff. Means it'll be worth getting a ray-tracing-capable GPU in 3 years.
     
  10. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,329
    Likes Received:
    178
    GPU:
    MSI GTX1070 GamingX
    I've already said in other threads that I reckon their business model will last at least a decade for RTX. But seriously, why would I not buy an RTX-capable GPU? If money isn't a problem, then it's Nvidia all the way.

    The adoption of DXR+RTX by game devs is the fastest I've seen for PC GPU tech in a long time. More significant than DX12 in its first few years. Few are saying it, but I think it's huge and is only going to get better.

    My monitor is 1440p (2K). I reckon by the time I get a 3000-series RTX card, the majority of RTX games will be playable at 60fps+ at my resolution.
     
    Maddness likes this.

  11. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,034
    Likes Received:
    1,698
    GPU:
    HIS R9 290
    Of course he doesn't; that ironically is the reason I made the comment in the first place. He doesn't care about anyone or anything but himself and the status of his company. Unlike other greedy CEOs, he doesn't even pretend to care about what anyone else thinks. That being said, I have to give him credit for being transparent, but willfully showing the darkness of his true colors isn't exactly encouraging me to support him.

    I disagree. Nvidia may be extremely innovative (and a lot of that is due to him) but a lot of that innovation is kept all to themselves, which stifles progress. Not everyone uses Nvidia products.
    Take real-time ray tracing, for example: it's fantastic that Nvidia pushed it through the door, because nobody else was going to for a long while. Nvidia may have single-handedly brought gaming graphics to the most realistic they have ever been, but their execution of RTX was done so poorly that they may also have single-handedly pushed back real-time ray tracing, because nobody understands why the expense and performance deficits are worth it. And because it's proprietary tech on an expensive platform, it hasn't been adopted enough to show its true potential.

    I believe Nvidia can still be the leader of GPU innovation without Huang; the only difference is that without him, they'll make something that actually gets used. To my knowledge, the only resounding success of Nvidia's proprietary tech has been CUDA. And once Intel releases their new GPUs, I'm not so sure CUDA will still dominate the GPGPU market a few years from now.
     
    Last edited: Aug 20, 2019
    carnivore and airbud7 like this.
  12. Denial

    Denial Ancient Guru

    Messages:
    12,784
    Likes Received:
    2,037
    GPU:
    EVGA 1080Ti
    I don't think he's greedy - big ego maybe but then again he grew Nvidia from nothing into a top tier tech company. Nvidia is consistently rated one of the best companies to work for in the tech industry. Everyone I knew that interned there out of RIT basically said it was incredible compared to their other co-ops.
     
  13. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,034
    Likes Received:
    1,698
    GPU:
    HIS R9 290
    I guess I phrased myself poorly: "Unlike other CEOs that are greedy" would've been better.
    Although I agree that greediness wouldn't be within my top 5 (or maybe even top 10) descriptions of him, it's worth pointing out that selfishness is inherent to a giant ego. Selfishness is also inherent to greed. That doesn't mean having a big ego makes you greedy, but in this context it's pretty obvious he wants more people buying his products. He and his company are already filthy rich and dominating the market, so if it's not greed, what is it?

    All I want him to do is be more humble. Look at the CEOs of any other tech giants and you don't hear them ridiculing their competitors' products or judging people for shopping with their competitors. They don't act like they're the best, whether as a person or as a representative of the company. Steve Ballmer was pretty much the only exception I could think of, and he didn't last long. You could argue Steve Jobs had a big ego too, but he was mostly just pretentious; he didn't so blatantly act like he was better than everyone (not in public, anyway). Jobs would find ways to say "look at how elegant this product is", but Huang's approach is "look how much better we are".

    That being said - I'm not happy with AMD's recent marketing about Epyc, because it's doing exactly what I don't like: bragging and insulting the competition.
     
    airbud7 likes this.
  14. Embra

    Embra Maha Guru

    Messages:
    1,006
    Likes Received:
    258
    GPU:
    Vega 64 Nitro+LE
    But Nvidia is not so great to work WITH.
     
    schmidtbag likes this.
  15. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,034
    Likes Received:
    1,698
    GPU:
    HIS R9 290
    I agree, though I would argue that's indirectly part of Huang's ego. It's always "my way or no way" with him.
    As someone who mostly uses Linux, the consequences of that make it my #2 reason for not buying Nvidia. But I won't ramble on about that; most people here don't care.

    Price would be my #3 (and even that you could argue is indirectly part of his ego).
     
    airbud7 likes this.

  16. ruthan

    ruthan Master Guru

    Messages:
    363
    Likes Received:
    48
    GPU:
    G970/3.5G MSI
    Why not implement it, when Nvidia will send you a bunch of their own developers for free? They did it with PhysX, Apex, Ansel, and probably other stuff like Hairworks, and they will do it again. I'm not saying that it's bad... but it's really only about the result... and an AAA game would be even more expensive, unless the game became RTX-only, and that would take lots of time, because lots of today's games still run fine on a few-year-old GPU, and people are not used to buying a new GPU every year or two as in the past... it's also not as cheap as it was. The GeForce 2060 is quite expensive, and RTX On struggles a lot on it.

    Regardless of the future, in the end we who are buying Nvidia stuff will pay the budget of this experiment and its developers... let's hope this is the right way of progress and not a dead end.

    It reminded me of Brano, the Slovak voxel guy - https://www.atomontage.com - he is still working on it.
     
  17. Fox2232

    Fox2232 Ancient Guru

    Messages:
    10,323
    Likes Received:
    2,458
    GPU:
    5700XT+AW@240Hz
    Yeah, why not let nVidia touch your code? Because in many cases they managed to damage code shared between AMD, Intel and nVidia instead of creating a separate code path.
    Some studios did talk about it. Basically, once nVidia's guys were done, it was all wonderful, except it would not even run on AMD/Intel.

    As for PhysX, Apex, Hairworks... nobody sane wants that. Even people with nV hardware reduce or disable it if they can, just to save themselves stutter or other performance issues.
    And it does not look good anyway.
     
  18. HeavyHemi

    HeavyHemi Ancient Guru

    Messages:
    6,936
    Likes Received:
    931
    GPU:
    GTX1080Ti

    You've never met him, yet you've defined him based on your impressions of his public persona as head cheerleader for the company he founded. I think you should reflect on why you have so much emotional energy invested in him that it affects your lifestyle choices. I could argue you're Peyton Place. This, like many tech articles, devolves into the nonsense of discussing subjective personality traits. Utterly pointless.
     
  19. HeavyHemi

    HeavyHemi Ancient Guru

    Messages:
    6,936
    Likes Received:
    931
    GPU:
    GTX1080Ti

    In many cases AMDrones just make stuff up. 'Gimping' is their favorite troll antic. How about a list of these games where, as you state as fact, "they damaged code and broke many games' ability to even run on AMD/Intel"? Reads like crap to me.
     
  20. Fox2232

    Fox2232 Ancient Guru

    Messages:
    10,323
    Likes Received:
    2,458
    GPU:
    5700XT+AW@240Hz
     
    airbud7 likes this.
