NVIDIA Announces RTX Support in a Long List of Games at Gamescom

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 19, 2019.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,392
    Likes Received:
    18,564
    GPU:
    AMD | NVIDIA
  2. Abd0

    Abd0 Member

    Messages:
    28
    Likes Received:
    11
    GPU:
    Nvidia 1060 6GB
    I can't see the difference in the first two pics.
    Also, now every game has to be muddy, with water all over the place, to showcase the reflections.
    I don't think that this is how ray tracing should be used.
     
    geogan likes this.
  3. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,738
    GPU:
    3080 Aorus Xtreme
    The more the merrier, I say. Bring it on.
     
  4. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Dying Light 1 ran well, except at 4K. There you would need an RTX 2080 or a Ti to hold a stable 60+ fps at all times.
    Considering that the 2nd installment should look better (and be heavier), adding RTX could easily turn it into a 1080p-with-RTX vs. 4K situation.

    Isn't Metro already an RTX game? Minecraft is the same old news.

    Synced from Tencent... The likely reason for the name is that the AI and animation system was so bad that they decided to call it "synced" instead of crappy.

    CoD is dead for many, and RTX in CoD will be dead as soon as it is implemented, since it is a twitch shooter.

    Control has a good chance of not getting on people's nerves because of RTX.

    In the Vampire series, they should focus primarily on gameplay, not graphics. It can be a quick way to damnation.

    Watch Dogs is already a failed game series, and Wolfenstein: Youngblood has proven to be trashy. I do not think that people who are aware of that will buy it just because of RTX.
    - - - -
    All in all, 1 new game where RTX may be worth it.

    Edit: They should add NFS: Underground 4!!!
     
    airbud7, -Tj- and beta-sama like this.

  5. HybOj

    HybOj Master Guru

    Messages:
    394
    Likes Received:
    325
    GPU:
    Gygabite RTX3080
    The screens just confirm that rasterization is so perfectly utilized nowadays that it renders RTX basically obsolete.

    There's a very good reason there is no FPS counter there: it would show the sheer scope of the lunacy of this.

    Want different puddles? Go from 150 fps to 70 fps.

    All this is caused by ngreedia. RTX is just damn slow, even on a 2080 Ti. If they add 10x the RT cores, then we can talk.

    As it is now, it's very close to a scam.
     
    Clawedge likes this.
  6. beta-sama

    beta-sama Member Guru

    Messages:
    139
    Likes Received:
    12
    GPU:
    AORUS GTX1080Ti WF
    Totally agree!
     
  7. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,011
    Likes Received:
    7,353
    GPU:
    GTX 1080ti
    100% ignorance, and the opinion is invalidated by the hate-name.
     
    Aura89, Maddness and fantaskarsef like this.
  8. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,696
    Likes Received:
    9,574
    GPU:
    4090@H2O
    While I tend to share your opinion here @Astyanax, he does have a point: rasterization (if that's the correct term) is perfectly utilized. A decade or so after it was invented? :D
    And RTX as it is is close to a scam if you bought into it at the start expecting every game to have RTX, etc., but as with anything new, adoption takes time, unfortunately.

    But calling RTX obsolete is just trash talk, sorry mate. I dare say a game dev might think differently. Ten years after the introduction of DXR, it will be as "easy" to use as rasterization is now, and it will probably save time in level design and programming once the framework is complete and every dev has learned how to handle it. That just... takes time.

    That Nvidia sold it at too high a price with too little support is absolutely right, but the tech itself won't go away, and it will make games look better.

    And yeah, the first two screenshots are really badly chosen to represent RTX... what's there to reflect if there's practically no light in the scene anyway? It's like showing off your new car at a family dinner while you have it locked up in your garage.

    edit: Which I too wanted to ask: where's the real news here? Wasn't it already known, individually announced, that those games would get RTX support?
     
    Last edited: Aug 19, 2019
    airbud7 likes this.
  9. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,011
    Likes Received:
    7,353
    GPU:
    GTX 1080ti
    Rasterization has limits that RT will remove. It's slow now, but so were:

    Z Sorting
    MSAA
    Pixel shaders
    OpenEXR HDR


    The things RT will solve that people whine about on forums (blaming the driver):

    Low distance shadow stepping
    Cascaded shadow flickering
    Water refraction and diffuse lighting, and color grading.
    Incorrect scene lighting
    That weird flickering that happens with windows when you quickly approach them in first person shooters.


    It will render obsolete:
    Manual placement and static calculation of light flows
    Secondary light sources to achieve the 'fake' appearance of GI
    Reflection maps

    Make no mistake, it's introductory tech, but those who say "we don't need it because this can already do that" are ignorant, plain and simple.
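
    As a rough illustration of the "reflection maps become obsolete" point, here is a minimal toy sketch in Python (hypothetical one-sphere scene, single bounce, not from any engine) contrasting a baked environment lookup with tracing the actual mirrored ray:
    Code:
        import math

        # Toy scene: a mirror floor at y=0 and one sphere floating above it.
        SPHERE_CENTER = (0.0, 2.0, 3.0)
        SPHERE_RADIUS = 1.0
        SKY_COLOR = (0.4, 0.6, 0.9)
        SPHERE_COLOR = (0.9, 0.2, 0.2)

        def reflect(d, n):
            """Mirror direction d about surface normal n."""
            dot = sum(di * ni for di, ni in zip(d, n))
            return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

        def hit_sphere(origin, direction):
            """Real visibility test: does the ray actually hit the sphere?"""
            oc = tuple(o - c for o, c in zip(origin, SPHERE_CENTER))
            b = 2.0 * sum(o * d for o, d in zip(oc, direction))
            c = sum(o * o for o in oc) - SPHERE_RADIUS ** 2
            disc = b * b - 4.0 * c
            return disc >= 0.0 and (-b - math.sqrt(disc)) / 2.0 > 1e-4

        def baked_reflection(view_dir):
            """Rasterizer-style stand-in: a pre-baked environment lookup
            that knows nothing about the dynamic objects in the scene."""
            return SKY_COLOR  # the sphere is simply missing from the 'cube map'

        def traced_reflection(point, view_dir, normal):
            """RT-style: shoot the real mirrored ray and shade what it hits."""
            r = reflect(view_dir, normal)
            return SPHERE_COLOR if hit_sphere(point, r) else SKY_COLOR

        point = (0.0, 0.0, 3.0)     # spot on the mirror floor below the sphere
        view = (0.0, -1.0, 0.0)     # camera looking straight down at the floor
        normal = (0.0, 1.0, 0.0)    # floor normal
        print("baked :", baked_reflection(view))                   # misses the sphere
        print("traced:", traced_reflection(point, view, normal))   # sees the sphere
    The traced version "sees" the sphere in the floor; the baked lookup cannot, which is exactly the class of artifact (missing geometry in reflections) discussed further down the thread.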
     
    carnivore, geogan and Maddness like this.
  10. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    I would not say the problem is adoption speed. I would say the problem from the start was that the HW did not have anywhere near the compute strength required to do meaningful RT.
    As for the term "nGreedia", it is understandable. When nVidia introduced RTX, I wrote about this being a market-shift strategy for the next 10 years or more.
    And people are not doing themselves any service by cheering for RTX here or anywhere else!

    Why?

    Simply because GPU power would have become a non-limiting factor at 4K if nVidia had taken Pascal and scaled it to Turing's 18.6B transistors. Yes, the 1080 Ti is only an 11.8B-transistor GPU (rough numbers in the sketch below).
    In other words, if nVidia had decided to sell 18.6B-transistor Pascals, people would have had no need to upgrade for the next ~5 years, even at 4K.
    So they did a paradigm shift, reset performance at all resolutions for this "new" thing, and people are going to chase fps again. And they will be upgrading at much more regular intervals.
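
    A back-of-the-envelope sketch of that "big Pascal" hypothetical, under the strong simplification that performance scales linearly with transistor budget (it ignores power, clocks, memory bandwidth and yields):
    Code:
        # Die transistor counts quoted above
        turing_tu102 = 18.6e9   # RTX 2080 Ti class die
        pascal_gp102 = 11.8e9   # GTX 1080 Ti class die

        scale = turing_tu102 / pascal_gp102
        print(f"Transistor budget ratio: {scale:.2f}x")                          # ~1.58x
        print(f"Idealized raster uplift over a 1080 Ti: ~{(scale - 1) * 100:.0f}%")
    Under that naive assumption, a scaled-up Pascal would land roughly 50-60% above a 1080 Ti, which is the kind of uplift this argument rests on.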

    Why? Because RT demands can be several orders of magnitude higher at mere 1080p than they are now in 4K Battlefield. And it will take only a few clicks to increase those demands. RT is a bottomless pit that gobbles up any amount of performance.
    Going forward, improvements to GPU performance will be relatively big. But even doubling RT performance per generation at the same price points will not result in happy gamers. Quite the contrary: new RT settings for games from a given year will bring last-generation cards to their knees, since their RT capability is only half.
    Top cards will be almost useless for RT by the time two more GPU generations have been released.

    People should imagine nVidia delivering 4x the RT performance with the next generation. How will RT games made for the new cards run on first-generation Turing?
    - - - -
    And what other option is there?

    nVidia brings smaller RT performance improvements per generation, like 50%. Top cards from the last generation will survive through the next one.
    But then the time it takes for RT to become a real IQ improvement stretches to half a decade or more.
    - - - -
    And either way it is a great cash cow. Something like "enough RT" does not and will not exist in our realm of discrete GPUs.
    But "enough rasterization" is a very real thing, because sucking the power out of a GPU takes actual effort on the developer's side: much more complex shader code, additional effects, post-processing, ...

    This entire thing is planned obsolescence level 9000 + 1.
     
    carnivore and airbud7 like this.

  11. Hyderz

    Hyderz Member Guru

    Messages:
    171
    Likes Received:
    43
    GPU:
    RTX 3090
    Man, nice list of games. Definitely gonna get Dying Light 2, since I loved the first one.
    I want Titanfall 3 with full RTX support, since I loved the second one a lot.
    If the next Resident Evil 8 or a Resident Evil 3 Remake has RTX support, that would be nice :)
    Remake or remaster all three of the Dead Space series with ray tracing and I will definitely buy that.

    Looks like I made the right decision to hold off on buying current Nvidia cards, since better GPUs are coming from AMD and Nvidia in 2020 or early 2021 :)
     
  12. Silva

    Silva Ancient Guru

    Messages:
    2,048
    Likes Received:
    1,196
    GPU:
    Asus Dual RX580 O4G
    As long as I can turn it off so I can play with a decent frame rate, I don't care.
     
  13. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,696
    Likes Received:
    9,574
    GPU:
    4090@H2O
    For easier reading, I put my babbling into a spoiler; below is the shortened version.


    Yes, it makes sense. 4K performance could be achievable (in a single chip), but it's not going to happen. I don't see that happening any time soon, and it wasn't going to happen three years ago either.

    That said, only about 2% of PC users game on a 4K screen. From a company point of view, which you correctly put as the basis of Nvidia's thinking, it does not make sense to cater to that group. It makes much more sense to bring a new feature to the biggest group... which is 1080p, and on PC it still is. I hate to put it that simply, but you already explained to yourself why it makes more sense to push a new feature at 1080p than to push for 4K. Purely business speaking, of course. Business logic, part one.

    Sure, consoles are gaming at 4K now with the advancement of TVs and their becoming cheap enough for wide adoption, but we also know that consoles might not render their picture with as many effects as a PC (in terms of details, light sources, etc.; we know the differences between console and PC game settings). So the driving force in console adoption, again, is to make them run at a reasonable rate for a reasonable price... limited by a capitalist motive, again. Business logic, part two.
    Until consoles support RT (it's AMD's turn to innovate here), it won't take off, since consoles have neither dedicated RT hardware nor the general power to do it for the next few years. The same goes for true 4K rendering with as many details as a PC could push, and that's why it won't be adopted either; it makes no sense money-wise. Why sell to the 2% of RTX gamers on PC (which equals something less than 1% of the total non-mobile gaming market) when you can cater to nearly 100% of console users and most PC users (which together comes close to 100% of the non-mobile gaming market)? Business logic, part 2a.

    The rest of your arguments... sure, it sucks that the first RTX cards can't really push what's needed to run well. But they needed a working sample to bring to devs first... and it makes no sense to develop for 4K RTX either, since nobody will have that hardware. A 2060, though, is fairly easy to buy these days. I also agree that the hardware isn't performing well enough for really good RT under DXR, but they have to sell cards. They have to do something new. A company can't put out the same product forever; people stop buying it as soon as somebody else does a new trick (compare Intel and AMD here...). So again, it makes more sense to push a feature that's half-baked than something that's proven and works but offers no improvement and no new selling point. With just a bigger chip, Nvidia would have sold fewer cards than they do now with a new feature, since people without a 4K screen couldn't have cared less about buying a new GPU. So push something that the other 98% are a probable target group for. Business logic, part three.

    And if your argument is "you can RT all you want, but you can't rasterize all you want", then I'm sorry... that isn't really an argument for a tech. "It's difficult to use when you want to improve it" is exactly what I suspect about RT too, code-wise and dev-wise, until it gets good for all of us. And I don't see the sense in arguing that just because it's difficult to do more with what we have, it's a better tech than adopting something that does the same thing with less work once you've got a grip on it. That's anti-innovative, tbh. Innovation sells products; whatever stays the same has to be handled with planned obsolescence. Business logic, part four.

    And I'm fairly sure that if everything in PC gaming were treated that way, we'd still be playing 2.5D games. Nobody would have bothered to make a 3D engine when you could have put more things happening in Super Mario's background.

    As for planned obsolescence... yeah, sure, it's a tech. To expect ANYTHING else is naive. Or what do you think other companies are doing? Light bulbs that could have burned forever were gone within the first decade after they were invented. Batteries glued into phones instead of being made easily replaceable. This is a basic principle of our time and our world, since it's driven by capitalist companies... they all use designs and materials that are as good as they need to be, not as good as they could be; that's fundamental to everything we humans do. Business logic, part five: greed is what drives a company. Ngreedia is as guilty of it as Intel, or IBM, or AMD, or whichever company you want to name... or do you think AMD would even exist if there weren't somebody behind it who wanted to make money off their work? Altruism is NOT the way of the world, unfortunately.


    For a second, picture this: GPU companies just push for 4K performance, let's say 4K at 60 fps under the current use of rasterization. Nobody pushes rasterization further because "it needs more work", or is limited by the engineering hours put into it. Once that threshold is met, a GPU company, or both of them, can sell cards. And after that... it's just replacement of cards that break down. That's the end of every company that produces GPUs, since mere replacement does not make enough business in the long run.

    Other variant: GPU companies push neither 4K nor a new feature, and devs start to use more rasterization, but there's nothing to run it on. That's the death of the dev, who puts engineering hours and money into a feature nobody can have, since there simply is no GPU to run it.

    Third way: a new feature within reach of most people. More money to be made by GPU makers, and devs might have games of outstanding quality in the first place while other games haven't adopted it yet. The same goes for the platform itself... PC gets RT, consoles don't. If you want RT, you need to buy a PC... that could be a selling point, a feature that makes somebody think twice about whether to buy a $500 console or just put another $500 GPU into their rig.


    And there the nature of the basic issue that plagues it all comes out: greed is a desperate search for growth, since staying the same actually risks losing, dying. That's the model all biological life is based on: grow (replicate, evolve) to counter death. And why should we humans, a company, or this single situation be based on another form of that basic principle, or lead to another outcome? Understand the nature of the beast and what it's capable of, and then don't be surprised when what you expect actually happens.



    Honestly, anything other than how it went down over the last two years would not, in my personal hindsight, have made much sense business-wise. And since they're all businesses and we're just consumers, not customers... it's pretty much perfectly sensible to push a new feature (RT) instead of higher resolution.
    I agree with you, and I'm unhappy about RT's performance on current cards, and most likely on the next gen's and the generation after that, but that doesn't make RT a bad thing in itself. Blockchain tech isn't bad just because it's used for the worst thing humankind can do: feeding greed once again. 100% on your side that the RT hardware doesn't perform, or that its adoption in HW came too early.

    But that doesn't make pushing out yet another plain GPU much better... it would just have postponed the situation we're in now by a few years. And when the hardware can do said double performance, as in 4K 60 fps with RT, there are still ways, and I believe they will be used, to run even more RT effects so that the hardware once again meets its master. It will be the same thing over and over again... just as there will be games a few years from now that bring even a 4K-60-fps-capable system to a stutter.

    And going back to what the post I responded to actually said, that RT is obsolete: it remains valid, from a business point of view, that it is far from obsolete; only a consumer could think that, no company would.

    So we're just in that little moment where it might not make sense, but zoom out just a bit more and you see that this is all perfectly logical and couldn't have gone down any other way. RT was introduced; it might become a thing or not, but a new feature is not itself obsolete, and it's the first really new thing in graphics in the last decade. And it's what we got... no arguing can change that.
     
    Maddness likes this.
  14. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    I don't agree with this - an 18.6B-transistor Pascal would simply be limited by power and run at lower clocks, the same way the 2080 Ti is. Swapping those RT cores for CUDA cores wouldn't magically increase performance, and Nvidia's architecture isn't like AMD's, where it's run inefficiently (underclocking on AMD is significantly more effective than on Nvidia). The only argument against RT and the increased transistor count is cost. You couldn't turn those transistors into more performance, and I think Nvidia knew it would be a manageable problem at 7nm, where you're limited to ~400 mm² dies at 300W. By the time they ship 7nm GPUs, yields will be equivalent to 12nm, and by then they have this feature that they've made almost necessary and have a head start in it.
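
    A rough way to see the power side of that argument, using a toy dynamic-power model (P roughly tracks active transistor count x voltage squared x clock; the numbers are illustrative, not measured):
    Code:
        def relative_power(n_ratio, v_ratio=1.0, f_ratio=1.0):
            """Power of a scaled-up chip relative to the baseline."""
            return n_ratio * v_ratio ** 2 * f_ratio

        baseline_tdp = 250.0            # GTX 1080 Ti class board power, watts
        n_ratio = 18.6 / 11.8           # scaling Pascal up to Turing's budget

        same_clocks = baseline_tdp * relative_power(n_ratio)
        print(f"Same clocks and voltage: ~{same_clocks:.0f} W")   # well past 300 W

        # To stay near a 300 W budget on the same node, clocks (and a bit of
        # voltage) would have to come down - which is the point made above.
        f_needed = 300.0 / (baseline_tdp * n_ratio)
        print(f"Clock scale needed at constant voltage: ~{f_needed:.2f}x")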
     
  15. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,011
    Likes Received:
    7,353
    GPU:
    GTX 1080ti
    Nvidia isn't doing 7nm though; they are doing 7nm EUV, jumping directly from a non-EUV part.

    And they are doing this because they want to do 7nm right the first time and not end up with a power-limited, high-degradation process on their extreme high-end chips.

    AMD, on the other hand, don't have the money to take that risk; that's why they are stepping their way towards a 7nm EUV process, which, so long as Zen 3 makes it, means more efficient CPUs and higher clock rates are coming. Soon. (But we're not talking about CPUs.)

    The RT cores are a pittance compared to the other changes to the die between Pascal and Turing; most of the size comes from the tensor area and making it do double duty as FP16/INT16.
     

  16. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    9,636
    Likes Received:
    3,413
    GPU:
    NVIDIA RTX 4070 Ti
    They have to, cuz people paid a premium for it.
     
  17. SniperX

    SniperX Member Guru

    Messages:
    149
    Likes Received:
    80
    GPU:
    MSI RTX2080S Trio
    Sweeeeeet!
     
  18. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    To the bold part: We think much the same; the issue is a difference in scope. The current performance of the 2080 Ti is insufficient to even pull off multiple effects at 1080p at the same time.
    In BF, they use a kind of variable ray density depending on the objects, to save hit checks (sketched below). Doubling RT performance can easily be gobbled up just by reducing the need for optimizations like that.
    Make an RT GPU 10 times as strong, and BF would run only marginally better, because they would not be pressed to optimize as much. It would look better as a result too. But not by much.
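
    For what it's worth, that variable ray density is easy to picture as a per-pixel ray budget driven by the material; a toy sketch (purely illustrative, not how Battlefield V actually implements it):
    Code:
        def rays_for_pixel(roughness, reflectivity, budget_per_pixel=4):
            """Decide how many reflection rays to spend on this pixel."""
            if reflectivity < 0.05:        # matte surface: reflections invisible
                return 0
            # smoother + more reflective surfaces get more of the budget
            importance = reflectivity * (1.0 - roughness)
            return max(1, round(budget_per_pixel * importance))

        surfaces = [
            ("wet asphalt puddle", 0.05, 0.90),
            ("brushed metal",      0.40, 0.70),
            ("dry concrete",       0.90, 0.02),
        ]
        for name, rough, refl in surfaces:
            print(f"{name:>18}: {rays_for_pixel(rough, refl)} rays")
    With more RT throughput, the budget and the thresholds simply get raised rather than the image getting dramatically cheaper to render, which is the point above.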

    Remember those comparison screenshots I was pointing out, where RT missed a lot of geometry in the reflections? Like an entire barn gone. That was due to overly heavy optimization. It would not happen with a stronger RT GPU.

    To the underlined part: Surely there is no obsolescence for the only approach that enables truly realistic computer graphics. If anything, it felt DoA/undesirable in its given state/shape.

    To other sentiments:
    - We agree that nVidia introduced RT to games to keep sales going. But nVidia did it as spit in the face of the loyal clients who kept them in business for so long. Those clients could finally have gotten a GPU that stood tall and proud at 4K. Instead they got RT.
    - To the obsolescence: I somewhat disagree, because I look at the scale. With the traditional rasterization approach we were very close to hitting the wall where developers would have a very hard time making a game that brings top cards to their knees. But with RT, we are at maybe 0.01~0.1% of the scale needed for full RT at photo-realistic quality in 4K. How big, how strong a GPU can even be made? Will 5nm get us to maybe 2~3% of that scale? I believe that in a few generations RT at 1080p will deliver good IQ with all the bells and whistles (just not the photorealism seen in movies). But any HW before that point will be made obsolete by the games that come after it. And that's the issue I have: this rate of obsolescence is damn high. Or else progress will be too slow for it to be good any time soon. It's like the era of the HD 3870/4870/5870, but with much bigger jumps and not stopping for the next 10 years.

    There is a good chance that there will be one or two generations where people will love it. And then they'll get tired and give up on RT because of the cost requirements.
    Because in the worst-case scenario the price will be pretty steep:
    - There are 10 RT-enabled games released in a given year, and a person upgrades their GPU ($600) for them.
    - They end up buying just 4 of those RT games, since the others are not of interest to them.
    - They buy another 10~20 non-RT games that year that did not require a GPU upgrade at all.
    - The next year there are another 10 RT-enabled games, but since the person bought the $600 GPU and not the $1000 one, those will run poorly with RTX on.
    - As a result, playing those 4 RT-enabled games came at a $150 premium each, since the new GPU had no additional use over the old one (worked out in the quick sketch below).
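
    The arithmetic behind that last bullet, spelled out with the toy numbers from the list above:
    Code:
        gpu_upgrade_cost = 600          # dollars spent purely to enable RTX
        rt_games_played = 4
        premium = gpu_upgrade_cost / rt_games_played
        print(f"Effective RTX premium: ${premium:.0f} per game")   # $150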

    The above is pretty discouraging. And I wrote the following at the start too: if nVidia had made an add-in, RT-only card, I would go for it as soon as I was interested in playing my first RT-enabled game. But it has not been designed that way. Or at least I think so.
    - - - -
    The number of advantages RT has is unbelievable. But first there needs to be a critical level of performance to work with.
    The best one for me is that once we have full RT without rasterization, there is direct control over fps vs. IQ: just set a time limit for each frame, and the moment the card has worked on the frame for that amount of time, it is sent out. If the card is too weak for the given fps, there will be noise/per-pixel inaccuracy. But that's in the land of maybe 20 times the performance we have now. (While a dedicated RT card may have been there already.) A rough sketch of that frame-budget idea follows below.
    So: more CUDA cores, more TMUs, ROPs, ... all of that managed to increase performance pretty well each generation.
    But here it would magically be the end of the road, right? Sorry, I am not persuaded by that. It is a weak argument.
    The truth is that an 18.6B-transistor Pascal vs. Turing would eat about the same power in a situation where all transistors are active. And the truth is that even if Pascal ended up a little more power hungry, nVidia had the 7nm option. That's why your argument is weak; it has no basis at all. Sorry.
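
    The frame-budget idea mentioned above, sketched as a toy render loop (hypothetical pseudo-renderer: a weak card simply ships noisier frames instead of dropping below the target fps):
    Code:
        import random
        import time

        def trace_one_pass():
            """Stand-in for one low-sample tracing pass over the image."""
            time.sleep(0.002)           # pretend each pass costs ~2 ms
            return random.random()      # pretend result of the pass

        def render_frame(target_fps=60):
            budget = 1.0 / target_fps
            start = time.perf_counter()
            samples = []
            while time.perf_counter() - start < budget:
                samples.append(trace_one_pass())
            # fewer accumulated passes -> more residual noise, but on time
            return sum(samples) / len(samples), len(samples)

        _, passes = render_frame()
        print(f"Frame shipped after {passes} passes (noisier if that number is low)")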
     
    airbud7 likes this.
  19. Serotonin

    Serotonin Ancient Guru

    Messages:
    4,578
    Likes Received:
    2,036
    GPU:
    Asus RTX 4080 16GB
    I own an RTX card and I want to see differences and feel justified in my purchase. But... the minimal differences I do see seem like things I could achieve with Nvidia's digital vibrance and some contrast tweaking.
     
  20. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    How do you plan on adding all of that when the previous generation was 300W and you aren't doing a die shrink? The 2080 Ti is as good as it gets: cut the RT/tensor cores and the performance will be the same no matter how you optimize it, because it's limited by TDP. You couldn't magically add more ROPs/shaders without blowing past 300W.

    7nm in October of last year? The high-performance node wasn't even ready. They chose to wait for 7nm EUV, a process that isn't a massive regression in yield.
     
    airbud7 likes this.
