PNY GeForce RTX 2080 and 2080 Ti Product Data Sheet Leaks - Reveals All

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 18, 2018.

  1. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    But, on top of the current hardware limitations and 30 fps more often than not, adding RT?! Just because it can, doesn't mean it should. Maybe in the next generation of consoles... I mean, even on PC they are adding it into the mix with rasterization, as a very small part of it, and it can be toggled on/off. Actually, the new Metro isn't even out yet; all we saw was a demo video.
    It's very new tech (for games). Adoption takes time. Just like with SSAO and Tessellation (most cards still struggle with it)... Hell, even DX12 sees most of its usage in the Windows 10 UI...
     
  2. Paulo Narciso

    Paulo Narciso Guest

    Messages:
    1,226
    Likes Received:
    36
    GPU:
    ASUS Strix GTX 1080 Ti
    Microsoft was selling DX12 as the fix for all problems, yet the tech is used in only a couple of games and it doesn't bring anything new.
    Vulkan is even worse. The only games worth mentioning are Doom and Wolfenstein.
     
  3. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Well yeah, obviously if you're expecting 100% adoption in the first few years you're going to be disappointed, but I think it's promising enough that devs will utilize it at a higher rate than previous technologies. In the professional rendering industry nearly every single first- and third-party renderer has pledged support for it (mostly through OptiX). It will eventually save artists a ton of time, and thus money, for games as well, on top of the overall visual improvements. I don't think it matters that it's mixed with rasterization, as it's alleviating the most difficult lighting tasks - ones that are often hard to replicate and performance-intensive anyway.

    No they weren't lol. Like, show me where Microsoft was selling DX12 as the fix for everything? All the articles that came out with DX12 said the exact opposite: that DX12 would be hard to adopt and would take a ton of time to come to fruition - only the larger, more experienced developers would even get anything out of it. That's why they continue to develop/support DX11 alongside DX12.

    Aside from the CPU overhead reduction, all DX12 does is give you deeper access to the hardware. The developer has to be the one that uses that level of access to improve performance and essentially out-optimize the driver developers at AMD/Nvidia. Thinking that was going to happen in any reasonable timeframe, or to any real extent, was only pushed by delusional forum-going gamers with no understanding of how difficult that level of software development is - it certainly wasn't said by Microsoft.
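
    To make "deeper access" concrete, here's a minimal sketch (mine, not from the post) of one thing the DX11 driver did for you that DX12 hands to the developer: resource state tracking. The helper name is hypothetical; the D3D12 calls are real.

    ```cpp
    #include <d3d12.h>

    // In DX11 the driver tracked resource states behind the scenes; in DX12 the
    // developer records explicit transitions. Keeping barriers minimal and correct
    // across a whole engine is part of "out-optimizing the driver".
    // Hypothetical helper: 'cmdList' and 'tex' come from the app's own setup.
    void TransitionForSampling(ID3D12GraphicsCommandList* cmdList, ID3D12Resource* tex)
    {
        D3D12_RESOURCE_BARRIER barrier = {};
        barrier.Type = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
        barrier.Transition.pResource   = tex;
        barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
        barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
        barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
        cmdList->ResourceBarrier(1, &barrier);
    }
    ```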
     
    Last edited: Aug 20, 2018
  4. Paulo Narciso

    Paulo Narciso Guest

    Messages:
    1,226
    Likes Received:
    36
    GPU:
    ASUS Strix GTX 1080 Ti
    So, they won't use DX12 because it's hard, but they will use DX12 ray tracing?
     

  5. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Some companies will use preconfigured libraries based on DX12, like Nvidia's RTX Gameworks systems and AMD equivalents. The bigger engines like UE/Unity/CryEngine/Frostbite/etc will do all the hard work, and the developers building games in those engines (which are the majority of developers now) will just call the DXR functions with a checkbox/switch/light-type/howeverthefucktheydecidetoimplement/etc in the engine. The hard part of DX12 is the optimization - you can easily wrap DX11 games in 12 and do no work, but then you're not getting any of the benefit of low-level optimization (which is the hard part). Read the article I linked in the previous post.

    Should also point out that you're conflating DXR adoption with DX12 adoption - if anything, DXR is just another selling point for devs to take the step to DX12, especially because of the artist time saved. There is an upfront cost in the switch to DX12, but the time saved in development afterwards is worth it - especially as raytrace acceleration performance improves and comes to other platforms.
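
    To make the "checkbox" point concrete, here's a minimal sketch (mine, not from the post) of the capability check an engine would bury under such a switch, using the real D3D12/DXR API:

    ```cpp
    #include <d3d12.h>

    // Returns true if the device/driver can run DXR pipelines (Tier 1.0+).
    // An engine would flip its "ray traced shadows/reflections" toggle on this.
    bool SupportsDXR(ID3D12Device* device)
    {
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                               &opts5, sizeof(opts5))))
            return false;
        return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
    }
    ```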
     
    Last edited: Aug 20, 2018
    fantaskarsef likes this.
  6. Paulo Narciso

    Paulo Narciso Guest

    Messages:
    1,226
    Likes Received:
    36
    GPU:
    ASUS Strix GTX 1080 Ti
    Honestly, I'm very sceptical about these "new" technologies that vendors try to sell. Over the years they have tried to sell new hardware by announcing wonders about technologies like PhysX, Tessellation, etc. And to this day they are hardly used.
    I think Nvidia is trying to justify putting tensor cores in a gaming card, instead of adding more CUDA cores, which would give a more tangible performance increase.
     
  7. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Tessellation is used in nearly every single game? PhysX (GPU accelerated) was hardware-locked to Nvidia, which is why it never saw adoption. That, plus physics systems themselves are a nightmare to implement for multiplayer and thus often not used anyway.

    The problem with just adding more CUDA cores is that game developers are running into a visual plateau with rasterization. The amount of work you need to put in to fake a shader into looking like a physically based representation is getting exponentially higher. Raytracing is just a paradigm shift that automatically gives us better results with less work (but less performance as well). Should also point out that the "RT core" is a misnomer: the "raytracing engine" is built into the SM and ALUs - the processing occurs entirely in the SM. The CUDA cores on Volta/Turing are larger due to a larger cache, which should give a general IPC uplift and would have happened regardless of raytracing. I keep seeing people thinking RT/Tensor cores are discrete cores - they aren't - think of it more like an ISA extension that utilizes the ALU concurrency featured in Volta/Turing.
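
    A toy illustration of that paradigm-shift point (a classic ray visibility query, not Turing's actual hardware path - the one-sphere scene and names are made up): with rays, "is this point in shadow?" is a single intersection test, instead of shadow maps plus resolution/bias workarounds.

    ```cpp
    #include <cmath>

    struct Vec3 { float x, y, z; };
    static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    // Shadow ray: does the segment from 'p' to 'lightPos' hit the sphere first?
    bool InShadow(Vec3 p, Vec3 lightPos, Vec3 center, float radius)
    {
        Vec3 d  = sub(lightPos, p);            // t=0 at p, t=1 at the light
        Vec3 oc = sub(p, center);
        float a = dot(d, d);
        float b = 2.0f * dot(oc, d);
        float c = dot(oc, oc) - radius * radius;
        float disc = b * b - 4.0f * a * c;
        if (disc < 0.0f) return false;         // ray line misses the sphere
        float t = (-b - std::sqrt(disc)) / (2.0f * a);
        return t > 1e-4f && t < 1.0f;          // occluder between point and light
    }
    ```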
     
  8. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    Well, having listened to that talk John Carmack gave back in 2016, shaders are being thrown out for many things. Even the first Rage used a lot of light sampling instead of baked shadows, as did the Doom reboot. What he said, though, was that material properties have to be updated if you want things to look real. Otherwise, it doesn't matter whether you raytrace it or not. Updating materials today (or rather back then, in 2015/2016) required a lot of work, involving laser readings etc.
    I assume companies like Epic and others that make engines will do the heavy lifting, but still... I mean, looking back at the first Crysis game, you can see how much benefit there is when you just do the materials more correctly. It will never look outdated like, say, Quake 2. :)
     
  9. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Well, in terms of photorealistic materials, the direction everyone is headed is photogrammetry. They are getting better at de-lighting the photogrammetry samples (this was the hardest, most labor-intensive part previously, but algorithms/software are helping to speed it up) and at handling the texture requirements. Eventually most photorealistic games will use Quixel-quality stuff everywhere. They'll probably also start using machine learning to create new textures with similar properties to those... as it's hard to get to planets like Mars to sample rocks/soil for the next Doom :p

    It's a combination of everything that brings higher visual quality - the industry is tackling every side. Lighting is the most challenging from a compute standpoint; that's why this DXR acceleration stuff is deemed so "revolutionary". We're obviously still far away from 100% scene path tracing, but this is a great start.
     
  10. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    Well, big studios get 99% of everything correct, at 30 minutes of render time per frame... (watch The Adventures of Tintin if you haven't already), so that is not the issue. The issue is how to do it at 60 fps, at least, without throwing an unimaginable number of transistors at it :D
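
    For scale (arithmetic added, not from the post): 30 minutes per frame is 1800 s, while 60 fps leaves 1/60 ≈ 0.0167 s per frame, so offline film rendering is roughly 1800 × 60 = 108,000 times slower than that real-time budget.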
     

  11. alanm

    alanm Ancient Guru

    Messages:
    12,269
    Likes Received:
    4,471
    GPU:
    RTX 4080
    Which games are expected to use RT over the next few months? Is Metro Exodus the only one announced?
     
  12. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    Afaik, yes, and even that is not for sure - it could be just a stunt.
     
  13. Goiur

    Goiur Maha Guru

    Messages:
    1,341
    Likes Received:
    632
    GPU:
    ASUS TUF RTX 4080
    Someone open the windows... too much smoke with this ray tracing crap lol

    WATCHOUT!!!! The 2070 is 6 times faster than the 1080 Ti in ray tracing ops... but speaking of fps, 20% behind.
     
    Last edited: Aug 20, 2018
    lucidus likes this.
  14. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super


     
  15. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    That's correct and I do agree. Technological improvements are important.

    But look back. There were DX7 cards which only did T&L. Games either supported both DX7 and DX8(.1), or the user had to use a wrapper to strip out the shader code.
    But even then, a DX8.1 game in DX7 mode ran well on DX7 hardware. It was the DX8(.1) HW which paid the additional cost, or ran quite a bit faster in DX7 mode.
    A similar thing happened with each DX step.

    Here, you have some new rendering techniques based on the strength of new HW which is not much better at the already existing rendering methods.
    So the new HW is likely to run current code just a bit better, as you wrote. But games implementing those new techniques will likely deliver quite a kick in the balls of older HW.
    It would be nice to see those features stay optional in games for at least 2~4 years, or until current high-end cards like the GTX 1080/Vega 56(64) become as weak as the lower mainstream of that time when those new things are OFF.

    Because if those new features become mandatory (and nVidia loves to enforce their technologies), you'll see current high-end HW at the level of next-gen entry level within one year. That's quite certain from the presentation showing the time cost of some of those features, and from nVidia's SIGGRAPH presentation where Huang clearly stated that a Turing Quadro does a similar amount of raytracing as four top Quadros of the last gen.

    Improvements are welcome, but not at the price of reducing a GPU capable of 140 fps today to 60 fps or worse.
     

  16. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,251
    Likes Received:
    232
    GPU:
    EVGA GTX 1080@2,025
    I think people flock to nVidia more because they sell cards with outstanding performance and cutting-edge features, and don't feed consumers bull$hit performance claims which never turn out to be true, like their competition has done for generations.
     
    nz3777 likes this.
  17. nz3777

    nz3777 Ancient Guru

    Messages:
    2,504
    Likes Received:
    215
    GPU:
    Gtx 980 Radeon 5500
    They have been leading the industry since I got onto the scene, which was the GTX 500 series (a long time ago). I'm sure some of you remember even older gen cards, but I follow it really closely. AMD sometimes (almost) tries to catch up, but I'm afraid to say Nvidia is just so far ahead at this point that there's no catching up to them anymore!
    I've been a loyal fan since my first GTX 580 and never thought about getting another AMD card.
     
  18. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    The GF 5 series, no offense, is recent history.
     
  19. Goiur

    Goiur Maha Guru

    Messages:
    1,341
    Likes Received:
    632
    GPU:
    ASUS TUF RTX 4080
    ^ And it was the worst Nvidia gen I can remember lol
     
  20. Paulo Narciso

    Paulo Narciso Guest

    Messages:
    1,226
    Likes Received:
    36
    GPU:
    ASUS Strix GTX 1080 Ti
    The GTX 480 was much worse, so hot that it could fry an egg :)
     
    gx-x likes this.
