Nvidia announces Turing architecture for GPUs: Quadro RTX 8000, 6000, 5000

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 14, 2018.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    38,529
    Likes Received:
    7,120
    GPU:
    AMD | NVIDIA
    jura11 and fantaskarsef like this.
  2. cryohellinc

    cryohellinc Ancient Guru

    Messages:
    3,215
    Likes Received:
    2,448
    GPU:
    RX 5700 XT/GTX 1060
    Looking forward to next week!
     
  3. -Tj-

    -Tj- Ancient Guru

    Messages:
    16,874
    Likes Received:
    1,776
    GPU:
    Zotac GTX980Ti OC
    Right, I said it will have Tensor cores. ;)
     
    fantaskarsef likes this.
  4. HardwareCaps

    HardwareCaps Master Guru

    Messages:
    452
    Likes Received:
    154
    GPU:
    x
Pricing looks great.... $10K for the RTX 8000!! Gotta love competition
     
    Last edited: Aug 14, 2018

  5. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,990
    Likes Received:
    674
    GPU:
    2070 Super
    • Huge amount of new features in both hardware and software.
    • Monst.... no, more like Frankenstein GPU
    • REAL TIME RAY TRACING IS HERE!

     
    fantaskarsef likes this.
  6. jortego128

    jortego128 Member Guru

    Messages:
    107
    Likes Received:
    15
    GPU:
    AMD RX 580 4GB
A lot of hype... but man, if this lives up to the hype, AMD has a mountain to climb to catch up...
     
  7. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,127
    Likes Received:
    1,754
    GPU:
    HIS R9 290
It's hard for me to know how much of this is legitimately amazing versus Huang just spewing superlatives and patting himself on the back. I'm not questioning whether Turing will be a great architecture; I know it will be. But I don't think it's as revolutionary as he makes it out to be, particularly compared to existing hardware (let alone in affordability).
     
  8. tsunami231

    tsunami231 Ancient Guru

    Messages:
    10,527
    Likes Received:
    581
    GPU:
    EVGA 1070Ti Black
Isn't it the greatest? But hey, at least AMD's CPU department is actually worth a damn again. I still think "ATI" cards were better competition back when they were ATI, but then again, what do I know.
     
  9. Denial

    Denial Ancient Guru

    Messages:
    12,830
    Likes Received:
    2,115
    GPU:
    EVGA 1080Ti
Idk, the entire "revolutionary" bit is related to raytracing; other than that it's just more of the same with some slight architecture tweaks. I don't really find the word revolutionary synonymous with affordable.

If raytracing catches on in games (which I think it will), then obviously Nvidia is ahead of the pack: not only did they basically push Microsoft to do DXR (so they know it best), but they've been practicing denoising algorithms (the means for its acceleration) for the last half decade, and they're obviously the first to put dedicated hardware in a GPU that can accelerate DXR. For game developers DXR is a no-brainer, as it considerably reduces the workload on artists to come up with rendering hacks and can potentially significantly increase image quality. I assume this will go for whatever the Vulkan equivalent to DXR is.

    It will take time for game engines to start including DXR functionality and perhaps a generation or two before it's considered "mainstream" but I think it's definitely the future and if Nvidia's architecture accelerates it then I think it's safe to say what they are doing here is revolutionary, even though it may not lead to immediate benefits.
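The denoising point above can be sketched with a toy Monte Carlo example. Real-time ray tracing only affords a handful of rays per pixel, so the raw result is noisy; denoising (what the tensor cores help accelerate) recovers a clean image from it. This is purely illustrative, not any real graphics API:

```python
import random

def estimate_visibility(true_visibility, n_rays, rng):
    """Average n_rays binary shadow-ray samples (1 = ray reached the light)."""
    hits = sum(1 for _ in range(n_rays) if rng.random() < true_visibility)
    return hits / n_rays

rng = random.Random(42)

# At 1 ray per pixel, each pixel's estimate is pure black or white (0.0 or 1.0),
# even though the true light visibility here is 70%. That speckle is the noise.
one_ray = [estimate_visibility(0.7, 1, rng) for _ in range(8)]

# With thousands of rays the estimate converges near 0.7, but that's far too
# slow for real time; hence denoising a low-sample image instead.
many_rays = estimate_visibility(0.7, 4096, rng)
```

The gap between the one-ray speckle and the converged value is exactly what a denoiser has to bridge each frame.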
     
    fantaskarsef likes this.
  10. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,127
    Likes Received:
    1,754
    GPU:
    HIS R9 290
    I didn't suggest revolutionary was synonymous with affordability... I'm saying I think Huang is exaggerating how great Turing is, and something becomes a lot less impressive or interesting once you find out how expensive it is.
If raytracing really takes off, it's going to take a long while. It all comes down to how it's implemented. If it ends up being too demanding on hardware without tensor cores, it's going to suffer the same fate as PhysX. If current or next-gen consoles don't use DXR, very few PC games will either.
    So yes, if raytracing becomes a mainstream reality, Nvidia will have a major head-start. But that's a really big "if".
     

  11. Denial

    Denial Ancient Guru

    Messages:
    12,830
    Likes Received:
    2,115
    GPU:
    EVGA 1080Ti
AMD can already accelerate DXR with mixed math on Vega. Mixed math will most likely be on all cards going forward, mainstream and high-end, for both companies; the acceleration from mixed math won't be to the same degree as tensor cores, but it will still be faster than FP32. I imagine games will have toggles for various effects that utilize DXR, so depending on the level of acceleration some cards will be able to flip everything on, others not. Eventually, once mainstream cards all have proper acceleration to some industry-defined level, all games will utilize DXR for their lighting systems. Raytracing just simplifies the process so much that it's a no-brainer. It's not really an if, it's a when: you need the hardware to write the software.

If the 2080 is based on GV102 then it's as physically big a card as Nvidia can build, in which case it's replacing the Ti model and there will be no Ti unless it's on some kind of die-shrink refresh. $700 for a Ti replacement doesn't seem that insane: same as the 1080 Ti at launch and only $50 more than the 980 Ti.
     
    Last edited: Aug 14, 2018
  12. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,990
    Likes Received:
    674
    GPU:
    2070 Super
  13. alanm

    alanm Ancient Guru

    Messages:
    9,367
    Likes Received:
    1,610
    GPU:
    Asus 2080 Dual OC
    I wonder if he's not happy at Intel and looking for a new job at Nvidia.
     
    fantaskarsef likes this.
  14. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,990
    Likes Received:
    674
    GPU:
    2070 Super
    Too early for that. He's still checking out the quality of the office ladies :)

    And being a sport when it comes to competition.
     
    fantaskarsef likes this.
  15. Fox2232

    Fox2232 Ancient Guru

    Messages:
    10,502
    Likes Received:
    2,514
    GPU:
    5700XT+AW@240Hz
    If Huang had as many fingers as there are games which benefit from those technologies now, he would be fingerless.
But it is nice that nVidia for once has a HW implementation to support a standard at the time the standard comes out. Historically, it was other companies who invested time and transistors in new technologies, while nVidia held industry development back by not investing.
ATi consistently delivered better IQ and higher DX HW support sooner. One of those striking moments was the GF4 Titanium era. Powerful cards from nVidia, with IQ comparable to ATi's... as long as the game was DX 8.0 only, because ATi already had DX 8.1 and there were some games using it. And that was not the worst thing: nVidia released tons of DX7-only cards in the GF4 line. That held game development back for at least two years, as people would not just replace their new DX7 cards, which performed reasonably well.
     
    Last edited: Aug 15, 2018
    carnivore likes this.

  16. pharma

    pharma Maha Guru

    Messages:
    1,306
    Likes Received:
    221
    GPU:
    Asus Strix GTX 1080
New hardware is usually required when raising the bar to introduce a new method of creating graphical realism. Nvidia has never been a slouch regarding investment and has been dedicating close to a third of its revenue to development efforts. The lingering question is whether the competition (new and old) can keep up, since the benefits of RT cores and tensor cores will continue to evolve. From industry comments, it seems this was the "Holy Grail" that needed to be reached for true realism in gaming.
     
  17. pharma

    pharma Maha Guru

    Messages:
    1,306
    Likes Received:
    221
    GPU:
    Asus Strix GTX 1080
  18. pharma

    pharma Maha Guru

    Messages:
    1,306
    Likes Received:
    221
    GPU:
    Asus Strix GTX 1080
  19. angelgraves13

    angelgraves13 Ancient Guru

    Messages:
    1,974
    Likes Received:
    524
    GPU:
    RTX 2080 Ti FE
So is Ampere next-gen? Turing looks pretty cool, but I doubt it will be much good for gamers until the tech matures. Maybe a 7nm refresh can put Turing on steroids.
     
  20. pharma

    pharma Maha Guru

    Messages:
    1,306
    Likes Received:
    221
    GPU:
    Asus Strix GTX 1080
     
    Noisiv likes this.
