
Nvidia Turing GeForce 2080 (Ti) architecture review

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 14, 2018.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    33,259
    Likes Received:
    2,218
    GPU:
    AMD | NVIDIA
    It is almost time for us to present our full review of the GeForce RTX series graphics cards. But first, we'll take an architectural deep dive into the Turing graphics processors, and of course we can share all the specifications with you, the nitty-gritty stuff.

    Check the article here.
     
    alanm, XenthorX, fantaskarsef and 2 others like this.
  2. Darksword

    Darksword Member

    Messages:
    18
    Likes Received:
    3
    GPU:
    MSI Gaming 980 Ti
    I like the look of the Duke better than the Gaming Trio.
     
    Robbo9999 and SniperX like this.
  3. Fox2232

    Fox2232 Ancient Guru

    Messages:
    7,343
    Likes Received:
    775
    GPU:
    -NDA +AW@240Hz
    DLSS =
    - Take the angle at which two edges intersect
    - take the color information at the edge
    - load the resulting pixel values from a database

    Variable rate shading =
    - 1/2 precision
    - 1/4 precision
    - 1/8th precision
    - and lovely 1/16th precision
    This will certainly boost performance. Gamers will have a mandatory camera tracking their eyes...

    TSS =
    - Bake the results into a texture on the fly
    - use old information to skip actual work
    - update the baked-in texture from time to time (or at whatever rate you feel comfortable with)
    - probably would not be as bad as it seems if we had higher than 16x AF

    To sum it up: the new features can be paraphrased as "a way to trade IQ for a performance gain."
    Maybe good for 8K, somewhat OK for 4K. But the hit to per-pixel quality at 1440p will be unpleasant. At 1080p, unacceptable.
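    The shading-rate fractions above correspond to shading one result per pixel block instead of one per pixel. A minimal back-of-the-envelope sketch of the work reduction (the resolution and block sizes here are my own illustrative picks, not anything NVIDIA specifies):

```python
import math

def vrs_invocations(width, height, block_w, block_h):
    # Variable rate shading runs one fragment-shader invocation per
    # block_w x block_h pixel block instead of one per pixel.
    return math.ceil(width / block_w) * math.ceil(height / block_h)

full = vrs_invocations(2560, 1440, 1, 1)  # ordinary per-pixel shading
for bw, bh in [(1, 2), (2, 2), (2, 4), (4, 4)]:
    coarse = vrs_invocations(2560, 1440, bw, bh)
    print(f"{bw}x{bh} block: 1/{full // coarse} of the shading work")
```

    The four block shapes reproduce the 1/2, 1/4, 1/8 and 1/16 shading rates listed above; geometry and depth are still resolved per pixel, only the shading is coarsened.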
     
    Last edited: Sep 14, 2018
    carnivore, tunejunky and Dragam1337 like this.
  4. SniperX

    SniperX Member

    Messages:
    11
    Likes Received:
    8
    GPU:
    410M
    Same here, Duke all the way. Regarding the architecture, it's clear where the gaps are that will be filled by the 2080+ and the Titan X (it looks like there won't be a 2070 Ti). Anticipation for the reviews is at, like, an 11.
     

  5. -Tj-

    -Tj- Ancient Guru

    Messages:
    15,286
    Likes Received:
    813
    GPU:
    Zotac GTX980Ti OC
    Wow, this RTX 2080 is crippled like crazy on tensor cores compared to the Ti variant, never mind the 2070...
     
    fantaskarsef likes this.
  6. MK80

    MK80 Member

    Messages:
    21
    Likes Received:
    7
    GPU:
    Gtx1070
    OC/2200 MHz o_O nice
     
  7. Robbo9999

    Robbo9999 Maha Guru

    Messages:
    1,024
    Likes Received:
    107
    GPU:
    GTX1070 @2050Mhz
    Yeah, comparatively small gaps from the 2070 up to the 2080, and then the 2080 Ti comes along and massively widens the gap in specs.

    I like the look of the Duke in comparison with the other MSI card, as someone already mentioned, and I also like the Nvidia Founders aesthetic. Looking forward to the reviews; I will read up on the associated architecture tech a bit more once the card reviews are out. I understand some of it, but not all.
     
    tunejunky likes this.
  8. Monchis

    Monchis Maha Guru

    Messages:
    1,272
    Likes Received:
    28
    GPU:
    GTX 950
    The dlss comparison shot is really tiny.
     
  9. Kaarme

    Kaarme Maha Guru

    Messages:
    1,205
    Likes Received:
    183
    GPU:
    Sapphire 390
    At least this way 2070 won't be made using second-rate chips that weren't good enough for 2080.
     
  10. scoter man1

    scoter man1 Ancient Guru

    Messages:
    4,660
    Likes Received:
    38
    GPU:
    MSI GTX 970/Qnix 1440p
    C'mon HH, show us the numbers =D
     

  11. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,256
    Likes Received:
    780
    GPU:
    1080Ti H20
    @Hilbert Hagedoorn

    Thanks for the article.

    Can you post the full photo of DLSS comparison?

    Also can you comment on this

    "Another important change with NVLink SLI is that now each GPU can access the other's memory in a cache-coherent way, which lets them combine framebuffer sizes- something that was not possible with SLI before. The underlying reason is that the old SLI link was used to only transfer the final rendered frames to the master GPU, which would then combine them with its own frame data and then output the combined image on the monitor. In framebuffer-combined mode, each GPU will automatically route memory requests to the correct card no matter which GPU is asking for which chunk of memory."

    You said that RTX cannot share memory.
    Is this a software limitation where shared memory-access is limited to 'prosumer' cards?
     
    tunejunky likes this.
  12. Fox2232

    Fox2232 Ancient Guru

    Messages:
    7,343
    Likes Received:
    775
    GPU:
    -NDA +AW@240Hz
    It does not matter for gaming. Sharing even 1 GB of VRAM over the ~100 GB/s available to the 2080 Ti would add 10 ms to rendering time (ignoring the actual latency of the random accesses themselves).
    You would be better off paying for an extra 1 GB of VRAM on each card, and for an additional 32 bits on the IMC to drive that 1 GB.
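    The 10 ms figure is straight bandwidth arithmetic. A quick sketch of the claim (the ~100 GB/s link rate and 1 GB working set are the numbers assumed in the post):

```python
def transfer_time_ms(data_gb, link_gb_per_s):
    # Time to move data_gb gigabytes over a link_gb_per_s link,
    # ignoring per-access latency.
    return data_gb / link_gb_per_s * 1000.0

# 1 GB of remote VRAM over a ~100 GB/s NVLink connection:
print(transfer_time_ms(1, 100))   # -> 10.0 ms

# For scale: a 60 fps frame budget is only ~16.7 ms.
print(round(1000.0 / 60, 1))
```

    In other words, fetching remote VRAM at that rate would eat most of a 60 fps frame budget before any rendering work is counted.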
     
    tunejunky likes this.
  13. H83

    H83 Ancient Guru

    Messages:
    2,336
    Likes Received:
    172
    GPU:
    Asus ROG Strix GTX1070
    First of all, very nice article, I managed to understand most of it... Can't wait for the review!

    That Duke card is really a looker, although my Duke card looks even better. Too bad MSI managed to ruin the looks of the Gaming version... Nvidia's FE cards look a lot like the cards from Palit, at least to me.
    Also, $79 for an SLI bridge??? Nvidia needs to stop copying Apple's business practices...
     
  14. tunejunky

    tunejunky Master Guru

    Messages:
    402
    Likes Received:
    78
    GPU:
    gtx 1070 / gtx 1080ti
    Just wow at these prices.

    Given that there is no need for any of these products, the enthusiast market will quickly be tapped. And this is exactly why the 10xx series is slow-walking its EOL.

    Even a competitive gamer can buy a 1080 Ti (at its lowest-ever price) for half as much and have better performance in some scenarios.
    My gawd, you can even buy a 1080 and a G-Sync monitor together for less money... I'm shocked I can even say that.

    All of this just to beat out CES and AMD's announcement of the first mid-level card with high-end performance.
     
    Darren Hodgson and airbud7 like this.
  15. StewieTech

    StewieTech Chuck Norris

    Messages:
    2,445
    Likes Received:
    637
    GPU:
    MSI gtx 960 Gaming
    Yeah, what gives with these prices? They're stupid, man... :( For real, I can remember a time not too long ago when the top dawg went for 600 bucks.
     
    tunejunky and airbud7 like this.

  16. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    6,941
    Likes Received:
    78
    GPU:
    Sapphire 7970 Quadrobake
    Great job Hilbert!

    I know that everybody is justified in talking about the prices, but the tech itself looks crazy. I am genuinely impressed, especially considering that Nvidia didn't have to innovate at all. Pascal with 40% more shader engines would have been enough to satisfy everyone, and much cheaper to produce.
    Applause for the leather jacket.

    I expect the initial benchmarks to be underwhelming, as this is a new architecture, but once Nvidia has a good driver for it, it should be night and day compared with Pascal. The potential of the Tensor cores alone is insane.
     
    Maddness and Noisiv like this.
  17. wavetrex

    wavetrex Master Guru

    Messages:
    444
    Likes Received:
    191
    GPU:
    Zotac GTX1080 AMP!
    I can do ~2100 with my 1080 if I want to, and it runs at around 1930 out of the box in the games that I play.

    2200 is not impressive at all (but expected, since this is basically the same node, just slightly refined), and it will probably consume HUGE amounts of power at that frequency, given that Turing is a bigger chip with all those fancy semi-useless raytracing corez.

    Really curious what the benchmarks will show compared to SIMILARLY PRICED Pascal:
    so 2070 results vs the 1080 Ti, not vs the 1070.
     
  18. JamesSneed

    JamesSneed Master Guru

    Messages:
    342
    Likes Received:
    85
    GPU:
    GTX 1070
    Can't wait to see testing results. I have this sneaking suspicion that the improvements are not that great unless games are making use of new features. Just going off the onslaught of pre-launch marketing that we have seen.
     
    tunejunky likes this.
  19. Denial

    Denial Ancient Guru

    Messages:
    11,539
    Likes Received:
    518
    GPU:
    EVGA 1080Ti
    Haven't seen you in a bit - how are things going?

    I think it's more limited to prosumer workloads. The access latency across NVLink is too high to complete rasterization tasks in the time required for real-time frame rendering. With certain workloads like data modeling and such, the card doesn't have to spit out a frame in X ms; it can take longer, in which case this can be useful. There may be upcoming graphics techniques where this can be applied, but at the moment mGPU/SLI will continue to just share the framebuffer at the end.
     
    tunejunky, schmidtbag and airbud7 like this.
  20. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    6,941
    Likes Received:
    78
    GPU:
    Sapphire 7970 Quadrobake
    Hey man, just fine, but a different job role, I have missed you guys :)

    I actually believe that AMD will compete with all this very soon, just not at the ultra high end. 750+ mm² dies are an insane risk.
     
    tunejunky, airbud7 and Embra like this.
