Nvidia Turing GeForce 2080 (Ti) architecture review

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 14, 2018.

  1. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,451
    Likes Received:
    3,071
    GPU:
    7900xtx/7900xt
    First and foremost, TSMC is open to any contract work.

    However, as I've said and some people refuse to hear or acknowledge, 7nm production is restricted to those companies that invested in the infrastructure of the new fab. It's really very simple.
    The so-called "increased costs" associated with the new node are no different from the costs associated with other new nodes at other times, adjusted for inflation.

    Having something "new" for Christmas is never a priority for either AMD or Nvidia; there have been many Christmases with nothing new, like last year and the year before.
    It is about herding cats (i.e. design validation, fab prep, fab shipping, AIBs getting GPUs and then shipping cards, etc.).

    When the job is done well (i.e. Pascal), the game developers are under NDA for months with ES cards, the product is widely available upon release (or shortly thereafter), and the final product is an improvement, usually a marked improvement.
    That is not Turing.

    As for AMD going all in on 7nm, from their position in the market, why the hell not? They are not going to catch Nvidia at its own game, so they are playing another game...called leapfrog. And it's scaring the hell out of Nvidia and Intel.

    Personally I have no axe to grind; I'm invested in Nvidia, Intel, TSMC, and AMD, so I get copies of the quarterlies, which I actually read. I'm in the tech sector myself, I live in Silicon Valley less than 10 minutes away from AMD, Nvidia, and Intel, and I've been following these companies (or their predecessors) since the 1980s.
     
    Last edited: Sep 17, 2018
  2. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    You understand that what you said means that Nvidia chose not to go for 7nm, as this is both a new architecture and a new node.

    Your own first paragraph takes back what you said about Nvidia not being a first partner for 7nm; now you've basically said that they could have been.

    The fact of the matter seems to be that nothing complex (in either the CPU or GPU arena) can be made in any quantity this year.

    Please don't say the 7nm Vega 20, as production of those will be just for sampling.

    We will most likely see Ampere, or whatever Nvidia decides to call the Turing shrink, tape out some time in Q1.
     
  3. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,451
    Likes Received:
    3,071
    GPU:
    7900xtx/7900xt
    That is exactly what I mean.
    Nvidia chose not to change nodes and architecture at the same time. The RTX architecture has been in the works for quite some time and, IMHO, was originally slated for a later release.

    Nvidia was caught being complacent...as many companies are when they dominate a market. They were not alone.

    Intel is still having problems with 14nm at this very minute, and their 10nm (honestly a 7nm equivalent) is vaporware.

    Everyone except Apple, AMD, and Qualcomm has been dumbfounded at the progress TSMC has made and is being "left in the dust".

    The entire new iPhone line/refresh is using TSMC 7nm, which by itself will outnumber Nvidia's entire two-year production in a matter of months. AMD will out-produce all Nvidia production (counting CPUs), as will Qualcomm (by a factor of at least ten...Android, dontcha know, and CPUs).
    These are facts.

    But I totally agree with you re: Vega 7nm (not 20) consumer models. Navi will be extremely compelling mid-market and will make anything below and up to the 2080 non-competitive (without ray tracing).

    But $300-$400 for 4K FreeSync gameplay is more than compelling.
     
  4. Orwellswift

    Orwellswift Guest

    Messages:
    2
    Likes Received:
    2
    GPU:
    gtx-960/4GB
    Just out of curiosity, because so far nobody has mentioned this: is this tech a step in the direction of rendering light and shadows appropriately for viewing with an HMD, specifically for stereo VR, where you have to have everything perfect or it will make you hurl? I have been using Nvidia's stereo 3D tech since they introduced their glasses somewhere around 2009, and some games definitely do better than others. Most issues come from disparities in how the two offset views receive light and shadow info, which meant that most of the time, if you turned off shadows and extra light processing like bloom, you would get a perfectly rendered 3D scene. It is my hope that ray tracing will greatly enhance the scale of what is possible, rendering 3D objects that can actually be seen in stereo 3D instead of being "flattened" into 2D, or as I like to call it, "pirate vision".
     

  5. keromyaou

    keromyaou Member Guru

    Messages:
    103
    Likes Received:
    2
    GPU:
    EVGA RTX3080 XC3 Ul
    Apparently Square Enix mistakenly posted a Final Fantasy XV benchmark result including a Turing GPU. It has since been deleted, but somebody managed to save the data, although I am not sure whether it is genuine (http://blog.livedoor.jp/bluejay01-review/archives/54156401.html#more). The original language is Japanese, but you can understand the figures.
     
  6. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
  7. cowie

    cowie Ancient Guru

    Messages:
    13,276
    Likes Received:
    357
    GPU:
    GTX
  8. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,451
    Likes Received:
    3,071
    GPU:
    7900xtx/7900xt
    Actually, no it's not.
    Tariffs are bad for business, period. Arbitrary tariffs (like these) even more so.
    Most chip fabs aren't in China, but a hell of a lot of AIBs make their cards in China and would be subject to these tariffs.
    The worst hit will be Apple and Intel (motherboards).
     
  9. cowie

    cowie Ancient Guru

    Messages:
    13,276
    Likes Received:
    357
    GPU:
    GTX
    No, not that it's fake news; it's BS to pay more when it's too much already.
    If it were a normal price, I could live with it.
     
  10. alanm

    alanm Ancient Guru

    Messages:
    12,270
    Likes Received:
    4,472
    GPU:
    RTX 4080
    Trump's trade tariffs take effect September 24th?
     

  11. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
  12. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,465
    Likes Received:
    2,576
    GPU:
    ROG RTX 6090 Ultra
    Amazing chart, @Noisiv!

    It's remarkable how, in just 20 years, technology evolved from 10-15 million transistors to 12-21 billion transistors!
    And yes, while growth is starting to level off as we approach the atomic scale, I still wonder how much more powerful computing chips will be in another 20 years.
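    Out of curiosity, those endpoints can be turned into a rough growth rate. A minimal Python sketch, taking 15 million and 21 billion as the two endpoints quoted above (the exact figures depend on which chips you pick):

```python
import math

# Figures quoted above: ~15 million transistors 20 years ago, ~21 billion now.
start, end, years = 15e6, 21e9, 20

growth = end / start               # total growth factor over the period
doublings = math.log2(growth)      # how many doublings that represents
doubling_time = years / doublings  # years per doubling

print(f"{growth:.0f}x growth, roughly one doubling every {doubling_time:.1f} years")
```

    Which lines up with the classic Moore's-law cadence of roughly two years per doubling.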
     
  13. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Unless new physics is involved, not by as much as most people think. That's why tricks like DLSS or those Vega primitive shaders are so important.
     
  14. proFits

    proFits Guest

    Messages:
    5,866
    Likes Received:
    3
    GPU:
    RTX 2080
    I was looking for this exact chart yesterday, thank you so much!
     
  15. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
    (If you call something "BS", that implies it's fake. Thought I'd point that out; English is not everyone's first language, so it's understandable.)

    In other news, reviews for the new cards are out tomorrow, right? Tomorrow is the end of the NDA, right?! Looking forward to reading them! What time tomorrow does the NDA lift (incl. time zone)?
     
    Solfaur likes this.

  16. cowie

    cowie Ancient Guru

    Messages:
    13,276
    Likes Received:
    357
    GPU:
    GTX
    ok
     
    Robbo9999 likes this.
  17. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
    Ah, if that's the case, then here in the UK the NDA will lift at 2pm, because Finland is two hours ahead of the UK when UK is on British Summer Time (BST). Will be some interesting afternoon reading! :)
     
  18. user1

    user1 Ancient Guru

    Messages:
    2,782
    Likes Received:
    1,305
    GPU:
    Mi25/IGP
    To those that think the price is too high,
    [IMG]

    Basically, the deal is that GPU designs require exponentially more transistors,
    and the cost per transistor is no longer falling with every node shrink the way it used to.

    The cost per transistor has remained the same or increased since 28nm.

    An example of the effect this has:

    The 980 Ti has 8 billion transistors, and the 2080 Ti now has 18.6 billion transistors, so it should be no surprise that it costs a lot more (even without yield considerations).

    This is going to be the future of GPUs across the board, I'm afraid (AMD and Nvidia alike), so long as transistor counts keep increasing and no major breakthroughs occur on the physics side of things.

    Price will increase with performance from now on, for the most part; at least it seems that way.
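    A minimal sketch of that arithmetic; the dollar figure is a hypothetical placeholder, not a real foundry price, and only the transistor counts come from the post above:

```python
# If cost per transistor stays flat across node shrinks, silicon cost
# scales linearly with transistor count.
COST_PER_BILLION = 10.0  # hypothetical $ per billion transistors (placeholder)

def die_cost(transistors_billions: float) -> float:
    """Silicon cost under a flat cost-per-transistor assumption."""
    return transistors_billions * COST_PER_BILLION

cost_980ti = die_cost(8.0)    # GTX 980 Ti: 8 billion transistors
cost_2080ti = die_cost(18.6)  # RTX 2080 Ti: 18.6 billion transistors

print(f"Relative silicon cost: {cost_2080ti / cost_980ti:.2f}x")
```

    Under that assumption the silicon alone costs about 2.3x as much, before yields are even considered.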
     
    BangTail and Robbo9999 like this.
  19. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    That was the Tesla architecture. The first dedicated compute card from Nvidia was also named "Tesla". The Tesla architecture was used for the GF8 series, GF9 series, 200 series, and the (OEM-only) 300 series cards.
     
    Robbo9999 likes this.
  20. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
    Interesting, although we have to remember that the GPU silicon is only part of the cost of the card. You also have VRAM, power supply circuitry, the board, the cooling solution, assembly, and research costs as part of the final cost (along with other things I haven't listed). So doubling the production cost of the GPU core from the 980 Ti to the 2080 Ti won't make overall production costs twice as high; it depends on what percentage of the whole card's production cost the GPU core accounts for, and I don't know those numbers. (And I'm assuming the graph has been adjusted for inflation.)
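    That point can be sketched with placeholder numbers; none of these are real BOM figures, they just illustrate why doubling the die cost doesn't double the card cost:

```python
# The die is only one line item in the card's bill of materials, so
# doubling the die cost raises the total by well under 2x.
def card_cost(die: float, rest: float) -> float:
    """Total card cost = silicon + everything else (VRAM, VRM, PCB, cooler...)."""
    return die + rest

rest = 250.0                        # hypothetical non-silicon costs (unchanged)
old_total = card_cost(100.0, rest)  # hypothetical 980 Ti-class die cost
new_total = card_cost(200.0, rest)  # die cost doubled (2080 Ti-class)

print(f"Overall cost increase: {new_total / old_total:.2f}x")  # 1.29x, not 2x
```

    The smaller the die's share of the total, the more the increase gets diluted.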
     
