AMD’s Lisa Su Hints that high-end 7nm NAVI GPUs are on the way

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 1, 2019.

  1. kings

    kings Member Guru

    Messages:
    158
    Likes Received:
    131
    GPU:
    GTX 980Ti / RX 580
    They did it from 980Ti to 1080Ti, 50%~70% depending on the resolution.

    And from 780Ti to 980Ti was around 40%~45%.

Whether it will happen or not in the next generation, no one knows, but it's not impossible or something never seen before!
     
    Last edited: Aug 1, 2019
  2. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,010
    Likes Received:
    4,385
    GPU:
    Asrock 7700XT
    That was a major architecture change, a more substantial die shrink, and a substantial improvement in VRAM. So far, it doesn't seem Nvidia is planning on another major improvement like that.
    The 1080Ti was the greatest thing Nvidia made since the 8800GTX Ultra. It's going to be a while until they pull that off again, especially while they don't have much competition right now.
    You're right - it isn't impossible, but it's very unlikely.


    That doesn't change my previous point...
Last I heard, those GPUs are based on the same architecture as the RTX models, except they don't have hardware-accelerated raytracing. We already know that Nvidia made some decent architectural improvements going from Pascal to Turing regardless of raytracing, so that's why I don't understand your pointing out the non-RT models.
    I'm definitely understanding your logic better now that you've spelled it out like that, but something still clearly isn't adding up:
The TU102 already has roughly triple the transistors of the TU116 (about 6% off) and is roughly triple the area, yet as you said, the performance is only roughly double. So, tripling up the TU116 basically gives you a TU102, just mildly better. You could pretty much call that what a 2080 Ti Super would have been.
    If Nvidia were to make the 3080Ti the same size die as a 2080Ti but on 7nm (implying a lot more transistors), even then, I'm not sure we'd see a 50% improvement, even with the performance benefits of a die shrink or any architectural enhancements.
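For anyone who wants to check the "roughly triple, about 6% off" ratio, it lines up with the public transistor counts for the two Turing dies (~18.6B for TU102, ~6.6B for TU116); a quick sketch:

```python
# Public (approximate) transistor counts for the two Turing dies
TU102_TRANSISTORS = 18.6e9
TU116_TRANSISTORS = 6.6e9

ratio = TU102_TRANSISTORS / TU116_TRANSISTORS  # ~2.82x, i.e. roughly triple
off_from_triple = 1 - ratio / 3                # ~6% short of an exact 3x

print(f"{ratio:.2f}x, {off_from_triple:.0%} off triple")  # prints: 2.82x, 6% off triple
```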
     
    Last edited: Aug 1, 2019
  3. chainy

    chainy Active Member

    Messages:
    68
    Likes Received:
    17
    GPU:
    970gtx
You've gotten so brainwashed by Nvidia's marketing that you no longer notice you reek of fanboyism. Nvidia tried to spoil the party with the S versions but failed; if they could've released on a smaller node, they would've...

Here's some reading for ya:

    https://www.reddit.com/r/Amd/comments/cgskk6/overclocked_5700xt_beats_rtx_2080_fe_in/

    https://www.overclock3d.net/news/gp..._xt_pushed_past_2_2ghz_through_soft-modding/1

And those are just the earliest attempts; with driver updates this will only get better...

The 5800s are also coming soon, so we'll see your claims go up in smoke ;)

Time to take off the green glasses and look objectively at things.
     
  4. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    A more powerful Navi would certainly be welcome. If they can create something similar to the 2080 for $100 cheaper then Nvidia will be in serious trouble. Let's hope it happens - Nvidia needs to be taught a lesson.

    His point was that certain games are outliers that throw off the overall results. An aggregate would also include those outliers.

    That being said, I don't think the 5700 XT is comparable to the 2070 S, nor do I see it as its direct competitor (direct competitor would be the 2060 S, based on price).
     

  5. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
Actually kind of curious to see how much Navi performance increases over time. The largest reason async compute helped GCN was essentially that idle gaps were being filled, but the entire purpose behind Navi's architecture changes was to remove those idle gaps. Has anyone done any async performance tests with Navi and compared them to GCN?
     
  6. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,010
    Likes Received:
    4,385
    GPU:
    Asrock 7700XT
The point is, there are a handful of games where the 2070S is pulling disproportionately far ahead, thus skewing the averages. In most cases, the GPUs are very comparable. I don't doubt the 2070S is currently better in Windows, but AMD traditionally has had immature drivers at release day, especially on a new-ish architecture. Given enough time and a decent cooler, the 5700 XT will become more evenly matched. Maybe not necessarily better, but very competitive.
If you compare the two GPUs on Linux, where there aren't as many (if any) game-specific optimization profiles, you'll find they're much more directly comparable. So, my point is: give the Windows drivers enough time to mature and the 2070S's massive leads will not be so massive anymore. I'm sure it will still be in the lead - there's no way AMD's drivers can close such a drastic performance difference entirely - but my point is both GPUs are very similar in capability.
You're moving the goalposts waaaay too much here. If Nvidia is so willing and ready to make their GPUs bigger, why haven't they already done so? There is a demand for it. AMD does so because they have a lot to compensate for. There must be a good reason Nvidia isn't going as big as they could be, and I'm sure that's not going to change a whole lot next-gen.
    I'm not saying next-gen won't be impressive or good, you're just being way too optimistic over something that, so far, doesn't appear to be a major change.
     
  7. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
I don't get what you mean here? Nvidia has made their GPUs bigger - they created their own FFN node with TSMC with an ~800mm² reticle - Volta/Turing are by far the biggest GPUs ever made, on a node custom-designed to be massive. He's saying if they go 7nm they can take those massive GPUs down to ~600mm² with an increased core count and it wouldn't be out of hand - mostly because they are waiting for 7nm EUV, which will already have a defect rate similar to 16nm.
     
  8. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,010
    Likes Received:
    4,385
    GPU:
    Asrock 7700XT
    Right, I get that, but my point is I don't think they're going to go down to 7nm and maintain a big 800mm2 die.

    Apparently so.
     
  9. Witcher29

    Witcher29 Ancient Guru

    Messages:
    1,708
    Likes Received:
    341
    GPU:
    3080 Gaming X Trio
AMD still has trash GPUs, nobody can say otherwise. AMD runs hot for sure - water cooling for the GPUs?
What's next, F-16 thrusters to cool down the GPU?

Here where I live in the Netherlands, AMD GPUs are pretty dead; nobody wants them.
Then again, their CPU line is growing here; that's a good thing.
     
  10. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
But the point is they don't have to? A 7nm 2080Ti would be ~550mm² - they could expand that chip to ~650mm² and it would still cost less than the 2080Ti, plus there would be power savings they could use for additional frequency and whatever other arch enhancements they come up with. If they really wanted to go drastic they could cut the tensor cores, go with dedicated FP16 and free up a ton of space - it's not like DLSS is in enough games worth mentioning anyway.
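The die-size-versus-cost trade-off here can be sanity-checked with the standard first-order dies-per-wafer formula plus a Poisson yield model. Note the wafer costs and defect densities below are placeholder assumptions for illustration, not real foundry figures:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """First-order estimate: gross wafer area / die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost: float, die_area_mm2: float,
                      defect_density_per_mm2: float) -> float:
    """Poisson yield model: yield = exp(-defect_density * area)."""
    yield_frac = math.exp(-defect_density_per_mm2 * die_area_mm2)
    return wafer_cost / (dies_per_wafer(die_area_mm2) * yield_frac)

# Placeholder inputs: a mature node is cheaper per wafer and has fewer defects.
mature_12nm = cost_per_good_die(wafer_cost=4000, die_area_mm2=754,
                                defect_density_per_mm2=0.0002)
new_7nm = cost_per_good_die(wafer_cost=9000, die_area_mm2=650,
                            defect_density_per_mm2=0.0005)
```

Whether the smaller 7nm die actually comes out cheaper hinges entirely on those two inputs, which is why the point about 7nm EUV reaching 16nm-like defect rates matters.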

    They are performing within 10% of their Nvidia counterparts for $100 less. That literally says otherwise.
     
    Last edited: Aug 1, 2019
    Maddness, carnivore and Loophole35 like this.

  11. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    I and countless others can say otherwise, as it's not factual that they are "trash".

They are not, for currently released GPUs, the fastest, and I think they could be cheaper, and they don't have all the features that Nvidia GPUs have, but that doesn't make them trash. They are still respectable, good-performing GPUs for what they are, and if they were a tad cheaper, I'd say they'd be blowing Nvidia away completely in the markets they are attacking (AKA not highest-end 2080 Ti territory).

As to the heat issue, you go straight to water because the default blower-style coolers aren't that great? See, this is the kind of statement that shows fanboyism, as you completely disregarded AIB coolers that haven't been released yet. If Nvidia was using blower-style coolers for their Founders Edition GPUs, they'd be hot as well. Now you could say that's where the issue is, and that AMD needs to provide a better default cooler, and I could say that's not necessarily untrue, but the cooler being inadequate does not mean the GPUs themselves run hot.

I like Nvidia GPUs, I like the features they have, but just because that's my preference doesn't mean others value Nvidia's features or prices the same way, and it doesn't make the current 5000 series of GPUs "trash".
Pretty sure you don't know all 17 million people in the Netherlands, and I can guarantee you aren't at all shops selling GPUs, online or otherwise, at the point of sale, or have access to their historical sales data, to back up that statement.
     
  12. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,010
    Likes Received:
    4,385
    GPU:
    Asrock 7700XT
    I know they don't have to, I never said they did. In fact, I was implying they aren't likely to. ttnuagmada was suggesting they might, and in doing so would yield that 50% performance increase over the 2080Ti. I was saying I think that is highly unlikely.
     
  13. waltc3

    waltc3 Maha Guru

    Messages:
    1,445
    Likes Received:
    562
    GPU:
    AMD 50th Ann 5700XT
AMD = Always On Time. Funny thing about early this year--when AMD formally announced the RVII, to let the world know that AMD had arrived at 7nm--first, Su gave a YouTube interview with a few Internet "pundits"--you know, the YouTube guys who "know everything" worth knowing (they believe)--and she as much as told every one of them to their faces to expect a lot more from AMD's GPU front this year--2019--I mean, it was so obvious when I watched it. She couldn't say more, of course, because she didn't want to undercut RVII sales right off the bat. But you know--not a one of those guys--not a single one caught it. I was amazed when, in the aftermath of that interview, I read comments by every one of those guys and not a one stated he had confidence that AMD was going to ship anything else in the way of GPUs in 2019! What can you say? She as much as told 'em. They just didn't listen, so enamored of their own opinions were they...!

    Make no mistake--Su intends to beat everybody, Intel and nVidia, or go down swinging. Scratch that last--she doesn't intend to lose, period. What a fortunate turn of events indeed for AMD--and how prescient of the Board, finally--to put the very person AMD desperately needed at the top of the stack--Su surrounds herself with competent people and is far from alone, but she knows where the company needs to continue to go, which is absolutely critical for any company! All I can say is that I find the 5700XT 50th Ann GPU to be mightily impressive on a number of fronts, the pundits everywhere could do a whole lot better when it comes to listening to Su about what AMD is going to do next--or I should say, "execute next" because right now AMD sits at the top of the heap when it comes to product execution, imo. Nobody does it better....;)
     
  14. cowie

    cowie Ancient Guru

    Messages:
    13,276
    Likes Received:
    357
    GPU:
    GTX
5800 is the number that puts a foot on Nvidia's neck most of the time.
I did not read her "hints", but it can't be too close; still, it would be great if it normalized prices for the higher-end cards before NV hits 7nm.
     
  15. nevcairiel

    nevcairiel Master Guru

    Messages:
    875
    Likes Received:
    369
    GPU:
    4090
Both AMD and NVIDIA would be happy to do chiplets for GPUs; both companies have done extensive research and published several whitepapers on it. But GPU cores are not like CPU cores. You can't just tie them together through some bus and have it work as if they were one die. The bandwidth and latency of such a connection would have to be insane to not degrade much of the gained performance - and there is the problem. Either we need much better on-package interconnects, or we need some really smart solution to not require that; either way, the tech is not there yet.
     
    Maddness likes this.

  16. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    I see all the NVidia fanboys are already worried..... So, how much are you guys paid to post in AMD related threads?
     
    carnivore and Loophole35 like this.
  17. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    You felt the need to respond to my post, which validates it, but failed to answer my question.

A "technology fanboy" would not put so much effort into attacking a tech company or its technology. You're either an NVidia fanboy or a shill. It's as simple as that.
     
    carnivore likes this.
  18. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
See, I like Nvidia, but that statement right there is why people will claim you're an Nvidia fanboy.

I mean really, 5 years? The RX 5700 XT isn't Navi's top-of-the-line GPU. 5 years ago we had the 900 series from Nvidia. Are you really trying to say the 5700 XT, let alone whatever comes after it, isn't better than the 980 Ti? Even though reviews show it to be twice as fast, sometimes reaching close to three times as fast?

Sure, you might be saying "I said architecture, not performance", but one: what does any architecture matter if it doesn't also bring performance? And two: at that point, how is Navi's architecture so bad that it is comparable to Maxwell?

I'm not saying that Nvidia at 7nm can't or won't bring out something that'll blow away the Navi cards - in fact, I expect it to - but 5 years, really? What do you have to back this up with?
     
  19. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,010
    Likes Received:
    4,385
    GPU:
    Asrock 7700XT
    Because it's not as big of a difference as you think it is.
    To give you the benefit of the doubt, let's say Nvidia gives us a healthy 25% increase for next-gen parts just by doing some optimizations and a die shrink, with the same transistor count.

7nm transistors are 58% the size of 12nm transistors; for the sake of argument, let's assume you can fit 42% more 7nm transistors in the same die area, and let's just say the dies are perfectly square and two-dimensional, to keep things simple and favor your perspective.
You need a 437mm² die to hold the same number of transistors as a 754mm² die at 12nm (7nm is 58% the size of 12nm; 437mm² is 58% of 754mm²).
437 is 28% smaller than the 600mm² die you propose, implying you can fit 28% more transistors than you could before.

At this point you're probably thinking "aha! 28% more transistors that are 25% faster is a 53% increase in performance!", but as discussed earlier, the TU102 already has 3x the transistor count of the TU116 but only 2x the performance. It's hard to say exactly how much performance you're getting, but it's definitely going to be less than 53% (or 50%).
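For anyone who wants to replay the arithmetic above, a quick sketch (the 0.58 scaling factor and the 754mm²/600mm² die sizes come from this thread; real density scaling is never this clean):

```python
TU102_AREA_MM2 = 754    # 2080Ti-class die at 12nm
AREA_SCALE_7NM = 0.58   # assumed linear 12nm -> 7nm area scaling

# Die area needed at 7nm for the same transistor count
same_count_area = TU102_AREA_MM2 * AREA_SCALE_7NM  # ~437 mm²

# How much smaller that is than a hypothetical 600 mm² die
smaller_frac = 1 - same_count_area / 600           # ~27-28%
```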
     
  20. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    I read a "fairly in-depth and specific discussion" that contained a lot of conjecture....

    As to that claim of me being an AMD fanboy..... I'm the world's worst fanboy....lol
    NVidia: MX200 (x2), MX400 (x2), FX5700XT, GF6200, GF6800, GF7300, GF7600GT, GF8600GT (x2), GF9600GT, GT210, GT220, GTS240, GTS250, GTX275, GTX460, GTX560Ti, GT640, GTX660 (x2), GTX970, GTX1660Ti
    AMD/ATi: Rage Pro, 9200SE (x2), 9600XT, x700 Pro, HD2400, HD4850, HD7870, HD7950, R5 240, RX470, RX5700

You, on the other hand, have spent this entire thread claiming AMD is "5 years behind" NVidia... however, the performance shows the contrary. Actual facts say you're wrong. Your response to those facts seems to be to compare AMD's current products to NVidia's future (unreleased) products. If someone needs a graphics card TODAY, they are going to be looking at cards that are currently available. They aren't going to be looking at cards that are still months away at best.
     
    airbud7, carnivore and Loophole35 like this.
