AMD Navi 31 with 15360 cores may no longer feature “Compute Units” (7900XT)

Discussion in 'Frontpage news' started by CPC_RedDawn, Jul 26, 2021.

  1. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    8,963
    Likes Received:
    1,220
    GPU:
    6800XT Nitro+ SE
    https://videocardz.com/newz/amd-navi-31-with-15360-cores-may-no-longer-feature-compute-units

    Quote:
    "Following Kopite7kimi’s claim that the next-gen flagship GPU called Navi 31 could feature as many as 15360 cores, another person appears to confirm this rumor. According to Bondrewd, a member of Beyond3D forums, there are 30 RDNA Workgroup Processors (WGP) in Navi 31 GPU. This graphics processor is now rumored to be a dual-die (MCM) design, which would suggest 15360 cores, should each cluster contain 256 shading units (FP32 cores/Stream Processors), a double of what RDNA2 WGP features."
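    As a sanity check, the rumored figure does line up if you take the claims at face value: 30 WGPs per die, 256 FP32 units per WGP (double RDNA2's 128), and two dies. A quick sketch of that arithmetic:

    ```python
    # Sanity-check the rumored Navi 31 shader count.
    # All inputs are rumor figures, not confirmed specs:
    # 30 WGPs per die, 256 FP32 units per WGP (RDNA2 has 128),
    # and a dual-die (MCM) package.
    wgps_per_die = 30
    fp32_per_wgp = 256
    dies = 2

    total_cores = dies * wgps_per_die * fp32_per_wgp
    print(total_cores)  # 15360
    ```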

    "A discussion is also taking place in regard to Navi 31 memory configuration. While no one has yet shared the memory capacity of the next flagship feature, it is commonly believed that Navi 31 is still to offer a 256-bit bus. What is important though is that the Infinity Cache is to expand from RDNA2’s 128MB to 256MB or even 512MB on RDNA3, depending on who is reporting."


    RUMOURS are just that, rumours. Take this with a pinch of salt. Quite interesting either way.

    My only concern would be how on earth would you feed so many cores properly?
     
    Last edited: Jul 26, 2021
  2. Undying

    Undying Ancient Guru

    Messages:
    16,626
    Likes Received:
    5,540
    GPU:
    Aorus RX580 XTR 8GB
    RTX 40 and RDNA3 are already on their way and we still didn't get the RTX 30 and RDNA2 properly. I feel like we skipped an entire generation.

    Why are we even talking about this when many people still use Pascal and Polaris...
     
    itpro likes this.
  3. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    8,963
    Likes Received:
    1,220
    GPU:
    6800XT Nitro+ SE
    It's not really AMD or Nvidia's fault the industry is so messed up. Blame Covid and supply chains not coping. It's also not just the GPU market that is affected; everything nowadays from fridges to toasters contains chips, and with only a few companies supplying the whole world, it was a recipe for disaster. A perfect storm, if you will.

    Life, and more so in this circumstance business, goes on. If they stopped making and releasing new stuff their shareholders wouldn't be happy. Money is king for any of these companies.

    It's still fun to speculate on new stuff.
     
  4. tsunami231

    tsunami231 Ancient Guru

    Messages:
    12,179
    Likes Received:
    939
    GPU:
    EVGA 1070Ti Black
    They could stop putting chips into everything, but that's never gonna happen at this point. For that matter, they could stop trying to have everything connected to the internet too :rolleyes:
     

  5. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    8,963
    Likes Received:
    1,220
    GPU:
    6800XT Nitro+ SE
    Not really the world we live in though, is it?

    Maybe it was a bad idea allowing only a handful of companies to be the sole suppliers of chips in the first place. Like the old saying goes, "never put all your eggs into one basket"... In this case it's about 3-4 baskets (Samsung, TSMC, Intel, and GF). Then came the "perfect storm": exponential growth in demand during a global pandemic, with people having more money and more time than ever, while these companies were filling order sheets on skeleton shifts because of that same pandemic. Once the pandemic slows and things get back to normal, those order sheets still need completing...

    The car industry (an industry that I am involved in) shot itself in the foot big time. It cancelled its orders for chips when the lockdown began, which freed up space at the fabs. Meanwhile you had families around the world ordering new gadgets and home appliances because they were stuck at home: new phones launching, new consoles, new laptops every month, new CPUs, new GPUs, etc. Then to top it off bitcoin went crazy and orders for GPUs went through the roof. When the car industry opened back up and we all went back to the factories and the lines began to move, only then did they put in orders for chips, but by then their spot in the line had been taken and they were told to get to the back of the queue.

    What I didn't even know was that a car contains an insane number of chips, upwards of 1,500 PER CAR!!! So these car companies are not ordering a small number of chips, but hundreds of millions per company.
     
    AsiJu likes this.
  6. AsiJu

    AsiJu Ancient Guru

    Messages:
    7,613
    Likes Received:
    2,473
    GPU:
    MSI 6800XT GamingX
    Let's just hope price isn't USD per core...
     
    HandR and GreenAlien like this.
  7. tsunami231

    tsunami231 Ancient Guru

    Messages:
    12,179
    Likes Received:
    939
    GPU:
    EVGA 1070Ti Black
    Sad but true. It does prove one thing though: how ill-prepared the world is when a wrench gets thrown in the works. I can only imagine how FUBAR'd everything would be if a CME actually hit Earth and killed all electronics; it would be back to horses and buggies.


    Anyway, even when things get back to normal it will never truly go back to the way it was. Most people I know prefer cars from before the age of "chip this, chip that". I feel old.

    I do need to read up on this Infinity Cache on top of GPU memory. I didn't realize AMD GPUs had a cache on top of GPU memory; does Nvidia do a similar thing but not mention it?
     
    Last edited: Jul 27, 2021
    CPC_RedDawn likes this.
  8. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    2,701
    Likes Received:
    1,342
    GPU:
    RTX 3060 Ti
    This generation sucked for mid-range.

    No, because it's a workaround for lower bandwidth. It's a pretty effective one.
     
    CPC_RedDawn likes this.
  9. Astyanax

    Astyanax Ancient Guru

    Messages:
    11,602
    Likes Received:
    4,377
    GPU:
    GTX 1080ti
    RDNA3 should also be on the Float 2x Int configuration, so performance won't increase as much as people are claiming.
     
  10. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    2,701
    Likes Received:
    1,342
    GPU:
    RTX 3060 Ti
    Is that the same as Turing's FP+INT, or Ampere's FP+FP/INT, or something entirely different?

    Ampere needs ~1.5x the core count to match Turing; compare the 2080S to the 3060 Ti. Makes sense. Same as people saying RDNA3 is gonna double RDNA2 performance, not triple it.
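    The ~1.5x figure roughly matches the published core counts for the two cards being compared (taking the usual spec-sheet numbers: 3072 CUDA cores for the 2080 Super, 4864 for the 3060 Ti):

    ```python
    # Rough check of the "Ampere needs ~1.5x the cores of Turing" claim,
    # using the published CUDA core counts of the cards compared above.
    cores_2080_super = 3072   # Turing (TU104)
    cores_3060_ti = 4864      # Ampere (GA104)

    ratio = cores_3060_ti / cores_2080_super
    print(round(ratio, 2))  # 1.58
    ```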
     
    Last edited: Jul 27, 2021

  11. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,471
    Likes Received:
    4,757
    GPU:
    2080Ti @h2o
    Please help me understand: What do compute units do on Radeon cards? What's a WGP?
    And most importantly, what do they get out of such a change? Even with my limited understanding of actual GPU architecture, I can imagine it has its benefits and downsides.
     
  12. Aura89

    Aura89 Ancient Guru

    Messages:
    8,184
    Likes Received:
    1,285
    GPU:
    -
    There's still more than a year until these cards are "potentially" released, at bare minimum, which means the current cards will have been out for two years by the time these new ones arrive. While that doesn't change your point that we haven't been able to get the RTX 30 and RDNA2 series "properly" as of yet, it's not as though the next-generation cards are on the horizon. Who knows what the stock of the current generation will be like in the next year and a half.
     
  13. Agonist

    Agonist Ancient Guru

    Messages:
    3,353
    Likes Received:
    578
    GPU:
    6800XT 16GB
    Yeah, midrange got shafted so hard this gen.
    The 5600 XT was a solid-ass deal. The RTX 2060 wasn't. Nvidia's midrange has been crap since the 970. The 1060 sucked; the 1070 was good but priced a bit high.
     
  14. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    8,963
    Likes Received:
    1,220
    GPU:
    6800XT Nitro+ SE
    The steam hardware survey tells a different story on the 1060.

    Well, think about it this way. Say for instance RDNA3 is indeed using an MCM design and they make 80-CU chiplets with a third I/O die (acting more like a scheduler, to "fool" Windows into thinking it's one GPU).

    Assuming they can power the damn thing properly, and without any other changes to architecture, IPC, node shrinks, newer faster memory, more Infinity Cache, or clock speed increases, this would theoretically be basically two 6900XTs before you even begin to add on the other generational improvements we've all come to expect.
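    A back-of-the-envelope check on that "two 6900 XTs" comparison (a sketch assuming two 80-CU chiplets with RDNA2's 64 stream processors per CU, and nothing else changing):

    ```python
    # Hypothetical MCM scaling: two 80-CU chiplets, RDNA2-style CUs.
    # These are the speculated figures from the post, not confirmed specs.
    cus_per_chiplet = 80      # same CU count as one 6900 XT
    shaders_per_cu = 64       # RDNA2 configuration
    chiplets = 2

    total_shaders = chiplets * cus_per_chiplet * shaders_per_cu
    print(total_shaders)  # 10240, i.e. double the 6900 XT's 5120
    ```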

    We also don't know if they are going to use the 3D stacked cache, or whether that is going to be used on the CPU side first.

    The only thing that worries me about an MCM GPU is powering it properly. I have a feeling 5nm node shrinks are not going to be anywhere near enough, and either the CU count will have to be cut back or, more likely given this leak, the die layout will need a total redesign. That is probably needed just to claw back even power efficiency, so we don't end up with a 400-500W GPU at launch.
     
    Last edited: Jul 27, 2021
  15. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    2,701
    Likes Received:
    1,342
    GPU:
    RTX 3060 Ti
    The 1060 6G was really good imo.
    The 970 was a con job.
    The 1070 was absolutely great; I'm playing recent triple-A titles with it no problem, and with UV+OC @ 2GHz it draws 130-140W max. Best upper mid-ranger in recent history.

    Next gen is going to be ultra power-hungry. I'm not gonna be remotely interested in 400W cards; I'm not even touching the current 300W+ ones.
     
    Last edited: Jul 27, 2021
    CPC_RedDawn likes this.

  16. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    8,963
    Likes Received:
    1,220
    GPU:
    6800XT Nitro+ SE
    Yeah, this is my main concern for the next generation. My current 6800XT is exceptionally efficient for its performance: light games sit around 80-120W and normal games around the 150-250W range, but top-tier triple-A games with RT can see the card hitting upwards of 320-350W easily.

    Now imagine 2x 6900XTs on the same substrate with an I/O die as well. I don't care how good TSMC's 5nm node is, there is just no way this will be under 400W, unless the architecture changes AMD is making contain some form of black magic. The same goes for Nvidia with Lovelace. I can see power going through the roof this generation, as both companies are finally competing in the high-end space. Let's just hope they don't forget the mid-range, as that's where the bulk of the consumer base is.
     
  17. Agonist

    Agonist Ancient Guru

    Messages:
    3,353
    Likes Received:
    578
    GPU:
    6800XT 16GB
    The 1060 still sucked, and lots of people still believed the GameWorks BS back then. They were pretty cheap, but being popular doesn't mean they were good.

    The 1060 6GB was the only version somewhat worth it, and the GTX 970 was a killer card until the whole 3.5GB VRAM story.
    I experienced the hell out of that running GTX 970 SLI with triple 25-inch ultrawides. I sold those off instantly when the story broke and got R9 290X 4GB cards for less money and better performance.
     
  18. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    8,963
    Likes Received:
    1,220
    GPU:
    6800XT Nitro+ SE
    That is the card we mean; the 1060 6GB was a great card for the price for 900p-1080p gaming. The vast majority of people who bought them were esports/MMO fans who wanted to play LoL, Dota 2, CS:GO, TF2, WoW, R6S, Overwatch, etc. There was a reason there were so many variants of the 1060, with 3GB, 5GB and 6GB models, and even models with G5X memory and other configs.

    Heck, the card is still decent today; it was upwards of 70% faster than the GTX 960 before it.

    I was in the same boat; I too had 2x GTX 970 in SLI, and when the story broke I sold them and just got a 980Ti.
     
