Big Navi specs leaked by macOS Big Sur driver

Discussion in 'Frontpage news' started by PrMinisterGR, Sep 27, 2020.

  1. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    https://www.reddit.com/r/Amd/comments/j06xcd/number_of_cus_of_navi_23_navi_31_van_gogh_cezanne/

    Seems like Big Navi is 80 CUs at 2.2GHz with HBM. NOICE.
     
    mohiuddin and XenthorX like this.
  2. Undying

    Undying Ancient Guru

    Messages:
    25,330
    Likes Received:
    12,743
    GPU:
    XFX RX6800XT 16GB
    It does support HBM2, but it will not use it, at least not on Navi 21. Maybe Navi 31 in the future.
     
    Maddness likes this.
  3. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,738
    GPU:
    3080 Aorus Xtreme
    Yeah, I'd really like it to have HBM2, as I think this might be their first card to actually take advantage of it. Everything points to that not being the case, though, but never say never. ;)
     
  4. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,780
    Likes Received:
    1,393
    GPU:
    黃仁勳 stole my 4090
    You won't see it in consumer/gamer versions of the upcoming Navi, maybe only in FirePro cards or something. Even then probably not; probably 32GB of GDDR6.
     
    Maddness likes this.

  5. XenthorX

    XenthorX Ancient Guru

    Messages:
    5,032
    Likes Received:
    3,404
    GPU:
    MSI 4090 Suprim X
    So we're looking at twice the performance of the RX 5700 XT?

    That's literally 3080 territory right there. Nvidia clearly knew it and didn't want to repeat the mistake from 2013, when they ended up dropping GTX 780 prices by $150 four months after release.
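    For reference, here's the napkin math behind the "twice the 5700 XT" figure (a rough Python sketch; it assumes RDNA2 keeps 64 shader lanes per CU at 2 FLOPs per lane per clock, and raw TFLOPS never translate 1:1 into game performance):

    Code:
        # Theoretical FP32 throughput: CUs x lanes x FLOPs/clock x clock
        def tflops(cus, clock_ghz, lanes_per_cu=64, flops_per_lane=2):
            return cus * lanes_per_cu * flops_per_lane * clock_ghz / 1000

        rx_5700_xt = tflops(40, 1.905)  # ~9.75 TFLOPS at boost clock
        big_navi   = tflops(80, 2.2)    # ~22.5 TFLOPS with the leaked specs
        print(f"{big_navi / rx_5700_xt:.2f}x the 5700 XT")  # ~2.31x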
     
    Last edited: Sep 27, 2020
    mohiuddin likes this.
  6. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    How do you know this? Buildzoid also said that the card they showed seems to have an HBM PCB.
     
  7. Goiur

    Goiur Maha Guru

    Messages:
    1,339
    Likes Received:
    629
    GPU:
    ASUS TUF RTX 4080
    The last Newegg leak shows GDDR6.
     
  8. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,738
    GPU:
    3080 Aorus Xtreme
    I think because most of the rumors have pointed to 256-bit GDDR6 for RDNA and HBM2 for CDNA, now that AMD has split their professional and gaming divisions. I'd love it to have HBM2, but the cost might make it harder to compete with the pricing of Nvidia's cards.
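    The bus-width math is the core of it (a back-of-the-envelope sketch; the 16 Gbps GDDR6 and 2.4 Gbps HBM2 pin rates are typical published figures, not confirmed specs for these cards):

    Code:
        # Peak bandwidth = bus width (bits) x per-pin rate (Gbps) / 8
        def peak_bw_gbs(bus_bits, pin_gbps):
            return bus_bits * pin_gbps / 8

        print(peak_bw_gbs(256, 16.0))      # 512 GB/s: rumored 256-bit GDDR6
        print(peak_bw_gbs(2 * 1024, 2.4))  # 614 GB/s: just two HBM2 stacks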
     
  9. I don't "know" anything, but economically speaking, Apple would be one to prefer HBM2 over GDDR6, as they are loaded. Cost-efficiency-wise, GDDR6 would be the first choice for AMD, as long as it isn't a significant performance hit.
     
    Maddness likes this.
  10. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    Even card vendor websites had Ampere cards with wrong CUDA core numbers, so that doesn't say a lot. Also, actual drivers list Navi 21 chips with HBM.
     

  11. lukas_1987_dion

    lukas_1987_dion Master Guru

    Messages:
    701
    Likes Received:
    167
    GPU:
    RTX 4090 Phantom GS
    1500MHz on the core for all models? Looks fake...
     
  12. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    Two HBM2E stacks could do close to 1TB/sec. They could literally do a 16GB card with that and save a ton of power, or go all out and do a 12GB or 24GB card at up to 1.5TB/sec. The irony is that HBM2E could also be cheaper for them if they go for two stacks, save power, and don't need to trace eight GDDR6X chips or whatever. It will also lower the power delivery requirements quite a lot.
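    The per-stack arithmetic (a sketch using SK hynix's published HBM2E figures: 1024-bit bus per stack, up to 3.6 Gbps per pin, 8-16 GB per stack; actual stack counts for Navi 21 are pure speculation):

    Code:
        # Each HBM2E stack has a 1024-bit interface
        def hbm2e_bw_gbs(stacks, pin_gbps=3.6):
            return stacks * 1024 * pin_gbps / 8

        print(hbm2e_bw_gbs(2))  # ~922 GB/s  - two stacks, 16-32 GB total
        print(hbm2e_bw_gbs(3))  # ~1382 GB/s - three stacks, 24-48 GB total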
     
    Last edited: Sep 28, 2020
  13. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    That's just a power stage.
     
  14. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE
    Base speed. We know boost is 2.2GHz for the 6900 and 2.5GHz for the 6700/6800.
     
  15. lukas_1987_dion

    lukas_1987_dion Master Guru

    Messages:
    701
    Likes Received:
    167
    GPU:
    RTX 4090 Phantom GS
    Yeah sorry, I forgot about boost lol
     

  16. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,738
    GPU:
    3080 Aorus Xtreme
    It would create a rather large chip and interposer, though. Especially when we have no idea how much die area the dedicated ray tracing hardware adds.
     
    Deleted member 213629 likes this.
  17. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    We know all of this already, actually. The Series X SoC is 360mm², and that includes an octa-core Zen 2 CPU which is around 70mm², I/O, and a 320-bit GDDR6 controller.

    There's a 56 CU GPU in there, and you can see it actually uses around 60% of the space.

    There is even a picture:
    [image: Xbox Series X die shot]

    If we are talking HBM, then AMD could stuff around 80 CUs in under 600mm² easy peasy.
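    Scaling that out (a rough area-budget sketch; the 60% GPU share is eyeballed from the die shot, and the per-CU figure lumps in the shader array's shared logic):

    Code:
        # Series X: ~360 mm^2 SoC, ~60% of it GPU, 56 physical CUs
        soc_mm2, gpu_share, series_x_cus = 360, 0.60, 56
        mm2_per_cu = soc_mm2 * gpu_share / series_x_cus  # ~3.9 mm^2 per CU

        print(80 * mm2_per_cu)  # ~309 mm^2 of CU-related area for 80 CUs
        # Plenty of headroom under 600 mm^2, especially if compact HBM
        # PHYs replace a wide GDDR6 controller at the die edges.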
     
  18. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,409
    Likes Received:
    3,077
    GPU:
    PNY RTX4090
    The best option for AMD to make a proper impact with these cards would be to release a 6900XT with 80 CUs and 16GB GDDR6, and then a 6900XTX with 80 CUs and 16GB of HBM2. This could be their TITAN equivalent: a better-binned 80 CU chip with higher clock speeds and better overclocking potential.

    After all, the Radeon VII had 16GB of HBM2 and released for $700, so it's not as if they can't do it at a good price. Those highly clocked 80 CUs would seriously benefit from the increased bandwidth.

    This all assumes, of course, that the rumoured 128MB cache isn't real, as a massive cache would also help the GDDR6 memory compete against Nvidia's G6X memory. But for a total knockout blow, an HBM2 variant would beat the 3080 in normal rasterization; then again, ray tracing is another matter.
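    The cache rumour is easy to model (a toy sketch; the 128MB size and any hit rate are unconfirmed, and it assumes the cache itself is never the bottleneck):

    Code:
        # Only cache misses reach DRAM, so a hit rate h effectively
        # multiplies DRAM bandwidth by 1 / (1 - h).
        def effective_bw_gbs(dram_gbs, hit_rate):
            return dram_gbs / (1 - hit_rate)

        print(effective_bw_gbs(512, 0.5))  # 1024 GB/s from 256-bit GDDR6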

    I would also like to see AMD have their own version of DLSS. Some rumours point towards them partnering with Microsoft on an upsampling technique that is said to work at a global level, doesn't require per-game support, and will work with any DX12/Vulkan game. Sounds too good to be true; either that or it has terrible IQ.
     
  19. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    The problem is that AMD split their architecture into CDNA/RDNA. Unless they are going to reuse the 6900XTX die as a CDNA card, it's a massive amount of money to do an HBM chip for one single application. The Radeon VII was just one of their Instinct cards rebranded.
     
