AMD Radeon RX 6900XT to feature Navi 21 XTX GPU with 80 CUs

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 21, 2020.

  1. V3RT3X79

    V3RT3X79 Member

    Messages:
    44
    Likes Received:
    7
    GPU:
    EVGA RTX 3090 FTW3
80 CUs and the performance of an RTX 2080 :D
     
  2. DonMigs85

    DonMigs85 Member

    Messages:
    42
    Likes Received:
    3
    GPU:
    Gigabyte RX 570 Gaming 4G
    hoping the 6800 comes in at around $500
     
  3. Astyanax

    Astyanax Ancient Guru

    Messages:
    9,171
    Likes Received:
    3,140
    GPU:
    GTX 1080ti
It's higher than that, where the hell did you hear that?
     
  4. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    9,607
    Likes Received:
    511
    GPU:
    Asus TUF 3080 OC
I assume he's joking; there was a meme going around about it being only 15% faster than an RTX 2080 Ti, and we know that's absolute baloney. The card shown, whether it was the 72CU or 80CU model, was almost on par with a 3080.

Not that we needed any proof that it was nonsense; just doubling a 5700 XT would already be far faster than that (rough numbers below).
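A quick back-of-the-envelope check. Every number here is an assumption of mine, not a measurement: a 2080 Ti at roughly 1.45x a 5700 XT at 4K, and ~85% scaling efficiency for doubled CUs.

```python
# Sanity-checking the "only 15% faster than a 2080 Ti" meme.
# All relative-performance numbers are ASSUMPTIONS, not measurements.

BASELINE_5700XT = 1.00
RTX_2080TI = 1.45              # assumed ~45% faster than a 5700 XT at 4K
SCALING_EFFICIENCY = 0.85      # assumed scaling for doubling the CUs

meme_claim = RTX_2080TI * 1.15                      # "15% faster than a 2080 Ti"
doubled_5700xt = BASELINE_5700XT * (1 + SCALING_EFFICIENCY)

print(f"Meme claim:        {meme_claim:.2f}x a 5700 XT")      # ~1.67x
print(f"2x 5700 XT (~85%): {doubled_5700xt:.2f}x a 5700 XT")  # ~1.85x
```

Even with imperfect scaling, a doubled 5700 XT comes out comfortably above the meme's figure.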
     
    Evildead666 and lukas_1987_dion like this.

  5. mitzi76

    mitzi76 Ancient Guru

    Messages:
    8,722
    Likes Received:
    19
    GPU:
    MSI 970 (Gaming)
What we need, unless you've already done it Hilbert, is some info re VRAM and the potential pitfalls, especially at higher resolutions. The 16GB vs 10GB thing could be a clever move by AMD.

I think it's potentially a key selling point. Personally, if I'm gonna spend loads on a 4K screen etc., I'd want a bit more VRAM based off some benchmarks I've seen (some were at 9GB, but I didn't see any slowdown).
     
  6. Undying

    Undying Ancient Guru

    Messages:
    14,601
    Likes Received:
    3,767
    GPU:
    Aorus RX580 XTR 8GB
When the price and performance are revealed, it will make your 3090 purchase look like the stupidest decision you ever made.
     
  7. DeskStar

    DeskStar Maha Guru

    Messages:
    1,100
    Likes Received:
    170
    GPU:
    EVGA 2080Ti/3090FTW
    They're fully aware of the demand issue and have addressed that already.

As for yields, AMD has already touched on that as well when speaking about the demand and availability side. They're locked in on 7nm and have been killing it yield-wise ever since.

Has there ever been a shortage of AMD processors out there? No... not once have I seen one, even though demand has been high.

I expect good things from them this go-round, and shaking up the competition doesn't get any better than that...
     
  8. DeskStar

    DeskStar Maha Guru

    Messages:
    1,100
    Likes Received:
    170
    GPU:
    EVGA 2080Ti/3090FTW
No. How so? 24GB of that fast RAM for rendering is a dream at that price point. Them CUDA cores, wow!

It's all about perspective, but if you bought it for gaming, then yeah, you're mostly right...
     
  9. DeskStar

    DeskStar Maha Guru

    Messages:
    1,100
    Likes Received:
    170
    GPU:
    EVGA 2080Ti/3090FTW
I mean, the least it could do is allow for ultra settings at that resolution... Step into the future... please? Just messing with you.
     
  10. asturur

    asturur Master Guru

    Messages:
    974
    Likes Received:
    284
    GPU:
    Geforce Gtx 1080TI
People see one card beating another.
I see a performance line between the 3060 and the 3090, with many points in the middle from AMD and NVIDIA at different price points.
     
    carnivore and Maddness like this.

  11. JamesSneed

    JamesSneed Maha Guru

    Messages:
    1,112
    Likes Received:
    472
    GPU:
    GTX 1070
If the 6800 comes in at around $500, it should be a pretty good-selling card.
     
  12. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,590
    Likes Received:
    2,091
    GPU:
    HIS R9 290
That still depends on whether the L3 cache is for dGPUs and not iGPUs.
But let's say it is for dGPUs - traditionally, AMD's GPUs have seemed ridiculously starved for memory bandwidth, to the point that Vega actually warranted HBM2. I get the impression RDNA2 is a little less bandwidth-hungry, but AMD probably realized that the only affordable way to address the problem is to add a cache. Widening the memory bus means more memory chips, and there just isn't room for that (rough numbers below).

    Even if they provided benchmarks, I'd still take them with a grain of salt. AMD has been better about sneak-preview benchmarks but I still don't trust any manufacturer's cherry-picked results.
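To put rough numbers on the bus-width point: each GDDR6 package exposes a 32-bit interface, so a wider bus physically means more chips on the board. The 16 Gbps per-pin rate is just an assumed typical speed, not any specific card.

```python
# Rough GDDR6 math behind "wider bus = more chips".
# 16 Gbps per-pin data rate is an ASSUMED typical speed, not a spec
# for any particular card; each GDDR6 package has a 32-bit interface.

DATA_RATE_GBPS = 16
CHIP_INTERFACE_BITS = 32

for bus_bits in (192, 256, 320, 384, 512):
    chips = bus_bits // CHIP_INTERFACE_BITS
    bandwidth_gb_s = bus_bits / 8 * DATA_RATE_GBPS
    print(f"{bus_bits}-bit bus: {chips:2d} chips, {bandwidth_gb_s:4.0f} GB/s")
```

A 512-bit bus would need 16 chips at that speed; hence a big on-die cache instead.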
     
    DeskStar and fantaskarsef like this.
  13. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,055
    Likes Received:
    4,139
    GPU:
    2080Ti @h2o
Absolutely true! But just like with Nvidia's "conditional truths" (when the benchmarks are cherry-picked), I'd prefer those over the wild speculation that often builds up a hype not even the cherry-picked benches can fully satisfy :D
     
    DeskStar and schmidtbag like this.
  14. Richard Nutman

    Richard Nutman Member Guru

    Messages:
    194
    Likes Received:
    76
    GPU:
    Sapphire 5700 XT
    Higher resolution really doesn't make that much difference to the amount of VRAM used.
At 1080p, dual 32-bit frame buffers plus a 32-bit depth buffer use about 24MB of VRAM.
At 4K, the same set of buffers uses around 100MB (quick math at the end of this post).

Now some games might have more buffers for various things, but the vast majority of VRAM is used for holding textures, which don't necessarily need to be bigger at higher resolutions.
There's no reason you can't play at 4K on 8GB just fine. Of course, it depends on whether the game wants to use way more VRAM, but then the issue is the same at lower resolutions too.

8GB was the sweet spot; I think 10 or 12 will become the next one. 16 or 20 is just overkill for games, even at 4K or 8K!
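For anyone who wants to check the buffer math above (three full-screen 4-byte buffers; real engines allocate extra targets like G-buffers and shadow maps, so treat this as a lower bound):

```python
# VRAM used by two 32-bit colour buffers plus one 32-bit depth buffer.
# Real renderers allocate more intermediate targets, so this is only
# the baseline cost of raising the resolution.

def framebuffer_mb(width, height, buffers=3, bytes_per_pixel=4):
    """Total size in MB of `buffers` full-screen buffers."""
    return width * height * bytes_per_pixel * buffers / 1e6

print(f"1080p: {framebuffer_mb(1920, 1080):.0f} MB")  # ~25 MB
print(f"4K:    {framebuffer_mb(3840, 2160):.0f} MB")  # ~100 MB
```

That's roughly the 24MB / 100MB quoted above (the small difference is MB vs MiB rounding).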
     
    pharma likes this.
  15. H83

    H83 Ancient Guru

    Messages:
    3,344
    Likes Received:
    742
    GPU:
    MSI Duke GTX1080Ti
I think AMD's cards are going to be a little slower than Nvidia's counterparts, around 10%, but also cheaper, providing a better performance/price ratio. If AMD had cards better than Nvidia's, they would say it as loud as possible so every potential customer could hear them.

My guess of course, I need benchies!!!
     

  16. Passus

    Passus Ancient Guru

    Messages:
    1,615
    Likes Received:
    444
    GPU:
    RX 5700 Mech OC
It's gonna be faster than a 1060 for sure; I went from a 1060 to an RX 5700 and it's like 2 to 3 times faster.
     
  17. wavetrex

    wavetrex Maha Guru

    Messages:
    1,342
    Likes Received:
    957
    GPU:
    Zotac GTX1080 AMP!
Actually, no.
HBM (1 and 2) was put on to save power, as both the memory itself and the memory controller inside the chip use less energy for a given bandwidth (ballpark figures below).
It's also the reason professional chips use HBM2 now, as those may run 24/7 and power becomes quite relevant in the long run!

The unfortunate side effect is that it made the BOM too expensive and the gaming cards unprofitable.

    HBM is still the future, once it drops in price enough to make GDDR irrelevant.
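To illustrate the energy point with ballpark per-bit figures: the ~7.5 pJ/bit for GDDR6 and ~3.9 pJ/bit for HBM2 below are rough, commonly cited estimates, not vendor specs.

```python
# Back-of-the-envelope memory power at a fixed bandwidth.
# Energy-per-bit figures are ASSUMED ballpark values, not vendor specs:
# GDDR6 is often quoted around ~7.5 pJ/bit, HBM2 around ~3.9 pJ/bit.

def memory_power_watts(bandwidth_gb_s, pj_per_bit):
    bits_per_second = bandwidth_gb_s * 1e9 * 8
    return bits_per_second * pj_per_bit * 1e-12

for name, pj in (("GDDR6", 7.5), ("HBM2", 3.9)):
    print(f"{name}: ~{memory_power_watts(512, pj):.0f} W at 512 GB/s")
# GDDR6: ~31 W, HBM2: ~16 W -- same bandwidth, roughly half the power
```

Same bandwidth, roughly half the memory power, which matters a lot for 24/7 professional cards.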
     
  18. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,590
    Likes Received:
    2,091
    GPU:
    HIS R9 290
I don't think you understood me:
I didn't say HBM was used because of the bandwidth; I'm saying that, despite its tremendous bandwidth, the GPUs were able to take advantage of it anyway.
     
  19. EngEd

    EngEd Active Member

    Messages:
    87
    Likes Received:
    17
    GPU:
    Gigabyte RTX 3080
This looks nice from AMD's side, but IMO I doubt that either of these cards can beat the 3080 in 4K gaming. Still, I truly hope they can beat the 3000 series, even though I was lucky enough to get a 3080 this year!
     
  20. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,590
    Likes Received:
    2,091
    GPU:
    HIS R9 290
    They don't need to beat the 3080 in 4K gaming. Even if they're 20% slower, they'll still be capable of delivering a good 4K experience.
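Quick arithmetic on what "20% slower" means in practice (the ~90 fps 4K baseline for the 3080 is an assumed round number for illustration, not a benchmark):

```python
# What "20% slower than a 3080" looks like at 4K.
# The baseline fps is an ASSUMED round number for illustration.
rtx_3080_fps = 90
slower_card_fps = rtx_3080_fps * (1 - 0.20)
print(f"~{slower_card_fps:.0f} fps")  # ~72 fps -- still comfortably above 60
```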
     
