Newegg is listing Radeon RX 6700 XT, 6800 XT and 6900 XT specs in its blog

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 27, 2020.

  1. 0blivious

    0blivious Ancient Guru

    Messages:
    3,301
    Likes Received:
    824
    GPU:
    7800 XT / 5700 XT
    If you aren't spending $700+ for a video card, it doesn't really matter if AMD has the crown, does it? At $500 and below, AMD is in the game.

    For these to be "good" , all that matters is price/performance. If it's anything like the 5700XT was/is, these should sell pretty well.
     
    AlmondMan and Maddness like this.
  2. Elfa-X

    Elfa-X Member Guru

    Messages:
    162
    Likes Received:
    21
    GPU:
    AMD RX 6800 XT
    This. It's just speculation on Newegg's part. AMD isn't trying to have it removed because it's just a guess based on previous scaling, and likely inaccurate.
     
    ACEB likes this.
  3. Abd0

    Abd0 Member

    Messages:
    28
    Likes Received:
    11
    GPU:
    Nvidia 1060 6GB
    If we do a little math: the 6700 XT has the same SP count as the 5700 XT but a narrower memory bus. There's no logic in releasing a next-gen card that performs worse than the current-gen one, so the architecture changes should/must compensate for that difference. So let's assume the 6700 XT has the same performance as the 5700 XT (which might be reasonable given that the 6700 XT will have RT while the 5700 XT doesn't).
    Comparing the 6700 XT to the 6900 XT, we're looking at double the SP count and a 33% increase in memory bandwidth, which should roughly translate to 70% more performance than the 6700 XT/5700 XT.
    That would put the 6900 XT directly against the 3080 at WQHD resolution.

    The above logic might make sense if these specs are correct, which I highly doubt; I believe, as the guys said, these are just placeholders.
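    Abd0's back-of-the-envelope estimate can be sketched in a few lines of Python. The 0.6/0.4 weighting between compute and bandwidth is purely an assumption, chosen to show how a ~70% figure can fall out of a 2x SP / 1.33x bandwidth ratio; real scaling is workload-dependent.

    ```python
    # Crude GPU scaling estimate: weighted geometric mean of the compute
    # and memory-bandwidth ratios. The 0.6/0.4 weights are an assumption,
    # not anything from AMD -- real scaling depends on the workload.
    def perf_scale(sp_ratio: float, bw_ratio: float,
                   sp_weight: float = 0.6) -> float:
        bw_weight = 1.0 - sp_weight
        return sp_ratio ** sp_weight * bw_ratio ** bw_weight

    # 6900 XT vs 6700 XT per the rumoured specs: 2x the SPs, +33% bandwidth.
    estimate = perf_scale(2.0, 1.33)
    print(f"~{(estimate - 1) * 100:.0f}% faster")  # prints "~70% faster"
    ```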
     
    schmidtbag likes this.
  4. alanm

    alanm Ancient Guru

    Messages:
    12,270
    Likes Received:
    4,472
    GPU:
    RTX 4080
    Don't be silly. I think most Nvidia users are really hoping AMD delivers a kick-ass card that beats the 3080. An even playing field restores sensible pricing, which obviously benefits everyone. I would love for AMD to do to Nvidia what they did to Intel.
     

  5. SpajdrEX

    SpajdrEX Ancient Guru

    Messages:
    3,417
    Likes Received:
    1,673
    GPU:
    Gainward RTX 4070
    Hehe, but I didn't say exactly who will be disappointed :) It's just that you can't please everyone ;-)
     
    sykozis likes this.
  6. HybOj

    HybOj Master Guru

    Messages:
    398
    Likes Received:
    327
    GPU:
    Gigabyte RTX 3080
    Anyway... bring it on. Still better to have these cards than the malfunctioning 10 GB 3080, which you can't even buy. I need AMD to release this stuff already. I will not buy an Nvidia card no matter what.
     
  7. Undying

    Undying Ancient Guru

    Messages:
    25,477
    Likes Received:
    12,883
    GPU:
    XFX RX6800XT 16GB
    Yeah, if it's a broken piece of hardware like the 3080, many will be disappointed.
     
  8. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE

    How? The 6900 XT is TWICE the size of the 5700 XT, with a 30% higher clock!
     
    Last edited: Sep 27, 2020
  9. DeskStar

    DeskStar Guest

    Messages:
    1,307
    Likes Received:
    229
    GPU:
    EVGA 3080Ti/3090FTW
    OH I AM "VERY EXCITE!!!!"
     
  10. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE
    You forgot the +30% higher clock speed for the 6900 XT and +45% for the 6700/6800 over the 5700 XT.
     

  11. RED.Misfit

    RED.Misfit Member Guru

    Messages:
    144
    Likes Received:
    83
    GPU:
    MSI 1080 Gaming X
    How can we judge the specs? They're just raw numbers. Ampere proved that they don't mean much on their own; this could go one way or the other.
    Look at the 3080: it has double the CUDA cores of the 2080 Ti and a slightly better clock, yet it only manages to beat the 2080 Ti by about 30%. That suggests something is bottlenecking all those transistors (maybe games aren't prepared for this either?).

    The jump is huge from a GTX 1080 perspective, but with more than 3x the CUDA cores it still "only" achieves 100-110% better performance, which seems odd to me. I don't expect it to scale 100% with the CUDA core count, but it's scaling less than expected, and we're not talking about SLI here; it's all one card.
    It should scale better than this, no? I was expecting more than that.

    It could be node/power related of course, but it's still not scaling as expected.
    Sometimes big numbers mean nothing.
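    The scaling question above can be made concrete: divide the measured speed-up by the theoretical speed-up from core count alone. The core counts below are the shipping specs (3080: 8704 CUDA cores, 2080 Ti: 4352); worth noting that Ampere doubled the FP32 units per SM, which is a big part of why per-core scaling looks poor.

    ```python
    # How well did the extra CUDA cores turn into frames? Scaling
    # efficiency = measured speed-up / theoretical speed-up from core
    # count alone. The ~1.30x measured figure is from the post above.
    def scaling_efficiency(cores_new: int, cores_old: int,
                           measured_speedup: float) -> float:
        theoretical = cores_new / cores_old
        return measured_speedup / theoretical

    # RTX 3080 (8704 CUDA cores) vs RTX 2080 Ti (4352), ~1.30x measured:
    eff = scaling_efficiency(8704, 4352, 1.30)
    print(f"{eff:.0%} of the theoretical core-count speed-up")  # 65%
    ```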
     
    Last edited: Sep 27, 2020
    ZXRaziel and AlmondMan like this.
  12. asturur

    asturur Maha Guru

    Messages:
    1,374
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    Lol at people saying "it's not even winning over the 3080"; there is nothing else out there to win against. The 3090 is a halo product for rare cases, not a product you should compare with.

    Whatever these cards need to beat, they just need good pricing for the category they fall into.
     
    ACEB likes this.
  13. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,691
    Likes Received:
    962
    GPU:
    GTX 1070
    Honestly, knowing how high AMD is going to clock these (going off the PS5), the top card should be around 3080 speed.
     
    Undying and kanenas like this.
  14. kanenas

    kanenas Master Guru

    Messages:
    512
    Likes Received:
    385
    GPU:
    6900xt,7800xt.
    Navi 21: 80 CUs, 2.2 GHz boost clock and 22.5 TFLOPs of compute
    Navi 22: 40 CUs, 2.5 GHz boost clock and 12.8 TFLOPs of compute

    do the math boys :)
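    Those TFLOPs figures follow directly from the rumoured specs, assuming RDNA's usual 64 shader processors per CU and 2 FLOPs per clock (FMA). A quick sketch:

    ```python
    # RDNA 2 FP32 throughput: CUs x 64 shader processors x 2 FLOPs per
    # clock (fused multiply-add) x boost clock. The CU counts and boost
    # clocks are the rumoured figures quoted above.
    def tflops(cus: int, boost_ghz: float, sp_per_cu: int = 64) -> float:
        return cus * sp_per_cu * 2 * boost_ghz / 1000.0

    print(f"Navi 21: {tflops(80, 2.2):.1f} TFLOPs")  # Navi 21: 22.5 TFLOPs
    print(f"Navi 22: {tflops(40, 2.5):.1f} TFLOPs")  # Navi 22: 12.8 TFLOPs
    ```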
     
    Venix, Fediuld and Clawedge like this.
  15. ACEB

    ACEB Member Guru

    Messages:
    129
    Likes Received:
    69
    GPU:
    2
    Not only are those specs wrong, I'm unsure why so many people still think a base clock means anything. AMD's base of 1550/1600 MHz last gen was reaching around 1900 MHz, sometimes more. As for the memory, unless AMD has a trick that's more than just cache-based, I highly doubt they will release their mid-tier card with 6 GB.

    I'd be interested to know whether AMD will use a cache to bypass bandwidth needs, but the only reason they'd go that route is for what's next, which is Infinity Fabric-based GPUs; you'd need a cache with those. It would be funny if the 80 CU part were just 2x 40 CU bridged with Infinity Fabric, with a unifying cache for latency purposes.
     

  16. Goiur

    Goiur Maha Guru

    Messages:
    1,341
    Likes Received:
    632
    GPU:
    ASUS TUF RTX 4080
    I want to believe
     
  17. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    I'm surprised none of the AMD fans have jumped on this. If this was an Nvidia rumour...
     
  18. GREGIX

    GREGIX Master Guru

    Messages:
    856
    Likes Received:
    222
    GPU:
    Inno3d 4090 X3
    Yeah, I want to believe too.
    What pushed me to Nvidia lately was simple. I had a Radeon VII; it had its flaws, but it was mostly non-problematic. Driver works = all works. Driver fails (i.e. some black screens, OC fail), roll back to the previous one and wait for a new one. The latest I was using was some WHQL version, very good.
    But.
    But for some freaking reason they introduced a power-saving feature that I could not bypass/change/disable... so, going through an asteroid belt in Elite Dangerous, my FPS drops from the 200 area to 35, while the freaking GPU usage was 30% and clocks were in the 800-1100 MHz area, where the normal values were 90-100% and 1700-1900 MHz anywhere else except the damn asteroid fields... no matter if it was empty, crowded, whatever.
     
  19. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE
    Radeon VII and Vega on ESO, Elite and a few others needed the P7 state set as the minimum clock state in a custom game profile.
    Otherwise they operate at minimum speed.
     
  20. Truder

    Truder Ancient Guru

    Messages:
    2,400
    Likes Received:
    1,430
    GPU:
    RX 6700XT Nitro+
    Do you use a HOTAS with Elite? Sounds like an issue I have... Chill doesn't detect a HOTAS as an input, so it downclocks for "performance" (Chill only recognises mouse, keyboard and XInput). The way around it is to set the min FPS to the same as the max FPS in Chill.

    And that too.
     
    Fediuld likes this.
