Newegg is listing Radeon RX 6700 XT, 6800 XT and 6900 XT specs in its blog

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 27, 2020.

  1. Kool64

    Kool64 Ancient Guru

    Messages:
    1,658
    Likes Received:
    784
    GPU:
    Gigabyte 4070
    At this point it all comes down to how well ray tracing works on them.
     
    Valken, Maddness and AuerX like this.
  2. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,346
    Likes Received:
    2,988
    GPU:
    7900xtx/7900xt
There has been a lot of (predictable) fanboy response.

I own both AMD and Nvidia cards and stock.
The Nvidia Ampere series is amazing for what it can do with its uArch on an older production node.
But because of the massive die size, Nvidia has (and will continue to have) lower yields, which keeps production and retail prices high. Which, for the "price no object" guys, is just fine.
AMD on the other hand has a lower cost of production on a smaller node, which increases yield and lowers price.
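For a rough feel of why die size matters that much, here's a back-of-the-envelope sketch (Python). The die areas are the rough figures quoted later in this thread; the defect density and wafer cost are made-up placeholders, not published foundry numbers (and the two chips aren't even built on the same foundry's process).

```python
# Back-of-the-envelope: how die area drives candidates per wafer, yield and
# cost per good die. Defect density and wafer cost are assumed placeholders.
import math

WAFER_DIAMETER_MM = 300      # standard 300 mm wafer
DEFECT_DENSITY_CM2 = 0.1     # assumed defects per cm^2 (placeholder)
WAFER_COST_USD = 9000        # assumed wafer cost (placeholder)

def dies_per_wafer(area_mm2: float) -> int:
    """Common approximation for whole dies fitting on a round wafer."""
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / area_mm2
               - math.pi * d / math.sqrt(2 * area_mm2))

def poisson_yield(area_mm2: float) -> float:
    """Simple Poisson yield model: exp(-die area * defect density)."""
    return math.exp(-(area_mm2 / 100.0) * DEFECT_DENSITY_CM2)

for name, area in [("~505 mm^2 (rumoured Navi 21)", 505.0),
                   ("~628 mm^2 (GA102 / RTX 3080)", 628.4)]:
    n = dies_per_wafer(area)
    good = n * poisson_yield(area)
    print(f"{name}: {n} candidates, ~{good:.0f} good dies, "
          f"~${WAFER_COST_USD / good:.0f} per good die")
```

With those placeholder numbers the smaller die comes out at roughly 65 good dies per wafer versus about 45 for the bigger one, which is the yield/cost argument in a nutshell.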

To be anywhere in the discussion with the top tier of Nvidia is a massive win for AMD and the consumer.

My opinion is the 6900 XT will end up roughly equivalent to the 3080,
but at a drastically lower price (although expensive for AMD) and with better supply. The 3090 will continue as a halo product in short supply and with low sales until the economy improves.

The problem will continue with crypto miners, which ironically may be a lifeline for some AIB partners (not being stuck with inventory). But this is a difficult time in real life for most of the world. Demand is dramatically lower at a really bad time for AIB partners. Early adopters and professional gamers will look sleek and deluxe, but far more people will want these cards (AMD's included) and hold off for better times.
     
  3. Let the console wars of 2021 begin...
     
  4. Venix

    Venix Ancient Guru

    Messages:
    3,440
    Likes Received:
    1,944
    GPU:
    Rtx 4070 super
They are most likely placeholders... 6 GB of memory on the 6700 XT would be a big foul, especially since the 3060 rumors indicate Nvidia moved the 60-class cards to 8 GB of RAM.
     

  5. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,975
    Likes Received:
    4,342
    GPU:
    Asrock 7700XT
    In Linux, the 5700XT can sometimes outperform the 2080Ti, though it's usually just barely trailing behind the 2080. These GPUs have a lot of potential. If RDNA2 has a modest IPC lift, the 6700XT ought to be faster even without boosted clock speeds. So, I think it's safe to say the 6900XT, with the IPC and clock boost, ought to have roughly double the performance of the 5700XT. That means with proper drivers, this should be a direct competitor to the 3080, which wasn't quite double the speed of the 2080 Ti.
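To spell out the "roughly double" math, here's a naive throughput proxy (Python). The 80 CU / 2.2 GHz figures are from the leaks, and the +10% IPC is an arbitrary placeholder; treat the result as an upper bound, since bandwidth, power limits and drivers never scale along linearly.

```python
# Naive throughput proxy: compute units x clock x assumed IPC factor.
# This is an upper bound; real-world scaling is never this linear.
def relative_throughput(cus: int, clock_ghz: float, ipc_factor: float = 1.0) -> float:
    return cus * clock_ghz * ipc_factor

rx_5700xt = relative_throughput(40, 1.905)                 # RDNA1 baseline, boost clock
rx_6900xt = relative_throughput(80, 2.2, ipc_factor=1.10)  # leaked CUs/clock, assumed +10% IPC

print(f"6900 XT vs 5700 XT, naive upper bound: {rx_6900xt / rx_5700xt:.2f}x")
```

That lands around 2.5x on paper, so even with far-from-perfect scaling, "roughly double" isn't a stretch.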

    Yeah, I think 6GB would make the most sense for the 6500XT. That's pretty much as low as you want to go for 1080p gaming.
     
    Venix likes this.
  6. Jawnys

    Jawnys Master Guru

    Messages:
    225
    Likes Received:
    55
    GPU:
    asus tuf oc 3090
Exactly. I built a modest system for a friend last year with a 3600X and a 5700 XT, and he games at 1440p over 100 fps all the time. And yeah, the 5700 XT was around $500 here in Canada, that's $200 less than the 2070S. Big win for people on a budget, especially since RTX is a no-go anyway with a 2070 if you want high fps.
     
  7. Borys

    Borys Member Guru

    Messages:
    172
    Likes Received:
    58
    GPU:
    MSI 1660 Gaming X
Well... AMD already showed how high the clocks can be set in the consoles... and we all know the second wave of 7nm chips will probably be the best... I think the RX 6900 XT (the best chips) will hit 2,200-2,300 MHz out of the box... this will be very nice and will give the market a nice jolt.
I can't believe there are people who believe AMD won't even beat 2080 Ti performance... come on guys, that's the past! Prepare yourselves for the real battle of the 6900 XT at 2.3 GHz with 16 GB vs the 3080 with 10 GB... "the war has already begun".
     
    mohiuddin, Undying and Fediuld like this.
  8. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
Um... what? According to this, the 6900 XT has twice as many cores, with a lower clock (base, anyway). Where exactly are you getting your information?

All of this is speculation; so far, though, this looks underwhelming, but also not surprising. It's really right where I expect it to be, as I haven't tried to overhype the 6000 series as some on this forum have been trying to do.

^ Example... yourself.

How about we all wait for them to be released rather than overhyping something just because you're excited? When has that ever been good? lol


In all honesty it's a strange opinion.

The RX 5700 XT is a 251 mm² die with an original MSRP of $500, lowered to $400 before release.
The RTX 3080 is a 628.4 mm² die with an MSRP of $699.
The RX 6700 XT should replace the 5700 XT bracket and is rumored to be a similar die size. It'll likely cost about the same as a 5700 XT, $400-ish at release, unless AMD is planning to shift around its entire product stack pricing, which is possible.

My predictions (with a rough price-per-mm² sanity check below):
RX 6700 XT: 251 mm² die (rumored), $400-500
RX 6800 XT: 340 mm² die (rumored), $500-600
RX 6900 XT: 505 mm² die (rumored), $600-800
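The implied price per mm² of die looks like this (Python sketch). The Navi 2x die sizes are rumours and their dollar figures are just the midpoints of the guesses above, so treat the output as illustration, not analysis.

```python
# Implied launch price per mm^2 of die, using the figures quoted above.
# The 6000-series sizes are rumours and their prices are guess midpoints;
# the 5700 XT and 3080 rows use their launch MSRPs.
cards = {
    "RX 5700 XT": (251.0, 400),   # launch price after the pre-release cut
    "RTX 3080":   (628.4, 699),
    "RX 6700 XT": (251.0, 450),   # rumoured size, midpoint of $400-500
    "RX 6800 XT": (340.0, 550),   # rumoured size, midpoint of $500-600
    "RX 6900 XT": (505.0, 700),   # rumoured size, midpoint of $600-800
}

for name, (area_mm2, price_usd) in cards.items():
    print(f"{name}: ~${price_usd / area_mm2:.2f} per mm^2 of die")
```

By that crude measure the 6700 XT and 6800 XT guesses sit around the 5700 XT's launch $/mm², and even the 6900 XT bracket stays well above the 3080's, so these predictions aren't assuming AMD suddenly sells silicon much cheaper than it already does.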

Even if it's $600, and I highly doubt it'll be less than that (not saying I wouldn't be glad if it was, that'd be great), that's a $100 saving. There's nothing about a $100 saving that says "drastically lower price", unless your definition of drastically is far different from, well, the general populace's.

I fully expect the 6900 XT to be the same price as the 3080, maybe $50 less, to lose to it in most titles, but to sell well simply because of the added VRAM, which won't actually help it in the long run and just makes it more expensive than it needs to be. And I won't be surprised if it's more expensive than the 3080 because of the needless amount of RAM. But I don't expect it to be.

Of course, AMD could surprise me, but I'm not going to hold out on the idea that AMD will surprise me when they wanted the 5700 XT to go for $500 and only lowered it because they had to.


    I really just can't wrap my head around statements like this.

    "We know how a customized architecture PS5 with 36 CUs can reach 2.23Ghz, therefore, a much larger one can obviously hit the same, or better!"

    One, what about custom do people seriously not understand?

And two, top-end products almost never have the highest frequencies; often the lower-end products have higher frequencies because they are capable of getting there, unlike the higher-end ones. This isn't always the case, but I just can't wrap my head around looking at the PS5, with its high frequency and low CU count, and somehow coming to the conclusion that a part with more CUs will reach the same frequencies, especially since we already know the Xbox Series X has a higher CU count and a lower frequency, which is how it typically goes.

    The only time this standard isn't kept is when:

1: The frequency ceiling of an architecture is hit, and it realistically doesn't matter how well cooled it is; for the most part you can only go so high, so all products end up very similar in frequency, just with higher core counts / larger dies.

    Or

2: When a company artificially lowers its lower-end products' frequencies, even though they could definitely reach those frequencies, being smaller dies with lower wattage, fewer heat issues, etc.
     
    Last edited: Sep 29, 2020
    ZXRaziel and Ricardo like this.
  9. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE
Keep up with the news, mate.
    https://wccftech.com/amd-navi-21-22-and-23-massive-technical-specification-leaked/
    https://www.notebookcheck.net/AMD-R...-PlayStation-5-GPU-clock-speeds.494558.0.html

6900 XT: 2.2 GHz boost clock; the rest, 2.5 GHz.

And we have KNOWN for over 6 months now, from three places, that these GPUs will be over 2.3 GHz.
a) They are on the N7P process, not N7 (not to be confused with N7+).
b) Cerny said so numerous times in the PS5 presentations, mentioning that the architecture and process allowed for much higher speeds than the PS5's 2.23 GHz.
c) MS saying they kept the clocks low to keep power consumption and temps down, hence going with a big chip.
     
  10. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,738
    GPU:
    3080 Aorus Xtreme
I'm very interested to hear how much of the 6900 XT's die the dedicated ray tracing hardware takes up.
     
    AuerX and Valken like this.

  11. k3vst3r

    k3vst3r Ancient Guru

    Messages:
    3,702
    Likes Received:
    177
    GPU:
    KP3090
Depends on the performance level of the 6900 XT. If it's within 5-10% of the 3080, expect an MSRP of $649. If it's faster, it might be as much as $749, especially if it's right behind the 3090 in performance. AMD could say: look, 3090 performance at half the price.
     
    EspHack, Maddness and Valken like this.
  12. Mpampis

    Mpampis Master Guru

    Messages:
    249
    Likes Received:
    231
    GPU:
    RX 5700 XT 8GB
I don't expect AMD to beat the 3090 or even the 3080 (I won't even bring RT into the discussion), but if all they have to offer is a GPU that performs better than the RTX 3070 and is maybe cheaper, that IS overhyping. Because it was AMD, not the fans, that said they would compete in the high-end segment.
And when you say compete, performance, not perf/price ratio, is implied.
Yes, I know AMD is a perf/price ratio winner, I've always bought AMD cards (5700 XT now, RX 480 before that, HD 4850 before that), but that's not what AMD said.
And a GPU that performs better than the 3070 but significantly worse than the 3080, at a much better price, doesn't cut it.
On the other hand, these specs could well be plain wrong, or RDNA 2 could get a hell of a good improvement compared to RDNA.
     
    ZXRaziel and AuerX like this.
  13. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE
All of it. Each dual CU can do either normal texture work or RT; how much RT is needed depends on the application. If the application doesn't use RT, the whole GPU is used for normal texture work.

Below is the RDNA 2 dual compute unit as found in Navi 2x, the Xbox Series X/S and the PS5.

[image: RDNA 2 dual compute unit diagram]
     
    Last edited: Sep 28, 2020
    EspHack, Undying, Valken and 2 others like this.
  14. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,780
    Likes Received:
    1,393
    GPU:
    黃仁勳 stole my 4090
On paper it already beats the 3090, it's not going to actually run at 1.5 GHz (I expect a minimum of 2.2 GHz reference), and it has a lot of features you don't know about... ray tracing included, and with a DLSS equivalent in the works. The bad driver reputation is largely exaggerated nonsense from fanboys, and coming from me that means a lot. Few people have experienced AMD's BS Radeon driver issues more than I have, and I can say with full confidence nVidia has been nearly as bad.

If the rumoured $600 USD price tag, undercutting nVidia by $100, turns out to be true... this could be a win for AMD, unless their drivers/support and a potential lack of a DLSS equivalent in time screw things up for them. If the wild rumour of price cuts to $550 or even $500 to kick nVidia in their overpriced and overhyped nuts turns out to be true... that'd have to be a win.

To really take a stab at nVidia and make people buy a 6900 XT instead of a 3080, I really believe they need to make it $550 or less. Even at $100 lower than a 3080, while being faster, a lot of people are going to buy nVidia instead because they don't trust AMD's Radeon group and/or are fanboys. The nVidia fanbase borders on Apple territory with their aggressive worship of their corporate overlords.

    Edit: Apparently I like leaving sentences incomplete... for great justice or something, I don't know.
     
    Last edited: Sep 28, 2020
    EspHack, Vlooi, itpro and 3 others like this.
  15. rl66

    rl66 Ancient Guru

    Messages:
    3,924
    Likes Received:
    839
    GPU:
    Sapphire RX 6700 XT
And the hype war restarts... lol
Let the GPU be reviewed here (for example) before saying it will be better or worse than another GPU.
About the bandwidth, remember the GTX 960/950... the bandwidth was tiny, but it was compensated for with compression, and it was really great in its time...
Never take the hype train; only facts matter, and right now we only have an incomplete preview of the specs.
     
    ZXRaziel likes this.

  16. Silva

    Silva Ancient Guru

    Messages:
    2,048
    Likes Received:
    1,196
    GPU:
    Asus Dual RX580 O4G
This is all probably a bunch of BS, people just playing the guessing game and making headlines.
I don't believe AMD will beat the 3080 in performance, but I believe it can beat it on price and finally start a war.
    Fury and Vega were failures, RDNA is the beginning of a long road to catch up.
    I think they will, eventually, not this generation.
     
  17. AuerX

    AuerX Ancient Guru

    Messages:
    2,537
    Likes Received:
    2,332
    GPU:
    Militech Apogee
What's the price difference between a Ryzen 3800XT and an i7-10700?

I don't think AMD is going to go cheap with their GPUs unless they are slower than their Nvidia equivalents.

    Anyone waiting for a $500 "3080 Killer" is going to be disappointed.
     
    Fediuld likes this.
  18. AuerX

    AuerX Ancient Guru

    Messages:
    2,537
    Likes Received:
    2,332
    GPU:
    Militech Apogee
Most people stick to a brand that they're familiar with and that hasn't given them any major issues.
Why switch from one to the other if things keep working for you?

A $100 price difference is nothing for peace of mind, and that's what drives a lot of consumers who aren't enthusiasts or in a GPU cult.
     
  19. ninja750

    ninja750 Master Guru

    Messages:
    366
    Likes Received:
    89
    GPU:
    RX 6800 REF
My bet is $650 for Big Navi and $550 for the cut-down "flounders".
     
    Denial likes this.
  20. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,780
    Likes Received:
    1,393
    GPU:
    黃仁勳 stole my 4090
To get the most for your money. There are a lot of us who will stray from comfort to get the most we can for our money, ESPECIALLY when it comes to GPUs in the Jensen Huang age of "Frack you, we're doubling prices this gen, even though we already doubled them recently, pay me".
I'm well aware; that's why I really hope AMD aims for $500 USD to try to claw back some of the market share.
If they decide on those prices they're legitimately retarded. I'd buy nVidia just to spite them if they did that. You can't launch months after your opponent at nearly the same price while having fewer features out of the gate (I don't think they'll have their version of DLSS ready any time soon) and a bad reputation.

The best case scenario sounds like it'll be an Oct 11 paper launch, assuming no more delays. That means between the scalping bots (which AMD will do absolutely NOTHING about, no matter what they claim) and low volume to begin with (IDGAF if Lisa herself flies to my home to assure me they will have stock; they absolutely will not), you'll be lucky to have the honour of crapping out your hard-earned cash for one by the end of November.

I can't repeat this enough: you can't be that many months behind your opponent, have a crap reputation, and even have your own fans angry at you, then ask for nearly the same price. That's just retarded.

If they launch it at $650 USD and don't have something to match DLSS, I really might just say frack it and buy a 3080, even if it performs worse without DLSS. AMD needs to know very clearly that they can't position themselves as a seller of borderline luxury goods if they can't match features and are months late.

And that's in my case, where I care a lot about RAM for things outside of gaming. I'm super pissed at the cheapass 10 GB on the 3080 while they're charging so much they could deliver 30 GB if they wanted and still make a profit. AMD are taking so long to launch that nVidia might have the 20 GB 3080 on shelves by the time there's real stock of the 6900 XT in late November or December.
     
    Last edited: Sep 28, 2020
