Review: Sapphire Radeon RX 6800 XT NITRO+

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 25, 2020.

  1. Obsidian_Oasis

    Obsidian_Oasis Active Member

    Messages:
    57
    Likes Received:
    21
    GPU:
    ASUS RTX 2070s
    Never ever heard of an MMO in my life. Whatcha talking?! :confused:
     
  2. kapu

    kapu Ancient Guru

    Messages:
    5,418
    Likes Received:
    802
    GPU:
    Radeon 7800XT
    I had the option of a 3070 or a 6800, almost the same price here. It was a no-brainer for me: at 1080p the 6800 destroys it, and it has double the VRAM. It's a matter of preference. RT/DLSS is not a thing for me yet; in 2-3 years maybe, and then I'll switch to Nvidia, or to AMD RDNA3 if it's good. This generation we have true competition; you can't go wrong with either choice imo.
    But for me the AMD pros:
    + At 1080p the 6800 often matches the 3080, which is more expensive.
    + It has much more VRAM (no need to worry that 16GB will be obsolete in the foreseeable future).
    + Much better power efficiency.

    In MY book, those pros beat:

    DLSS, RT.
     
    Valken likes this.
  3. Obsidian_Oasis

    Obsidian_Oasis Active Member

    Messages:
    57
    Likes Received:
    21
    GPU:
    ASUS RTX 2070s
    Double the VRAM does not mean better performance at higher resolutions, though, especially with the narrow memory bus on AMD's 6800/XT.

    I'm still optimistic about the performance of the 6900 XT, but it's not looking good for it either, since the only major difference between the 6800 XT and the 6900 XT is 72 CUs vs 80 CUs.

    Time will tell.
     
  4. kapu

    kapu Ancient Guru

    Messages:
    5,418
    Likes Received:
    802
    GPU:
    Radeon 7800XT
    I'm not interested in anything higher than 1080p/1440p (maybe I'll switch to a 27" 1440p screen next year). At those resolutions RDNA2 is very good.
    For 4K, the 3070 with 8GB is a joke, which will soon be shown when next-gen games appear. Overall the card is good, but I couldn't accept 8GB of VRAM.
     
    Valken likes this.

  5. Obsidian_Oasis

    Obsidian_Oasis Active Member

    Messages:
    57
    Likes Received:
    21
    GPU:
    ASUS RTX 2070s
    You would think AMD's plan in going with the Infinity Cache was to go cheap and save money for itself and its customers, but AMD failed to share those savings with consumers and instead jacked the price up by $50, with its partners adding a minimum $120 hike on top. Huh?!?!
     
  6. xrodney

    xrodney Master Guru

    Messages:
    368
    Likes Received:
    68
    GPU:
    Sapphire 7900 XTX
    @Hilbert Hagedoorn, can you please check whether the Nitro+ HDMI port is 2.1 or 2.0b?
    If it's 2.0b it would be a bit of a letdown, but that's what some shops are listing.
     
  7. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Should be a single HDMI 2.1 and 3x DisplayPort 1.4 outputs.
    But yeah, that's just what is listed; not sure if anyone has plugged it into a 2.1 TV (or one of the few computer displays on the market that have it already) to actually verify this.
     
  8. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    Having double the VRAM is a negative for the product, not a positive, as it would perform the same with half the VRAM and would therefore cost less.

    More VRAM that doesn't help performance and increases cost = bad, no matter which way you look at it.


    In 3+ years? Maybe. But by that point, anyone buying one of these cards will likely be interested in a newer GPU that performs much better than a 6800/XT.

    More VRAM doesn't magically make a GPU better, and this can be seen in how the 6800/XT fall off at high resolutions, even though, aside from the 3090, they have more VRAM than Nvidia's offerings.

    This should be a staple in tech-oriented forums: don't pay attention to the specs, pay attention to how it performs. If you only pay attention to specs, then a 128-core/256-thread system with 256GB of RAM and an RTX A6000 with 48GB of memory should be completely and totally worth it for gaming, 5-10 times faster than the fastest normal gaming system, right? .....right....
     
    Last edited: Nov 26, 2020
  9. xrodney

    xrodney Master Guru

    Messages:
    368
    Likes Received:
    68
    GPU:
    Sapphire 7900 XTX
    8GB is not enough for 4K in some cases; Godfall can go over 12GB.
    11GB is the absolute minimum I would accept for a current high-end card, and I would not go below 8GB for lower midrange cards.
     
  10. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Yeah, AMD's bill of materials on the RX 6800 (XT) is higher than that on the RTX 3070, and not by a little. It is actually on par with, or a bit higher than, the RTX 3080's.
    So the reference price? Quite justified. The AIBs' price? Comparing AMD's AIB pricing to nVidia's reference pricing is stupid.
    Either compare reference pricing to reference pricing, or custom pricing to custom pricing.
    Because AIB pricing for nVidia RTX cards is pretty abysmal compared to AMD's RX reference pricing too.

    @Aura89 I would not recommend the 3070/6800 for 4K either. 1440p, any day. People will get comfortable gaming for a few years without needing to deal with performance issues and detail tuning. At 4K, those cards are not plug-and-play with a good experience; games already need detail-level tuning. Sure, turning down or disabling a few unimportant settings will let those games run 4K at good fps. But the usual guy just takes the global settings and turns them from Ultra to Very High. Then, when he realizes that is not enough, he goes to High, and he ends up at the Medium preset, losing much more IQ than he needed to. At 1440p he would be fine.

    And it is not a question of 3 years in the future. Such situations already exist in current games, even without crippling performance with DX-R.
     

  11. xrodney

    xrodney Master Guru

    Messages:
    368
    Likes Received:
    68
    GPU:
    Sapphire 7900 XTX
    I have a 2080 Ti and a WQHD monitor, and in the past I did occasionally run out of video memory, causing games to stutter, freeze or even crash. If that happened with 11GB at WQHD, it would be much worse at 4K with 8GB or even 10GB.
    It's just a few games today, but RDNA3 isn't coming until around the end of 2022, which is too long to risk not having enough video memory, especially on cards with prices approaching $1k. At $300, then maybe, but I would still rather pay $50 for a few gigabytes more.
     
  12. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    First of all congrats on your new card. You got yourself a great GPU and I hope it serves you well.
    But how about putting the efficiency numbers into the grand perspective?

    According to 4,090 individual benchmarks, including guru3d's:
    • The 3070 is 3.2% more efficient than the 6800 at 1080p.
    • The 6800 XT is 1.4% more power efficient than the 3080 at 4K (6.8% at 2560x1600).

    Anyone desiring efficiency with top performance, but unwilling to get a 3090, might consider getting a 3080 and dialing it down to the 6800 XT's performance level, ending up with eye-tearing perf/W.

    Classic rendering. nvm DLSS. Let alone RT.
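    To make that perf/W math concrete, here's a rough back-of-envelope in Python. The fps and wattage figures are made-up placeholders to illustrate the method, not measurements from any review:

    ```python
    # Toy perf/W comparison. All fps and watt values below are
    # illustrative assumptions, not measured numbers.
    def perf_per_watt(fps: float, watts: float) -> float:
        return fps / watts

    # Hypothetical 4K averages for two stock cards:
    fps_3080, w_3080 = 100.0, 320.0      # assumed RTX 3080
    fps_6800xt, w_6800xt = 98.0, 300.0   # assumed RX 6800 XT

    print(perf_per_watt(fps_3080, w_3080))       # ~0.313 fps/W
    print(perf_per_watt(fps_6800xt, w_6800xt))   # ~0.327 fps/W

    # "Dial the 3080 down to 6800 XT performance": a small perf loss
    # usually buys a disproportionate power drop (assumed numbers again).
    fps_dialed, w_dialed = 98.0, 260.0
    print(perf_per_watt(fps_dialed, w_dialed))   # ~0.377 fps/W
    ```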
     
  13. jctl

    jctl Member

    Messages:
    45
    Likes Received:
    10
    GPU:
    AMD Radeon HD 7600M
    JonasBeckman likes this.
  14. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Was expecting closer to 1000, better than the numbers I was looking at, and with an ETA of 2021 for when there's anything in stock too. So yeah: pay over the expected price even with VAT on top of the MSRP, and then wait four to six months.
    It's about the same as the stock situation for the 3080s, though hopefully the pricing actually comes down when availability improves.

    EDIT: Yeah, 9500 SEK plus shipping and such for the one store I've found that lists these.
    So 930-something EUR.

    https://www.inet.se/produkt/5411936/sapphire-radeon-rx-6800-xt-16gb-nitro

    Actually free shipping, though (but not expected until early 2021), but at 3000 or 4000 SEK above the standard versions, yeah.
    (Granted, those are pretty much gone and little to no further stock is expected, so it's going to be 7000-8000 SEK as the baseline pricing, then another 1000-2000 for the higher-tier variants.)
     
  15. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Wrong context. Very wrong context. Very sad to look at, if you mean it seriously.

    So here are points which should never be ignored, yet you ignored them:
    - Higher-end cards tend to have worse performance per watt, and at the same time worse performance per $. => The 3070 sitting below the 6800 while those metrics are so close says that the 6800 is the clear winner.
    - At 1080p the CPU is the bottleneck in many games. => That gives a numerical advantage to the weaker 3070, as the 6800 can't show how much stronger it really is. But that situation will change over time with much heavier games.

    As for the 3090, it is not efficient in any way. The card needs extra power above its official TBP to properly pull away from the 3080. And as for that same image being pushed around over and over again:
    Not replicable. Every forum and every tech site disagrees with your WTFTech source.

    And it is really sad that anyone pushes that site around like gospel, just because its numbers are nice.
     

  16. kapu

    kapu Ancient Guru

    Messages:
    5,418
    Likes Received:
    802
    GPU:
    Radeon 7800XT
    Bad perspective. The 6800 is quite far ahead in new games. I don't care about older games with CPU bottlenecks. Also, its performance per watt is much better. If you like, I can post some graphs. The only thing the 3070 beats the 6800 at is DLSS/RT in some games. Other than that it gets destroyed.
     
  17. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    Complete nonsense. The 3090 is equally as efficient as the 6800XT at 4K.
    (A -0.7% difference, according to the collection of 4,090 benchmarks.)

    I know you can post some benchmarks.

    I posted them ALL.

    You did say RT is not for you. Somehow I didn't conclude from that that you only care about new games :)
     
  18. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Who cares? The 6800XT is not a 4K GPU. Neither is the 3090.
    You might as well go to 8K benchmarks, where the 3090 will have a solid lead.

    I have seen and read what actual owners of those cards post. That's enough for me.
    https://www.techpowerup.com/forums/threads/rtx-3080-undervolting-adventures.272936/post-4363787
    https://bjorn3d.com/2020/10/undervolting-the-rtx-3080-and-the-rtx3090/2/#split_content
    https://www.reddit.com/r/nvidia/comments/iwt953/rtx_3080_to_undervolt_or_not_to_undervolt_that_is/
     
  19. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    Here we go again with the famous tactic of: screw the multitude of (4,090) benchmarks, let me throw some endless garbage from random users at you that I myself didn't bother to read.
    Because if you had read it, you'd have seen it says:

    "Even at 850 mV/1860 MHz we see a big gain in all fronts."

    Point if any?
    This is overclocking. With undervolting.

    Plz stay focused: your claim was that the 3090 "is not efficient in any way."
    It turns out that, according to the collection of 4,090 benchmarks, the 3090 is 0.7% less efficient than the 6800XT while being 20% faster. Meaning the 3090 smashes the 6800XT at the same performance level (sketched below).
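    Spelled out as a sketch, taking the 20% and -0.7% figures above as given; everything past them is an assumption:

    ```python
    # Relative comparison at stock, RX 6800 XT normalized to 1.0.
    perf_3090 = 1.20          # 3090 is 20% faster (figure quoted above)
    eff_3090 = 1.0 - 0.007    # and 0.7% lower perf/W (figure quoted above)

    # perf/W = perf / power  =>  power = perf / (perf/W)
    power_3090 = perf_3090 / eff_3090
    print(round(power_3090, 3))  # ~1.208x the 6800 XT's stock power draw

    # The "smashes at the same perf level" step rests on an extra
    # assumption: perf/W typically improves when a card is dialed down
    # below stock clocks, so at 6800 XT performance the 3090 would draw
    # less power than it does at stock.
    ```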

    Time to say I was wrong..... maaaybe... try it for once?
     
  20. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Well, what you call garbage are the actual owners of the cards, your fellow enthusiasts. And what you call gospel is the worst site there is in the PC HW space.

    If one site should be taken as gospel here, it would be G3D. But I do respect actual owners of the cards and their data, because they have to live with the cards, and they'll notice an unstable undervolt and increase the voltage until their card is stable.
    That's in contrast to those who just run 3DMark and, if there is no crash, declare the UV successful.

    And with all those UV stories being pushed around, I have a feeling that Hilbert may have an article in the works. Maybe even some comparison between different manufacturing processes.

    Now to the bold part: yeah, except you again ignored the reality of "performance per watt per $". Here it is for you in a dumbed-down way (sketched in code below):
    - Since you have already delivered normalized performance per watt, only performance per $ remains to be counted.
    -> The 3090 is double the price of the 6800XT while providing 20% higher performance at the same power draw.
    -=> In simple numbers: 1x power draw, 1.2x performance, but 2x price. That's the reality of the 3090. (And that's at 4K, where the 6800XT's performance sux due to memory bandwidth.)
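    As a sketch, plugging in those same relative numbers, taken as given from the post above; no actual prices or wattages are assumed:

    ```python
    # Relative perf-per-watt-per-$, RX 6800 XT normalized to 1.0.
    perf, power, price = 1.2, 1.0, 2.0   # 3090 relative to 6800 XT (from above)

    perf_per_watt = perf / power     # 1.2x -- the perf/W side of the claim
    perf_per_dollar = perf / price   # 0.6x -- the per-$ side left out

    print(perf_per_watt, perf_per_dollar)
    # 20% more performance per watt, but 40% less performance per dollar.
    ```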

    All your arguments are based on tiny fragments taken out of the big picture, because the big picture is the exact opposite of what you would like it to be.
     
