Review: AMD Radeon RX 6700 XT (reference)

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 17, 2021.

  1. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    what are you even reporting here ?
    base clocks ? turbo clocks ? all core ? single core ?
    how does this number relate to the number of rtx 3090 owners, the card used for their tests? is it more than 5.02%? is that who you are concerned about? people who can afford a 3090 but can't get a 4GHz cpu because life is hard?
    I like a good, informative piece of tech journalism, and this is not that.
     
    ManofGod likes this.
  2. CronoGraal

    CronoGraal Ancient Guru

    Messages:
    4,194
    Likes Received:
    20
    GPU:
    XFX 6900XT Merc 319
    This is a bad time to be a PC enthusiast. Sadly a lot of us do well enough in life to pay the premiums when in reality our communities should basically sit back and refuse to buy a damn thing after prices spill over a certain threshold.

    AMD and Nvidia don't really care, their stock is getting sold like crazy regardless of any prices they set.
     
    Robbo9999 likes this.
  3. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    Well, I remember a thread about GPU availability going back to normal by mid-2021. I said then that we were looking at at least the end of 2021. Considering where we are today, I can see this shortage lasting two years.

    Also, every next-gen card is going to have exactly the same problem. I think Nvidia's 4000 and AMD's 7000 GPUs will be even worse. The only thing that could stop it is if mining were banned, and considering many banks have already invested in it, we know that's not happening.

    What that means for the consumer is a wait of a few years before being able to get the "latest" thing. Someone might only be able to get an RTX 3080, for example, around 2022-2023 at retail prices, just as the RTX 4000 series launches. We'd see a trend of the majority of people having to buy the previous generation (at retail prices) due to the lack of stock of the latest-gen products.
     
  4. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    The RTX 3090 was not the only GPU used in the comparison showing differences between brands, right?
    They did show the 3080 vs the 6800 XT. And the difference was noticeable even between the RTX 2060 and the RX 5600 XT. (Which are about the same in the usual OC-CPU benchmarks, yet with regular non-OC 6-core CPUs the RX 5600 XT had the advantage.)

    Those are really not top-dog GPUs, right? Those are quite nicely priced mainstream GPUs. $279 is not a high MSRP.

    Did you by accident try to take the entire thing out of context? The problem affects nVidia's entire GPU stack.
     
    Last edited: Mar 28, 2021

  5. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    care to answer my question ?
    what is being reported in those clock charts ? hm ?
     
  6. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Base clock. That's simple. In other words, the clock 4/6-core CPUs run at in CPU-intensive games. And that is the reason why non-OC CPU results are more important: most CPUs in gaming PCs simply can't be overclocked past their boost clocks.
     
  7. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    then do not accuse ME of taking things out of context, alright sport?
    Base clock for Intel has nothing to do with anything except running desktop apps. The 2500K I had a decade ago was 3.3GHz base; my current one has a 2.9GHz base and 4.6GHz turbo. But what difference does 1.7GHz make to you.
     
  8. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Turbo depends on number of loaded cores.

    AMD's GPU in system: less threading in driver, smaller load on CPU = Higher clock due to fewer loaded cores
    nVidia's GPU in system: more threading in driver, higher load on CPU = Lower clock due to more loaded cores
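    As a rough illustration of that mechanism (a minimal Python sketch; the turbo table below is made up for the example, not taken from any real CPU's spec), the sustained clock falls into a lower turbo bin as more cores are loaded, so extra driver worker threads can cost clock speed on the game threads too:

```python
# Illustrative only: map "number of loaded cores" to the highest turbo bin allowed.
TURBO_TABLE_GHZ = {1: 4.6, 2: 4.5, 4: 4.3, 6: 4.1, 8: 3.9}

def effective_clock(loaded_cores: int) -> float:
    """Return the turbo bin that applies for a given number of loaded cores."""
    for cores in sorted(TURBO_TABLE_GHZ):
        if loaded_cores <= cores:
            return TURBO_TABLE_GHZ[cores]
    return min(TURBO_TABLE_GHZ.values())  # fully loaded, or beyond the table

print(effective_clock(2))  # game threads only              -> 4.5 GHz
print(effective_clock(6))  # game threads + driver workers  -> 4.1 GHz
```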
    - - - -
    And yes, when nVidia's entire stack is affected and the problem is visible even on mainstream GPUs, the attempt to frame it as a 3090-only discussion is taking things out of context.

    You made an argument:
    You took a $1,500-MSRP GPU as the basis for arguing that people who would be affected can just as well buy a proportionally expensive CPU.
    But the reality is that $300 GPUs are affected too. And $300 is not some magical end of the problem; it is simply the lowest they tested on both sides. And it still showed quite a difference.

    Now, go and make a statement on what the usual CPU clock while gaming is for 90% of gamers. You can look at the Steam stats on CPU core counts, brands and base clocks.
    You will not like it, as it does not fit your narrative.

    Only a tiny fraction of nVidia users are not affected by this.
    - - - -
    And that post of yours about increased boost clocks? Composition fallacy. So here is the correction: "Some people have new non-K Intel CPUs which clock up to 4.6GHz under a few lightly threaded workloads. But that does not mean all people have them."
    The overwhelming majority still have those lower-clocked CPUs.
     
    kapu likes this.
  9. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    show us how many intel cpus come without turbo. please. or don't bother responding.
    :rolleyes::rolleyes::rolleyes::rolleyes:
    they did
    as if you could know that

    please...stop

    I gotta watch the second video, but knowing HUB, it's misleading one way or another. Why isn't any normal tech site reporting on this?
    I'm trying to find average performance charts; there is a part of a video with this title, but no charts.
    why ?
     
  10. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Bold: Read what you quoted. My statement: "Turbo depends on number of loaded cores."
    Your reply: "show us how many intel cpus come without turbo."
    Where did I imply that Intel's CPUs have no turbo? Your reply is a complete false construct presuming I wrote something I did not!

    Red: They showed the same effect on a wide variety of GPUs. You specifically took a GPU with an MSRP of $1,500 as the basis for a fallacious argument.

    Underlined: Only a tiny portion of gaming systems are running overclocked CPUs. You know that too. Asking me to "stop" will not change reality.
     

  11. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    Yes, and there is a reason why turbo is called turbo and base is called base. I would imagine a person of your knowledge would figure it out.
    Yeah, I'm just seeing the second video now, wasn't aware of it before. In the first one they tried a 3090 and a 10100. They must've felt the response wasn't positive.

    There are just a couple of games out of those they chose where the RTX 2060 is comparable to the 5600 XT, as in very close in performance. Results are okay there, or the impact is very tiny, e.g. in Death Stranding.
    They are limiting the CPUs either on thread count or on single-core speed. There is certainly something going on there on the nVidia side in those cases.

    9400F + 2060 is a good, realistic example. But why is no 6/12 CPU tested in either of those videos?

    From what I can gather, HUB made two videos, and while the problem was evident in the first one already, they made something like 50 minutes of material and it's still impossible to judge the extent of the problem. Results vary from margin-of-error when hardware is paired well to huge when they're testing entry-level CPUs with high-end cards.

    I still haven't got the answer to the first thing I asked, though. Is this something nVidia can address and fix or alleviate, or is it just how Turing and Ampere work?
     
    Last edited: Mar 28, 2021
  12. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    The i5-9400F being a 6C CPU. Not long ago, this very forum would argue that 6C/6T CPUs were going to last for another 4 years.
    And when people come to me for new-system advice, or to build it for them, I sometimes have to persuade them to get 6C/12T because they think they would be "fine" with 4C/8T.
    And I even tell them that when they build a new system which is to last until AM5 prices settle, they should aim for 8C/16T.

    So while 6C/12T CPUs may help in those constrained situations, they would have a lower turbo clock. And people on the ground generally felt OK-ish with fewer cores.
    - - - -
    The thing you wrote about other tech sites not reporting similar results, or not testing at all, is simple: they either decided against it or have articles in the works.
    Because even an article saying "We did tests on this and this HW and found no difference from our OC test platform" would be valuable and could make for a big article.

    So maybe when HH is done with his tests, he will bring much more detailed data.
     
    kapu likes this.
  13. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    This is a different kettle of fish. How good or bad 6/6 is, is another matter. The question is how much of a performance hit there is across a stack of CPUs, and it's a question HUB has not answered. Maybe it gave other tech journalists a clue to check that out, though.

    6/6 was never an option for me, and frankly neither was 8/8. I only got 6/12 because it was cheap at the time, but the plan was to get 8/16 from the very beginning once they dropped in price. Going with Intel turned out better than AMD; I paid less for a 10500 than a 3600 and got a faster CPU, same for the 10700F vs the 3700X.
    I'd like an example of current midrange tested, e.g. a 10400.

    And let's not kid ourselves: a review that found no difference would earn HUB very few clicks, and one that put Nvidia in any way over AMD would straight up set their comment section on fire and do tremendous damage. YT channels are not like tech sites. If you wanna make money, you gotta find the right people to subscribe with content suited for them. That's why when I watch a video on human evolution, the next thing they suggest I watch is not Genesis to show me all kinds of different perspectives.
     
    Last edited: Mar 28, 2021
  14. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
    As a quick aside (& related), I'm in a queue for a 3080 GPU, and I'd be planning to use it with my 6700K, which is at 4.69GHz with 16GB (2 sticks) of DDR3 RAM at 3233MHz (14-15-15-32-240-1T, dual rank). I was/am concerned that I'll get less fps in CPU-limited games with my prospective 3080 vs my existing GTX 1070, what do you reckon? (I included my RAM details as dual-rank RAM combined with the quite tight timings I have has proven to boost CPU performance in games.) I've not looked into enough detail to know the answer to this question, but I can see you have investigated this topic and figured I'd get your viewpoint. I've got a 180Hz G-Sync monitor, so in my case I'm aiming for a stable 171fps in games - for example I can keep that stable pretty much constantly in BF1 except on the Amiens map.

    (As it stands I probably won't ever get a 3080 GPU as I believe my vendor will provide a refund rather than live with the loss they'd make given I bought the GPU slightly above MSRP).
     
    Last edited: Mar 28, 2021
  15. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    I'm looking at the same upgrade and the CPU bottleneck is roughly 30% or more. I forget the exact numbers, but there's definitely a bottleneck vs the latest CPUs.
     

  16. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    It's not possible to get lower performance from a stronger GPU when comparing two nVidia GPUs. At worst you'll see no improvement in some games.
    The entire point is that when people have weaker CPUs, they are better off having an AMD GPU. (At least the RDNA ones.)
     
    Robbo9999 likes this.
  17. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    The possibility of not finding a difference was meant for other sites, because an article/video which did find a difference is already out there. So the moment someone disputes it on one or more HW configurations, there is controversy. (Worth a lot of clicks for both sides.)

    And as for AMD vs Intel CPUs, it is not really relevant here. If you can get a better CPU from Intel for your use case, great for you.
    But this AMD vs nVidia effect has been shown on both Intel and AMD CPUs.
    And in places it likely shows the half-rate AVX2 performance of the Zen 1 Ryzen 5 1400, which manages to tank badly.

    For testing, I would go a different route: an 8C/16T CPU from both AMD and Intel. I would set the baseline at a stable 4GHz for both, then exaggerate the problem by clocking them at 3/3.5GHz, then "fix" it by going to 4.5GHz with both, and finally 5GHz on all cores, which is the benchmarking standard for tech sites.

    That way they would use the same setup to demonstrate the difference between tech sites' testing and the average Joe's PC.

    And then it is no problem to disable cores or SMT to show the effect of each. Maybe nVidia's problem is not entirely in core count, but in SMT optimizations which lose their effect once SMT is not available.
    (Can't really say until there is proper side-by-side testing. They pointed a shotgun in the general direction of the problem and did hit something. But for now it is not clearly defined, and the effect of many variables remains unclear.)
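    To make that concrete, here is a rough sketch (a hypothetical Python outline, not an actual benchmark harness) that simply enumerates the test matrix described above: one 8C/16T CPU per vendor, a GPU from each vendor (using the 3080 / 6800 XT pairing mentioned earlier as an example), a few fixed all-core clocks, and SMT on/off.

```python
# Enumerate the proposed test matrix; run_benchmark is a placeholder for whatever
# harness would lock the all-core clock, toggle SMT in firmware, and run the game benchmark.
from itertools import product

CPUS = ["AMD 8C/16T", "Intel 8C/16T"]
GPUS = ["RX 6800 XT", "RTX 3080"]                 # example pairing, one per vendor
ALL_CORE_CLOCKS_GHZ = [3.0, 3.5, 4.0, 4.5, 5.0]   # 4.0 = baseline, 5.0 = typical review OC
SMT_STATES = [True, False]

def run_benchmark(cpu: str, gpu: str, clock_ghz: float, smt: bool) -> None:
    # Placeholder: a real run would record average and 1% low fps for this configuration.
    print(f"{cpu:>13} | {gpu:>11} | {clock_ghz:.1f} GHz | SMT {'on' if smt else 'off'}")

for cpu, gpu, clock, smt in product(CPUS, GPUS, ALL_CORE_CLOCKS_GHZ, SMT_STATES):
    run_benchmark(cpu, gpu, clock, smt)
```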
    - - - -
    Years ago, when I had a 2500K and a Fury X, there was talk about the bottleneck and how some games exhibit it a lot and some don't. AvP 2010 was one which did not really care about the CPU. I clocked the 2500K down from 4.5GHz to 2GHz in 500MHz steps, and the last test I did at 1.6GHz.
    But with many games it was not good anymore; 4C/4T became only good enough for 60fps gaming, and today in some cases not even that. I moved to a 2700X because Vermintide 2 was stuttering like hell and required the fps limiter to be set quite low.
    But that was a CPU bottleneck due to the game's requirements. And the thing is, the issue was due to the game's threads using all available CPU resources and choking everything (GPU driver included) in the background.

    I do wonder if Vermintide 2's situation would be worse with an nVidia GPU. But the game's benchmark has some randomness due to AI. (The user can, however, set "worker threads" from 1 to the number of logical cores minus 2.)
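    The idea behind that cap works out to something like this generic Python sketch (not the game's actual engine code): size a worker pool to leave a couple of logical cores free for the GPU driver and other background work.

```python
# Generic sketch: cap worker threads at logical_cores - 2, mirroring the in-game
# setting's range of 1 .. logical_cores - 2. Not Vermintide 2's actual code.
import os
from concurrent.futures import ThreadPoolExecutor

logical_cores = os.cpu_count() or 4
worker_threads = max(1, logical_cores - 2)

with ThreadPoolExecutor(max_workers=worker_threads) as pool:
    results = list(pool.map(lambda n: n * n, range(16)))  # stand-in workload

print(worker_threads, results[:4])
```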
     
  18. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Inb4 Nvidia disables it and then performance sucks and then everyone is b*tching even worse.

    And still NOBODY has tested Threaded Optimization.
     
  19. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    Same for me: a 3570K OC'd to 4.7GHz, 8 gigs of dual-channel 2200 C11 RAM and an R9 290 Tri-X running at 1150MHz.
    Far Cry 4 was unplayable as soon as you came near any outpost or settlement; out in the open, smooth sailing. There wasn't an option to enable dynamic vsync either, and hacks didn't work.
    And mind you, the 3570K was one of the fastest gaming CPUs back in 2014. It wasn't an entry-level one like a stock 9400 or a Ryzen 1400.
    I got so pissed that I splurged on a GTX 980 and never looked back. It looks like it may come full circle now with Ampere, but first I gotta get that 3080 I'm waiting for.
     
    Fox2232 likes this.
  20. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Yeah, Ivy had quite a few advantages over Sandy. Faster memory did help a lot, and PCIe 3.0 did make some difference too.
    IPC-wise, with the same memory configuration, they would be the same. But the improved IMC made a big difference.

    Before Sandy, I had an i7-720QM. It was not a good time for HT; even people with desktop i7s disabled it to get a higher OC. But threads became important for gaming in time.
     
