Review: AMD Ryzen 7 3800X

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 16, 2019.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    36,282
    Likes Received:
    5,315
    GPU:
    AMD | NVIDIA
    No, that is way too simplified an assumption, as graphics quality evolves year by year, demanding more powerful GPUs. Framerate targets stay in the 60~144 FPS range, so for the most part that range is all that matters for a processor.

    Game developers choose a target FPS, say 60 or 144, and adapt image quality settings to the mainstream hardware available. With faster graphics cards come better graphics. Compare the first and the latest Tomb Raider and notice what happens.

    Do you still game at 2006 graphics quality levels? CPU limitation, therefore, is less of a factor than GPU limitation; it's also the reason so many people still have an older CPU and play their games just fine. You're focusing on the 720p results a little too much, whereas 1440p paints a more realistic picture, as in the end, 99% of the time any PC is GPU limited in games.


     
  2. Ricardo

    Ricardo Active Member

    Messages:
    88
    Likes Received:
    56
    GPU:
    1050Ti 4GB
    Yes, any processor other than an OC'ed 9900K sucks for gaming. You're absolutely right. No question there. Anyone who buys anything else is just wrong. Period. Did I say you're right? Because you are.
     
  3. Richard Nutman

    Richard Nutman Active Member

    Messages:
    87
    Likes Received:
    27
    GPU:
    Sapphire 5700 XT
    I wonder if any or many game studios use Intel's compiler to build? It has been shown to disable optimal code paths at runtime if it detects the CPU is not an Intel one.
    It's just strange how Zen 2 often beats Intel considerably in single-threaded and multi-threaded tests, but when it comes to games, Intel still has a slight edge. An edge that I don't think is worth caring about, but still.
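
    For anyone curious what such a runtime vendor check looks like, here is a minimal sketch of reading the CPUID vendor string on x86 (my own illustration of the general idea, not Intel's actual dispatcher code; __get_cpuid is the GCC/Clang wrapper):

        #include <cpuid.h>    // GCC/Clang x86 CPUID wrapper
        #include <cstdio>
        #include <cstring>

        int main() {
            unsigned int eax = 0, ebx = 0, ecx = 0, edx = 0;
            char vendor[13] = {};

            // CPUID leaf 0 returns the 12-byte vendor string in EBX, EDX, ECX.
            __get_cpuid(0, &eax, &ebx, &ecx, &edx);
            std::memcpy(vendor + 0, &ebx, 4);
            std::memcpy(vendor + 4, &edx, 4);
            std::memcpy(vendor + 8, &ecx, 4);

            // A dispatcher keyed on the vendor string rather than on feature
            // flags would take the optimized path only on "GenuineIntel".
            if (std::strcmp(vendor, "GenuineIntel") == 0)
                std::puts("vectorized fast path");
            else
                std::puts("generic fallback path");   // e.g. "AuthenticAMD"
        }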
     
  4. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    4,495
    Likes Received:
    1,390
    GPU:
    HIS R9 290
    The compiler might have something to do with it, but even if you removed all architecture-specific advantages, I'm sure Intel would still get a slight edge, mostly because of lower latency. For most heavy-compute tasks, latency doesn't matter because they're usually churning a lot of data upstream in their own little bubble, not really needing to synchronize data downstream all that often. The more a thread can do by itself without synchronizing, the less a delay will have any real impact. This is why Cinebench, for example, works so well on AMD - the only time the cores really need to synchronize is to show their progress. But they're not really working together; they're each handling their own chunk of data by themselves and submitting it to the complete scene when they're done.

    For an analogy, take a phone conversation with someone on the other side of the planet: sometimes there can be as much as a one-second delay, but the conversation can still flow smoothly, as though you were face-to-face, as long as each person talks long enough for the other to come up with a response at the appropriate time. But have you ever accidentally talked over someone on the phone, and both people stop speaking? There's usually a long awkward pause, at which point both people decide to speak up again, only to talk over each other a second time with yet another long awkward pause. This is because both sides are trying to synchronize, but synchronization is inefficient when there's such a long delay. When people accidentally talk over each other face-to-face, the problem is typically solved in less time than it took to wait for that first awkward pause over the phone, because the delay is eliminated.

    Games have code that is heavily dependent upon synchronization, particularly with the GPU. So even though AMD doesn't have that big of a latency deficit, it becomes more noticeable when data is being synchronized millions of times per second. The more synchronization you have to do, the worse it gets, which is why AMD performs worse as frame rate goes up (that, and the GPU is being bottlenecked).
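
    For what it's worth, the effect is easy to reproduce with a toy benchmark. A minimal sketch (my own illustration, not anything from the review, and not a rigorous benchmark): one run where each thread crunches its own private data Cinebench-style and reports in once at the end, and one where all threads hammer a single shared counter so every iteration pays the core-to-core latency. The second run is far slower on any CPU, and the gap grows with inter-core latency:

        #include <atomic>
        #include <chrono>
        #include <cstdio>
        #include <thread>
        #include <vector>

        int main() {
            const int kThreads = 8;
            const long long kIters = 20000000;

            // Cinebench-style: each thread works on private data and only
            // "reports in" once, at the very end.
            std::vector<long long> partial(kThreads, 0);
            auto t0 = std::chrono::steady_clock::now();
            {
                std::vector<std::thread> pool;
                for (int t = 0; t < kThreads; ++t)
                    pool.emplace_back([&, t] {
                        long long local = 0;
                        for (long long i = 0; i < kIters; ++i) local += i;
                        partial[t] = local;   // the single synchronization point
                    });
                for (auto& th : pool) th.join();
            }
            auto independent = std::chrono::steady_clock::now() - t0;

            // Game-style: threads touch shared state constantly, so every
            // iteration pays the core-to-core (and CCX-to-CCX) latency.
            std::atomic<long long> shared{0};
            auto t1 = std::chrono::steady_clock::now();
            {
                std::vector<std::thread> pool;
                for (int t = 0; t < kThreads; ++t)
                    pool.emplace_back([&] {
                        for (long long i = 0; i < kIters; ++i)
                            shared.fetch_add(1, std::memory_order_relaxed);
                    });
                for (auto& th : pool) th.join();
            }
            auto synced = std::chrono::steady_clock::now() - t1;

            using ms = std::chrono::milliseconds;
            std::printf("independent: %lld ms, shared-state: %lld ms\n",
                        (long long)std::chrono::duration_cast<ms>(independent).count(),
                        (long long)std::chrono::duration_cast<ms>(synced).count());
        }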
     
    HandR likes this.

  5. MonstroMart

    MonstroMart Master Guru

    Messages:
    591
    Likes Received:
    184
    GPU:
    GB 5700 XT GOC 8G
    Considering the next-gen consoles coming out in 2020 are rumored to have 8-core/16-thread CPUs, I'm not sure I would recommend anything lower than that if someone intends to keep their CPU for 6-7 years. The 7600K looked good when it was released, but the latest revisits of that CPU show it struggling to maintain "acceptable" 1% low fps in some newer titles because it is limited to 4 threads. There are a couple of threads on the BF V forum with 7600K owners complaining about performance.

    6/12 and 8/8 will likely be enough moving forward, but devs often lazily optimize for consoles only, so I'm not sure I would be confident with anything less than 8/16, since that is what the next-gen consoles will likely have. And considering how weak those consoles are, devs will have to use all those threads to push them to their limit toward the end of the generation. Current consoles have 8/8 CPUs, and some newer titles already seem to be optimized for 8 threads.
     
    Last edited: Aug 16, 2019
  6. MonstroMart

    MonstroMart Master Guru

    Messages:
    591
    Likes Received:
    184
    GPU:
    GB 5700 XT GOC 8G
    Yeah, but any difference in fps over 144 is not worth talking about imo. Personally I think it's dangerous to assume a difference in fps at high frame rates will translate to the same percentage difference in a GPU-bound scenario at a lower fps.
     
  7. JamesSneed

    JamesSneed Master Guru

    Messages:
    606
    Likes Received:
    205
    GPU:
    GTX 1070
    It's likely down to the frequency and latencies involved in trying to produce 120+ FPS. Also, some games are simply going to be more Intel-optimized, which gives some percentage of better performance, but that percentage makes for a bigger absolute number when you get into very high frame rates, so it seems like a larger difference than it really is (for example, a 5% gap is about 3 fps at 60 fps but about 10 fps at 200 fps). See this video for a 3700X and 9900K compared at 4 GHz on all cores. AMD is winning in CS:GO in this test and in many of the workstation tests. For the most part, this means AMD needs to get the boost frequency up to around 5 GHz to make the very-high-FPS gamers choose them over Intel, which we will hopefully see on the 7nm+ process next year.
     
  8. Toadstool

    Toadstool Active Member

    Messages:
    87
    Likes Received:
    26
    GPU:
    Vega 64
    The 3800X is more interesting than I thought it'd be. It's still pretty much a stopgap between the 3700X and 3900X, but the bump over the 3700X is more than I expected. Still maybe not as good bang for your buck as the 3700X, but interesting nonetheless.

    Also, karma777police never disappoints.
     
  9. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    4,495
    Likes Received:
    1,390
    GPU:
    HIS R9 290
    I share your opinion, though I know others here would strongly disagree. I personally am satisfied with 60 FPS, which is why I'm still fine with my mediocre 1500X.
     
    sykozis likes this.
  10. MonstroMart

    MonstroMart Master Guru

    Messages:
    591
    Likes Received:
    184
    GPU:
    GB 5700 XT GOC 8G
    Personally I like to have between 80 and 90. I have a 144Hz screen and honestly I can tell when my fps drops into the 60 range. But I'll be honest, I can't tell the difference between 100 and 144 fps. Maybe some can, but personally I can't.
     
    OnnA likes this.

  11. alanm

    alanm Ancient Guru

    Messages:
    8,935
    Likes Received:
    1,301
    GPU:
    Asus 2080 Dual OC
    So about +10% base clock vs the 3700X, yet most of the benches show only about a 2-3% difference. I know bin milking is a common practice, but I would have expected better results tbh.
     
  12. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    12,281
    Likes Received:
    445
    GPU:
    MSI 2070S X-Trio
    They don't suck at all for gaming now. I've just gone from a 5930K to a 3700X, and couldn't be happier, they're great for gaming :)
     
  13. nizzen

    nizzen Master Guru

    Messages:
    786
    Likes Received:
    153
    GPU:
    3x2080ti/5700x/1060
    My 3900x is pretty good @ gaming now after "tweaking" :D


    9900k is still better in gaming, but who cares :p
     
    jaggerwild likes this.
  14. NCC1701D

    NCC1701D Member Guru

    Messages:
    108
    Likes Received:
    42
    GPU:
    RTX 2080 Ti
    I'm wondering if these results are from an older BIOS/AGESA. For my 3700X, I applied the new chipset drivers, and ASRock released an updated BIOS just a couple of days ago for my board. My Cinebench scores are right there with the 3800X scores that HH shows, and the 3700X scores shown are right about what my chip used to get before I updated. I assume the 3800X would be even a notch better than what's shown here, depending on which version is used for testing.
     
  15. Arbie

    Arbie Member Guru

    Messages:
    169
    Likes Received:
    58
    GPU:
    GTX 1060 6GB
    "Overclocking is the weak spot for Ryzen."

    Why do you keep saying this?

    • It assumes that a manual OC is even desirable.
    • It ignores the fact that Ryzen "overclocks" itself just fine.
    • It makes a strength appear to be a weakness.

    People will read this one sentence and conclude that Ryzen is seriously lacking. All the words after that won't matter.
     

  16. Agonist

    Agonist Ancient Guru

    Messages:
    2,747
    Likes Received:
    172
    GPU:
    2x RX 480 Nitro 4GB
    I still see moronic fanboyism running amok with Intel shills.
     
    sykozis likes this.
  17. Evildead666

    Evildead666 Maha Guru

    Messages:
    1,237
    Likes Received:
    242
    GPU:
    Vega64/EKWB/Noctua
    I'm going from an i5-3570K to a 3800X.
    Keeping the Vega64 EKWB though. Seems I've changed almost everything else.
    Can't wait to actually use it watercooled.

    edit : Have a 6670LP in it atm...best aircooled card I have left.
     
  18. Global unlimited

    Global unlimited New Member

    Messages:
    6
    Likes Received:
    3
    GPU:
    2080Ti
    Amen, brother! You and I will stick to 720p gaming while the rest of the uninformed will purchase Ryzen and game on that newfangled 1440p nonsense!

    In all seriousness: if you're gaming at 1080p, Intel is a good choice. Anything above that and either CPU is a good choice. If you're spending the bucks on a 9900K, though, 1080p is probably not the resolution you're going to run at. Unless of course you like those fancy 240Hz monitors, then by all means, get the 9900K. Your creep explanation makes no sense, however, and that will not be forgiven.
     
  19. DW75

    DW75 Maha Guru

    Messages:
    1,161
    Likes Received:
    564
    GPU:
    ROG GTX1080 Ti OC
    I am shocked that karma777police has not been banned yet. Great review as always, Hilbert. I am a 3800X owner, and I am getting higher-than-average stock boosting. My CPU boosts up to 4.55 GHz on 5 cores, and 4.65 GHz on 3 cores. The CPU randomly picks which ones will hit 4.55 and 4.65. I would bet that it will hit even higher clocks once winter comes around. I see no need to overclock this CPU at all.
     
  20. H83

    H83 Ancient Guru

    Messages:
    2,732
    Likes Received:
    406
    GPU:
    MSI Duke GTX1080Ti
    Very nice CPU, but I think the 3700X is a better buy than this one.

    I don't think anyone can spot that kind of difference, but the good thing about having a higher FPS than needed is that it absorbs frame drops much better. For example, if a game is running at 80-90 fps and something more taxing suddenly happens, the frame rate can drop below the 60 mark and you are going to notice it. But if the game is running at more than 120 FPS and the same taxing stuff happens, the frame rate will probably drop to the 80-90 mark and we are not going to notice it.
    So more is better, unless I'm mistaken...

    Great review as always!
     

Share This Page