
AMD Ryzen 5 3600 CPU Benchmarks Surface

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 21, 2019.

  1. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,607
    Likes Received:
    461
    GPU:
    RTX 2070 Strix

    Depends how technical and rigorous you want to be. It depends on what you mean by power consumption, and on whose TDP definition we are talking about.

    Yeah... turns out that when we turn to the real world, even the simplest physical concepts (like heat and power) require substantial nitpicking and attention to detail. And neither AMD nor Intel found it necessary to rise to that level of rigor when it comes to the publicly communicated definition of "TDP". Much of the confusion around "TDP" stems from the fact that AMD's and Intel's TDP definitions themselves are not particularly well defined.

    Honestly it's a relief that finally someone has caught up with this (@Fox2232 in this case), and that I am not alone in trying to explain that, to a first approximation and for 99% of our discussion needs, TDP = power consumption, and that this is the only way to make the term useful and relevant.
    The reason being, again, that if you try to be technical and precise, you'll turn out to be too clever for your own good: there are multiple definitions to begin with, both bad, and you'll end up with a useless physical term.
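    To see why AMD's "TDP" is a thermal figure rather than an electrical one: AMD has publicly described TDP as a function of temperatures and heatsink thermal resistance, not of measured power draw. A minimal sketch of that relationship (the specific numbers below are illustrative, not official per-SKU values):

    ```python
    def amd_tdp_watts(t_case_max_c, t_ambient_c, theta_ca_c_per_w):
        """AMD's published TDP relationship: the heat (in watts) a cooler
        with thermal resistance theta_ca must move to hold the case at
        t_case_max given an ambient of t_ambient. This is a thermal spec,
        not a direct measurement of electrical power consumption."""
        return (t_case_max_c - t_ambient_c) / theta_ca_c_per_w

    # Illustrative inputs only (hypothetical, not official figures):
    print(round(amd_tdp_watts(61.8, 42.0, 0.305), 1))  # ~64.9 -> a "65 W" part
    ```

    Which is exactly why the number can drift away from electrical power draw: change the assumed cooler or ambient, and the same silicon gets a different "TDP".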

    If you want to know more, this is a good article, albeit with some glaring mistakes :eek:

    PS
    Yes, Intel is worse when it comes to TDP and its usability, because they are talking about base clocks (pffftt), and AMD's TDP is more representative of real-world power. From a technical standpoint, though, I'd argue that AMD's definition is even worse than Intel's.
     
    Last edited: Jun 24, 2019
    Fergutor likes this.
  2. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,607
    Likes Received:
    461
    GPU:
    RTX 2070 Strix
    My 32'' FreeSync HP is fully prepared to be replaced with a 144Hz (FreeSync/G-Sync) one.

    Next time I see TDP .NE. Power Consumption or god forbid "TDP has nothing to do with power"... it's a goner. Pow! Straight to the Moon :mad:
     
    Loophole35 likes this.
  3. Dazz

    Dazz Master Guru

    Messages:
    795
    Likes Received:
    68
    GPU:
    ASUS STRIX RTX 2080
    The first review is out, for the Ryzen 5 3600. It's in Spanish, but pictures speak louder than words. The 3600 is giving the 2700X and 8700K a good kicking in multi-threaded: https://elchapuzasinformatico.com/2019/06/amd-ryzen-5-3600-x470-review/

    Memory bandwidth looks really strange compared to all the other CPUs, especially the write speed, and memory latency is worse than all the others. Maybe a BIOS bug on the X470?

    Gaming tests were conducted at 1080p with an RTX 2080 Ti.
     
    Last edited: Jun 24, 2019
    ZXRaziel, Fergutor and H83 like this.
  4. H83

    H83 Ancient Guru

    Messages:
    2,663
    Likes Received:
    358
    GPU:
    MSI Duke GTX1080Ti
    Just read the review. The positive points are the IPC increase (around 5% to 10% behind the Intel 9900K) and the multi-threaded performance, almost as good as the 2700X despite having two fewer cores. In gaming it is also better than the 2700X and slower than the 9900K by about 10% on average.

    The negative points seem to be the latency and the very low memory speeds (must be some sort of bug, like Dazz wrote), and the power consumption seems unimproved despite the 7nm node.

    Conclusion: the 3600 seems to be a great chip for a value-oriented gaming rig!

    I hope all this is correct, because my Spanish is not very good.
     

  5. Astyanax

    Astyanax Ancient Guru

    Messages:
    2,100
    Likes Received:
    498
    GPU:
    GTX 1080ti
    memory latency is where it should be.
     
  6. Aura89

    Aura89 Ancient Guru

    Messages:
    7,484
    Likes Received:
    793
    GPU:
    -
    This is what I read as well, and yet in their own article they show that the 3600, compared to a 2600, uses 10 fewer watts at a higher frequency, with better performance. So I have no clue how they came to the conclusion that the 3600's power consumption isn't better, when it uses less wattage at a higher frequency and performs better.
     
    ZXRaziel and Alessio1989 like this.
  7. H83

    H83 Ancient Guru

    Messages:
    2,663
    Likes Received:
    358
    GPU:
    MSI Duke GTX1080Ti
    I also thought about that, but maybe they expected even better figures... After this small review I'm very curious to see how the 3700 performs.
     
  8. Alessio1989

    Alessio1989 Maha Guru

    Messages:
    1,299
    Likes Received:
    196
    GPU:
    .
    Meanwhile AGESA ComboPI 1.0.0.3 is being tested by OEMs... In less than one month there have been 3-4 revisions of the ComboPI 1.0.0.x.y AGESA... Hopefully ASUS will start deploying BIOS updates only with the latest one...
     
    Fox2232 likes this.
  9. Fergutor

    Fergutor Active Member

    Messages:
    59
    Likes Received:
    12
    GPU:
    Asus GTX 970 Strix
    Interesting, I will read that link. Hey, and what about those reviews that measure the power consumption from the "socket itself" (or some other more exact point) in order to measure the CPU power consumption only? (Can't remember who does that.) And the ones that measure the SoC power in software, with something like HWiNFO? Do you think those are trustworthy for seeing the real power consumption of a CPU?
     
  10. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,380
    Likes Received:
    2,025
    GPU:
    -NDA +AW@240Hz
    That's technically close to impossible. One could desolder all the VRMs and put an ammeter between them and the CPU, but that may introduce all kinds of other issues.
    And I am yet to see someone doing that. Please go and find it; I am sure many here would like to see it.
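    For what it's worth, the usual non-destructive approximation is to measure current on the EPS12V connector feeding the CPU VRMs, with the caveat that the reading includes VRM conversion losses, so the package itself draws somewhat less. A rough sketch of the arithmetic (the 90% efficiency figure is an assumption for illustration, not a measured value):

    ```python
    def cpu_power_from_eps(eps_volts, eps_amps, vrm_efficiency=0.90):
        """Estimate CPU package power from an EPS12V rail measurement.
        The wall-side reading includes VRM losses, so multiply by an
        assumed conversion efficiency to approximate what the CPU gets."""
        return eps_volts * eps_amps * vrm_efficiency

    # e.g. 8 A measured on a 12 V EPS rail:
    print(cpu_power_from_eps(12.0, 8.0))  # ~86.4 W reaching the CPU
    ```

    Software sensors like HWiNFO instead report the CPU's own telemetry, which is an estimate by the chip rather than an external measurement, so the two approaches can legitimately disagree.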
     

  11. Dazz

    Dazz Master Guru

    Messages:
    795
    Likes Received:
    68
    GPU:
    ASUS STRIX RTX 2080
    Yeah, 5-10% behind the 9900K, but the Intel chip does have a 19% frequency advantage over the 3600, which is the lowest-end SKU. The likes of the 3800X and 3900X have around 7.5-10% higher clocks, which should bring them to around the same single-threaded performance as the 9900K, with the 3950X overtaking it.

    The 3600 does seem to compete like-for-like with the 8700K very well for nearly half the price (more than half, really, since the 8700K doesn't come with a cooler). Can't go wrong with that: a $200 "8700K" for all. The deciding factor may be overclocking, however, as we know nothing about Ryzen 3000 overclocking, but AMD say it will attempt to boost as much as it can on all cores within TDP, and motherboard makers are going to have a TDP override feature so it can go out of spec.
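    The clock arithmetic behind that 19% figure is easy to sanity-check (the boost clocks below are approximate and IPC is assumed equal, purely for illustration):

    ```python
    # Approximate single-core boost clocks in GHz; IPC assumed equal
    # for this sketch, so perf scales with clock alone.
    clock_9900k = 5.0
    clock_3600 = 4.2

    advantage_pct = (clock_9900k / clock_3600 - 1) * 100
    print(round(advantage_pct))  # ~19
    ```

    By the same arithmetic, an SKU clocked 7.5-10% above the 3600 closes most, but not all, of that gap on clocks alone; the rest has to come from IPC.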


    Check again: it's 80ns, which is worse than first-gen Ryzen at 75ns and Zen+ at 66ns, and the memory write speed looks about half of what it should be, down from 47GB/s to 25GB/s. Games require lots of small reads and writes, so this can hamper single-threaded game performance. I am sure it's just a bug in the BIOS, since there is no final BIOS revision for any of the motherboards out there... yet, at least.

    This may be why single-threaded performance in synthetic benchmarks is up 15-17%: the data can be preloaded into the cache and run, while games have to keep fetching to and from system memory as the L3 cache fills up. That's why low-latency memory shows better gains in games than raw bandwidth, regardless of platform, AMD or Intel.
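    That "lots of small dependent reads" pattern is exactly what memory-latency microbenchmarks model with a pointer chase: each load depends on the previous one, so the loop runs at memory latency rather than bandwidth. A toy sketch of the access pattern (Python overhead dominates the timing here; real tools do this in tight native code):

    ```python
    import random
    import time

    def pointer_chase(n):
        """Follow a randomly shuffled index chain. Each lookup depends on
        the previous result, so the loop is latency-bound: the hardware
        prefetcher can't hide the cost of the next access."""
        chain = list(range(n))
        random.shuffle(chain)
        idx = 0
        start = time.perf_counter()
        for _ in range(n):
            idx = chain[idx]  # dependent load: must wait for the last one
        return time.perf_counter() - start

    elapsed = pointer_chase(100_000)
    print(f"{elapsed:.4f}s for 100k dependent lookups")
    ```

    Contrast with a sequential scan over the same data, which the prefetcher streams at full bandwidth; that is the difference between the latency and bandwidth numbers in these reviews.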
     
    Last edited: Jun 25, 2019
    Fox2232 likes this.
  12. Astyanax

    Astyanax Ancient Guru

    Messages:
    2,100
    Likes Received:
    498
    GPU:
    GTX 1080ti

    And you're missing the architecture differences.

    Pay attention.

    memory latency is where it should be.
     
  13. Aura89

    Aura89 Ancient Guru

    Messages:
    7,484
    Likes Received:
    793
    GPU:
    -
    Instead of just stating that, why not explain yourself? Or show where it is explained?
     
  14. Astyanax

    Astyanax Ancient Guru

    Messages:
    2,100
    Likes Received:
    498
    GPU:
    GTX 1080ti
    It should be obvious.

    Memory access goes through the I/O die, so memory latency tests are going to include the CPU-core-to-I/O-die latency as well.

    Also
    [image: AMD chart of expected memory latency at various DDR4 speeds]
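    If that explanation is right, the observed number should decompose roughly into a fabric hop plus the DRAM-side latency. Taking AMD's ~69ns chart figure for DDR4-3200 as the DRAM side, the implied hop is just the arithmetic residual (the hop value below is hypothetical, not a measured figure):

    ```python
    # AMD's chart figure for DDR4-3200 (ns):
    controller_and_dram_ns = 69.0
    # Hypothetical extra CCD -> I/O-die hop implied by the review's reading:
    extra_hop_ns = 11.5

    print(controller_and_dram_ns + extra_hop_ns)  # 80.5, the review's number
    ```

    Whether that residual is really fabric latency or just a misconfigured test is the open question in the posts that follow.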
     
  15. Aura89

    Aura89 Ancient Guru

    Messages:
    7,484
    Likes Received:
    793
    GPU:
    -
    Given that chart, unless I'm reading it wrong, there's something not right about the latency in the article.

    The article states 3200 memory, which according to AMD should be around 69ns, yet the article shows 80.5ns.

    Now either A: AMD lied

    Or B: Something wasn't right with the test

    Yes?
     

  16. Fergutor

    Fergutor Active Member

    Messages:
    59
    Likes Received:
    12
    GPU:
    Asus GTX 970 Strix
    Yeah, I figured it might be very hard, but I'm almost sure I read reviews that did that... can't find them... So maybe I misread, but I'm almost sure. The link Noisiv provided talks about doing that (I don't know with what degree of certainty, knowledge, or experience).
     
  17. Fergutor

    Fergutor Active Member

    Messages:
    59
    Likes Received:
    12
    GPU:
    Asus GTX 970 Strix
  18. Kool64

    Kool64 Master Guru

    Messages:
    206
    Likes Received:
    76
    GPU:
    Gigabyte GTX 1070
    Far Cry 5 being the anomaly, it's pretty darn close. The memory read/write speeds and latency are still quite strange.
     
  19. Astyanax

    Astyanax Ancient Guru

    Messages:
    2,100
    Likes Received:
    498
    GPU:
    GTX 1080ti
    Using 3200MHz memory doesn't mean it's running at 3200MHz.
     
  20. Aura89

    Aura89 Ancient Guru

    Messages:
    7,484
    Likes Received:
    793
    GPU:
    -
    Hence: there's something wrong with the article.

    It doesn't state a speed other than: 3200.

    The only thing that can be assumed is that the RAM is at: 3200.

    The entire point that myself and others are trying to make to you is that, given the article and the only information we have, the latency doesn't make sense: AMD's own chart says one thing, and the article's stated 3200MHz RAM (again, the only speed it states) implies another.

    My only question to you now is this: given the chart AMD gave us, does the article's reported latency with its stated 3200MHz RAM still make sense to you?
     
