AMD Ryzen 5 3600 CPU Benchmarks Surface

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 21, 2019.

  1. Astyanax

    Astyanax Ancient Guru

    Messages:
    4,753
    Likes Received:
    1,405
    GPU:
    GTX 1080ti
    Memory latency is where it should be.
     
  2. Aura89

    Aura89 Ancient Guru

    Messages:
    7,748
    Likes Received:
    974
    GPU:
    -
    This is what I read as well, and yet in their own article they show that the 3600 uses 10 W less than a 2600, at a higher frequency, with better performance. So I have no clue how they came to the conclusion that the 3600's power consumption isn't better... when it uses less power, at a higher frequency, and performs better.
     
    ZXRaziel and Alessio1989 like this.
  3. H83

    H83 Ancient Guru

    Messages:
    2,834
    Likes Received:
    447
    GPU:
    MSI Duke GTX1080Ti
    I also thought about that, but maybe they expected even better figures... After this small review I'm very curious to see how the 3700 performs.
     
  4. Alessio1989

    Alessio1989 Maha Guru

    Messages:
    1,483
    Likes Received:
    261
    GPU:
    .
    Meanwhile, AGESA ComboPI 1.0.0.3 is being tested by OEMs... In less than one month there have been 3-4 revisions of the ComboPI 1.0.0.x.y AGESA... Hopefully ASUS will start deploying BIOS updates only with the latest one...
     
    Fox2232 likes this.

  5. Fergutor

    Fergutor Active Member

    Messages:
    65
    Likes Received:
    13
    GPU:
    Asus GTX 970 Strix
    Interesting, I will read that link. Hey, and what about those reviews that measure power consumption from the socket itself (or some other more exact point) to isolate the CPU's power draw? (Can't remember who does that.) And the ones that measure SoC power with software like HWiNFO? Do you think those are trustworthy for seeing a CPU's real power consumption?
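For what it's worth, software tools like HWiNFO read the energy counters the CPU itself exposes; on Linux the equivalent counters surface through the powercap/RAPL sysfs interface. A minimal sketch of that approach (the sysfs path and wrap value here are assumptions that vary by platform and kernel, and whether these counters match wall-socket reality is exactly the question raised above):

```python
import time

# Assumed package-level energy counter path (Intel powercap naming; AMD
# exposure differs by kernel). Counts microjoules and wraps around.
RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"
MAX_ENERGY_UJ = 2 ** 32  # assumed wrap point; the real one is in max_energy_range_uj

def power_watts(e0_uj, e1_uj, seconds, wrap=MAX_ENERGY_UJ):
    """Turn two energy readings (microjoules) into average watts,
    tolerating a single counter wraparound between samples."""
    delta = e1_uj - e0_uj
    if delta < 0:  # counter wrapped between the two reads
        delta += wrap
    return delta / 1e6 / seconds

def sample_package_power(interval=1.0):
    """Read the energy counter twice, `interval` seconds apart."""
    with open(RAPL_ENERGY) as f:
        e0 = int(f.read())
    time.sleep(interval)
    with open(RAPL_ENERGY) as f:
        e1 = int(f.read())
    return power_watts(e0, e1, interval)

# usage (on a machine that actually exposes the counter):
#   print(f"package power: {sample_package_power():.1f} W")
```

The catch, as the thread notes, is that this reports what the CPU's own telemetry claims, not what the VRMs actually deliver, so it is an estimate rather than a ground-truth measurement.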
     
  6. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,939
    Likes Received:
    2,292
    GPU:
    5700XT+AW@240Hz
    That's technically close to impossible. One could desolder all the VRMs and put an ammeter between them and the CPU, but that could introduce all kinds of other issues.
    I have yet to see anyone do it. Please go and find that; I'm sure many here would like to see it.
     
  7. Dazz

    Dazz Master Guru

    Messages:
    866
    Likes Received:
    89
    GPU:
    ASUS STRIX RTX 2080
    Yeah, 5-10% behind the 9900K, but the Intel chip has a 19% frequency advantage over the 3600, which is the lowest-end SKU; the likes of the 3800X and 3900X clock around 7.5-10% higher, which should bring them to roughly the same single-threaded performance as the 9900K, with the 3950X overtaking it.

    The 3600 does seem to compete like-for-like with the 8700K very well for nearly half the price (well, more than half, since the 8700K doesn't come with a cooler). Can't go wrong with that, really: a $200 8700K for everyone. What may be the deciding factor is overclocking, as we know nothing about Ryzen 3000 overclocking yet, but AMD says it will attempt to boost as much as it can on all cores within the TDP, and motherboard makers are going to include a TDP override feature so it can go out of spec.


    Check again: it's 80 ns, which is worse than first-gen Ryzen at 75 ns and Zen+ at 66 ns, and the memory write speed looks about half of what it should be, down from 47 GB/s to 25 GB/s. Games require lots of small reads and writes, so this can hamper single-threaded game performance. I'm sure it's just a BIOS bug, since there is no final BIOS revision for any of the motherboards out there... yet, at least.

    This may be why single-threaded performance in synthetic benchmarks is 15-17% ahead: the data can be preloaded into the cache and run, while games have to keep fetching to and from system memory as the L3 cache fills up. That's why low-latency memory shows better gains in games than raw bandwidth, regardless of platform, AMD or Intel.
     
    Last edited: Jun 25, 2019
    Fox2232 likes this.
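The "synthetics preload the cache, games keep missing" argument above is essentially what a pointer-chasing latency test measures: a dependent chain of loads where the next address is only known after the previous load completes, so the latency cannot be hidden. A rough sketch of the technique (Python interpreter overhead dwarfs the real cache effect, so this illustrates the method rather than producing clean nanosecond numbers; tools like AIDA64 do the same thing in a tight native loop):

```python
import random
import time

def make_chain(n):
    """Build a random single-cycle permutation: chain[i] is the next index.
    Random order defeats the hardware prefetcher, so each hop pays full latency."""
    order = list(range(1, n))
    random.shuffle(order)
    chain = [0] * n
    prev = 0
    for idx in order:
        chain[prev] = idx
        prev = idx
    chain[prev] = 0  # close the cycle back to the start
    return chain

def chase(chain, hops):
    """Walk the chain; each step depends on the previous load."""
    i = 0
    t0 = time.perf_counter()
    for _ in range(hops):
        i = chain[i]
    return (time.perf_counter() - t0) / hops * 1e9  # ns per hop

# usage sketch: a small working set stays cache-resident, a large one is DRAM-bound
#   chase(make_chain(4_096), 1_000_000)       # fits in L1/L2
#   chase(make_chain(16_000_000), 1_000_000)  # spills far past L3
```

The contrast between the small and large working sets is the gap the thread is arguing about: synthetic single-threaded tests mostly live in the first regime, games spend more time in the second.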
  8. Astyanax

    Astyanax Ancient Guru

    Messages:
    4,753
    Likes Received:
    1,405
    GPU:
    GTX 1080ti

    And you're missing the architecture differences.

    Pay attention.

    Memory latency is where it should be.
     
  9. Aura89

    Aura89 Ancient Guru

    Messages:
    7,748
    Likes Received:
    974
    GPU:
    -
    Instead of just stating that, why not explain yourself? Or show where it is explained?
     
  10. Astyanax

    Astyanax Ancient Guru

    Messages:
    4,753
    Likes Received:
    1,405
    GPU:
    GTX 1080ti
    It should be obvious.

    Memory access is on the I/O die, so memory latency tests are going to include the core-to-I/O-die latency as well.

    Also:
    [IMG: AMD chart of memory latency vs. memory speed]
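The chiplet point above amounts to a simple additive model: the measured load-to-use latency is the DRAM access itself plus the Infinity Fabric round trip from the core chiplet to the I/O die. A toy breakdown of that arithmetic (every number here is an illustrative assumption, not a measurement):

```python
def total_latency_ns(core_to_iod=12.0, controller=10.0, dram=47.0):
    """Sum the stops a load makes on a chiplet CPU.
    core_to_iod: Infinity Fabric hop, core chiplet -> I/O die and back (assumed)
    controller:  memory-controller queuing on the I/O die (assumed)
    dram:        the DRAM access itself, row hit/miss mix (assumed)"""
    return core_to_iod + controller + dram

# A monolithic die skips the fabric hop, so the same DRAM lands lower:
#   total_latency_ns()                 -> chiplet-style reading
#   total_latency_ns(core_to_iod=0.0)  -> monolithic-style reading
```

The model also shows why faster fabric/memory clocks help latency disproportionately on this design: they shrink the extra hop, not just the DRAM term.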
     

  11. Aura89

    Aura89 Ancient Guru

    Messages:
    7,748
    Likes Received:
    974
    GPU:
    -
    Given that chart, unless I'm reading it wrong, there's something not right about the latency in the article.

    The article states 3200 memory, which according to AMD should be around 69 ns, yet the article shows 80.5.

    Now either A: AMD lied

    Or B: Something wasn't right with the test

    Yes?
     
  12. Fergutor

    Fergutor Active Member

    Messages:
    65
    Likes Received:
    13
    GPU:
    Asus GTX 970 Strix
    Yeah, I figured it might be very hard, but I'm almost sure I read reviews that did that... can't find them... So maybe I misread, but I'm almost sure. The link Noisiv provided talks about doing that (I don't know with what degree of certainty, knowledge, or experience).
     
  13. Fergutor

    Fergutor Active Member

    Messages:
    65
    Likes Received:
    13
    GPU:
    Asus GTX 970 Strix
  14. Kool64

    Kool64 Master Guru

    Messages:
    516
    Likes Received:
    183
    GPU:
    Gigabyte GTX 1070
    Far Cry 5 being the anomaly, it's pretty darn close. The memory read/write and latency figures are still quite strange.
     
  15. Astyanax

    Astyanax Ancient Guru

    Messages:
    4,753
    Likes Received:
    1,405
    GPU:
    GTX 1080ti
    Using 3200 MHz memory doesn't mean it's running at 3200 MHz.
     

  16. Aura89

    Aura89 Ancient Guru

    Messages:
    7,748
    Likes Received:
    974
    GPU:
    -
    Hence: there's something wrong with the article.

    It doesn't state any speed other than: 3200.

    The only thing that can be assumed is that the RAM is at: 3200.

    The entire point that I and others are trying to make to you is that, given the article and the only information we have, the latency doesn't make sense, given AMD's own chart and the article's stated 3200 MHz RAM, which is, again, the only speed it states.

    My only question to you now is this: given AMD's chart, does the article's stated 3200 MHz RAM, together with its reported latency, still make sense to you?
     
  17. Astyanax

    Astyanax Ancient Guru

    Messages:
    4,753
    Likes Received:
    1,405
    GPU:
    GTX 1080ti
    I don't assume the RAM is at 3200 MHz; it's clearly at 2667.
     
  18. H83

    H83 Ancient Guru

    Messages:
    2,834
    Likes Received:
    447
    GPU:
    MSI Duke GTX1080Ti
    Minimal differences between the boards. The only exception is the power figures, which are even lower on the X570 board. The memory performance remains very strange.
     
  19. dorsai

    dorsai New Member

    Messages:
    4
    Likes Received:
    2
    GPU:
    Vega 56
    Wow, if these results were on 2667/3200 memory, I can't wait to see the results with 3733, which is the new sweet spot for the 3000 series... exciting times. The fact that the entry-level part beats a 2700X in everything but multi-core work is incredible.
     
  20. Aura89

    Aura89 Ancient Guru

    Messages:
    7,748
    Likes Received:
    974
    GPU:
    -
    And yet the score is better than with 2667 memory, and on the X570 platform it gets a better latency reading, but still not what it should be; according to that chart it's clearly at neither 3200 nor 2667.

    And you do assume, too, since you assume it's 2667 memory when the article states 3200.

    To assume is to say the memory is anything other than 3200, given that 3200 is the only speed stated anywhere in the article.

    I feel like you see what the issue is, you just refuse to admit that others are right that the latency, among other memory-related figures, seems odd. You never said it "seemed right because the memory is clearly 2667, not 3200"; you simply stated it was right, while everyone else is reading a review that states 3200.
     
