Listed in e-tail: AMD Ryzen 9 3900XT, Ryzen 7 3800XT, and Ryzen 5 3600XT

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 1, 2020.

  1. umeng2002

    umeng2002 Maha Guru

    Messages:
    1,425
    Likes Received:
    331
    GPU:
    4080 Super
    Those base clock bumps are what is going to make all the difference.
     
  2. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,690
    Likes Received:
    960
    GPU:
    GTX 1070
    It's simply going to be process improvements allowing them to bin higher and still get good percentages of chips that meet the higher standard. We'll likely see little to no TDP change in real usage if it's truly better bins running at the same vcore. I don't expect these to have issues running at their stated speeds, but I'll certainly wait for all the reviews. As a gen 1 Ryzen owner I may think about these chips, especially if Ryzen 4000 really does end up not getting X370 support (I still have a bit of hope motherboard vendors will do it for a few of the higher-end boards while dropping old CPU support, like they did for the 3000 series, which isn't supported on X370 either).
     
    Last edited: Jun 1, 2020
  3. Venix

    Venix Ancient Guru

    Messages:
    3,440
    Likes Received:
    1,944
    GPU:
    Rtx 4070 super
    Is it though? It sells for about the same as the 3600. With a Bxx motherboard (no OC) and some 2666 RAM, they trade blows in gaming and productivity and end up costing you about the same... the 10400 is not a bad deal... I would take the 3600 personally, but there's nothing wrong with the 10400.
     
  4. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,975
    Likes Received:
    4,342
    GPU:
    Asrock 7700XT
    Based on what, the internal power sensor? Because that was a gimmick of a feature. It's basically just a pre-set range based on CPU usage. My old FX-6300 could be overclocked to 4.6GHz and under full load it still said 95W, which was very obviously not true. So, if you actually overclocked that CPU, you very likely did get up to 135W, but it just didn't report that.
    I'm not sure how the Ryzen watt meter works.
    My board actually doesn't have an option to control the wattage, not to my recollection anyway. But like I said, this is more a matter of whether you're actually trusting the built-in sensor. To my recollection, tests with external watt meters tell a very different story. Like I said, it's not anywhere near as bad as Intel, but it's still inaccurate.

    That may be the case, but I don't think there was a binning problem with the original CPUs that couldn't reach their advertised speeds. So that's kinda my point: the motherboards and AGESA seemed to be the real problem here, which makes me wonder how these higher clocks can be achieved.
    But yeah, if my X370 board doesn't support the 4000 series, I'll most likely get the 3700X (I can't get a 3800X or XT because of power limitations).
     

  5. Ricardo

    Ricardo Member Guru

    Messages:
    165
    Likes Received:
    113
    GPU:
    1050Ti 4GB
    Yes, I know how to check the effective clocks of each core. It's pointless to talk about my experience alone, but for curiosity's sake, I constantly reach 4350 on one or two cores with my 3700x using the stock cooler. If I actually close everything and just run a synthetic single-core benchmark such as the CPU-Z one, it'll go up to 4375 and sometimes reach 4400. Note that I take these readings from the chart in Afterburner, so I can see the spikes between reads over time.

    I'm afraid you're going to have to back that info with some statistics, as your experience alone can't be considered the broad truth and I haven't seen any reports claiming such dramatic discrepancy.
     
  6. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    HWiNFO64 reads it from VRMs on MB.

    There are no "external watt meter" measurements of CPU power draw, because measuring current requires the meter to be in-line. You can measure current on the cables that go to the motherboard, but that includes the energy lost in the VRMs and everything else powered through those cables.
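
    To put a number on that point, here is a minimal sketch (hypothetical helper and made-up figures, not tied to any real tool): an in-line reading on the EPS cables includes VRM conversion losses, so the power actually delivered to the CPU package is lower than what the cable-side meter shows.

    ```python
    # Hypothetical numbers: estimate CPU package power from an in-line
    # (cable-side) measurement, which includes VRM conversion losses.

    def estimate_package_power(cable_watts: float, vrm_efficiency: float) -> float:
        """Subtract VRM losses from a cable-side power reading."""
        if not 0.0 < vrm_efficiency <= 1.0:
            raise ValueError("efficiency must be in (0, 1]")
        return cable_watts * vrm_efficiency

    # e.g. 100 W measured at the cables with ~90% efficient VRMs leaves
    # roughly 90 W actually reaching the CPU package.
    print(estimate_package_power(100.0, 0.90))  # 90.0
    ```

    Real VRM efficiency varies with load and temperature, so a single fixed percentage is only a rough illustration of why cable-side and package power never match.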

    And no, that APU refused to stay overclocked under load. When there was iGPU load, the CPU clock went down even more than before, since the OC settings used a higher voltage.
     
  7. user1

    user1 Ancient Guru

    Messages:
    2,746
    Likes Received:
    1,279
    GPU:
    Mi25/IGP
    I'm curious to see whether they improved the I/O die or not; faster fabric speeds would shore up Zen 2's weak points and round it out.
     
  8. Shaxuul

    Shaxuul Active Member

    Messages:
    71
    Likes Received:
    37
    GPU:
    EVGA GTX 1070 SC
    Curious to see how the Ryzen 9 3900 XT stacks up against the 3950X in games, and also its overclocking potential!
     
  9. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,780
    Likes Received:
    1,393
    GPU:
    黃仁勳 stole my 4090
    I don't have to back up crap. If you don't want to spend the 10 seconds to look it up, that's on you. What's with this common "prove it bruh!" when it's common knowledge and search engines exist? You should already know; it became common knowledge on every tech site over the last year. Tell me, does that 4350 or 4400 EVER hold? Or is it a momentary blip and gone? And what's the actual clock? Does that still sound like reality to you?

    But I'll bite this time, Derbauer collected a crap ton of samples for example, and remember all these collected numbers are the maximum recorded numbers; these users were getting 100-200MHz below that in reality -

    Doesn't matter that on paper that reported clock of 4.5GHz is recorded, when no one is getting that clock to hold in any actual program. Even if it did, which it never has and doesn't to this day even on top X570 boards with insane cooling, 722 samples for the 3900X resulted in an average of 4375MHz as the absolute pseudo max. AMD's marketing straight up lied.

    An average pseudo max of 4375MHz, clocks realistically averaging more around 4.2GHz... so the actual clocks are what? 4-4.1GHz tops? That sounds like 500-600MHz to me, not 100. AMD's. Marketing. Straight. Up. Lied.
     
    Last edited: Jun 2, 2020
  10. BReal85

    BReal85 Master Guru

    Messages:
    487
    Likes Received:
    180
    GPU:
    Sapph RX 570 4G ITX
    Haha, rage at its finest. :D :D
     

  11. mgilbert

    mgilbert Member

    Messages:
    46
    Likes Received:
    7
    GPU:
    16 GB DDR3
    What AMD really needs to report is the sustained all-core boost speed. Their CPUs idle at the base clock speed, but under any load at all the speed goes up. And you'll only see the maximum boost speed very briefly, and only on a core or two. So neither of those numbers means very much.

    For example, my 3700x idles at or below the advertised 3.6 GHz, and on single core loads, I'll see the advertised 4.4 GHz very briefly on one or two cores. However, put a real world load on the CPU, and it will sustain 4.0 to 4.2 GHz indefinitely, on all cores, depending on the nature of the workload.
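
    The distinction being drawn here, a brief peak on one core versus a clock the chip actually holds, is easy to make concrete if you log clock samples over time. A minimal sketch (hypothetical helper, not from any real monitoring tool): report both the highest reading seen and the highest clock that was held for some number of consecutive samples.

    ```python
    # Given per-interval clock samples in MHz, report the peak reading and the
    # highest clock that was actually *held* for `hold` consecutive samples.

    def peak_and_sustained(samples, hold=5):
        """Return (peak, sustained): peak is the max sample, sustained is the
        highest value appearing in at least `hold` consecutive samples."""
        peak = max(samples)
        sustained = min(samples)
        run_value, run_len = None, 0
        for s in samples:
            if s == run_value:
                run_len += 1
            else:
                run_value, run_len = s, 1
            if run_len >= hold and s > sustained:
                sustained = s
        return peak, sustained

    # A brief 4400 MHz blip in the middle of a steady 4200 MHz load:
    log = [4200] * 10 + [4400] + [4200] * 10
    print(peak_and_sustained(log))  # (4400, 4200)
    ```

    Real frequency logs jitter from sample to sample, so in practice you'd bucket readings (say, to the nearest 25 MHz) rather than require exact repeats; the point stands either way: the "max recorded" number and the clock the CPU sustains are two different figures.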
     
    Last edited: Jun 2, 2020
  12. Exodite

    Exodite Guest

    Messages:
    2,087
    Likes Received:
    276
    GPU:
    Sapphire Vega 56
    They can't do that though, and you alluded to why in your own post.

    It varies wildly with external circumstances.

    Testing has shown that frequency scales with temperature well down into extreme-cooling territory. It should also scale with power delivery, though for most enthusiasts one would imagine that's less of an issue.

    AMD can't provide sustained all-core boost numbers the same way they can't tell you exactly what clocks their GPUs run at in a particular game. There's just too many variables.

    Keeping in mind that some people were losing their minds over 25MHz on the peak clocks, I doubt a 300-500MHz span would satisfy many potential customers. It's just not particularly relevant information.
     
  13. asturur

    asturur Maha Guru

    Messages:
    1,371
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    Does the GHz number on the box matter so much? No, it doesn't.
    An ARM CPU advertised at 5GHz and an x86 Intel CPU at 5GHz would both have 5GHz on the box, but perform completely differently on different workloads.
    What is a single-core boost? 1 core, 1 thread? 1 core, 2 threads? How can you tell how many threads are running on your CPU? Simply put, you can't. There will be hundreds of threads scheduled by your OS.
    Also, which OS? Windows 10? Linux? macOS? A custom OS for testing at AMD?

    There are so many variables in these modern CPUs that attaching yourself to the number on the box is wrong.
    Back in the AthlonXP/Duron days, if you bought a CPU for its MHz you would probably go wrong, since the AthlonXP was faster than a PIII with more MHz.
    The only meaningful information that clock gives you is relative performance between CPUs of the same generation and brand.

    Also, the `sustained load` core frequency is misleading too: what kind of load? Code that exercises every area of the CPU each clock will hit different current requirements than a different kind of benchmark.
     
  14. Ricardo

    Ricardo Member Guru

    Messages:
    165
    Likes Received:
    113
    GPU:
    1050Ti 4GB
    First of all, this is a discussion forum. If you don't like discussing, feel free to not post. But when you do, people will definitely engage, and you should be prepared for that, including proving what you're saying.

    As for "common knowledge" and "search engines", yes, I know about them. But you should also use them and figure out that much of what was assessed changed after BIOS updates, like I said.

    I just said it does, but it doesn't really matter because mine is just a single result.

    I already knew about this video, which was posted before AGESA 1.0.0.3 ABBA. Here's the follow-up, by the same Derbauer, showing exactly what I was saying: boosts really don't reach advertised speeds, but the shortfall is nowhere near 500~600MHz (it never was, really); it's more like 100~150MHz on a 3900x.

    While the behavior isn't perfectly consistent, the spikes also aren't a "momentary blip and gone" - they adapt to the load, which is fine as long as the equivalent performance is there. CPUs aren't made to look good on graphs; they are made to perform calculations as fast as possible.

    Read my first reply to you, I literally agreed and said the same thing.
     
  15. Silva

    Silva Ancient Guru

    Messages:
    2,048
    Likes Received:
    1,196
    GPU:
    Asus Dual RX580 O4G
    Pre-order prices are always inflated, no way they're going to sell for that much.
    Here the 3600 is 182€ and the 3600X 208€; no way someone is going to overpay +100€ for a couple extra MHz when you could get the 3700X for 316€!
    I'm not really excited by these though, I wanna see what AMD did with Zen 3.
     

  16. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,780
    Likes Received:
    1,393
    GPU:
    黃仁勳 stole my 4090
    When people question why you're taking the time to dubiously question them instead of doing the faster 3 second search for something that's been all over tech news for months, you should be prepared for that. This is the internet, if you don't like using the internet, feel free to not use the internet.
    1- Getting another few MHz doesn't bring that 4375MHz average within 100 or 150MHz of the advertised rates, but that's irrelevant either way, considering: A) That's still a nanosecond spike and not actual performance. B) That's still not the real clock. Still will be 500MHz+ away from reality.

    2- Stop saying nonsense like "it's adapting to the load", understating with things like "isn't perfectly consistent", or other nonsense like you're a PR rep for AMD.

    It can't hold the clocks. It's not some efficiency adaptation to only hit that when necessary, it just can't perform that high, period. In what reality is that equivalent performance or "fast as possible"? Want to see what X amount of GHz across Y amount of cores looks like? Set it manually and see just how much faster it is than what you're trying to pass off as "equivalent". When this thing shows 4.3GHz across all cores, it's a world slower than manually setting it to 4.3GHz. Why? Because it's not 4.3GHz when it says it is at stock; those are pseudo clocks, not reality. AND THOSE ARE CLOCKS THAT HOLD, NOT YOUR BORDERLINE IMAGINARY 3 NANOSECOND SPIKE THAT DOES NOTHING. It's so much faster manually OC'd because it's actually 4.3GHz then, not "4.3GHz" being more like 3.8-4GHz.

    Intel, you know, the notorious scumbags known for being as slimy and deceiving as possible? Even they deliver the speeds they promise. Or at least used to, I don't know what nonsense they're up to since they saw AMD lie harder than they do in that field. AMD didn't even need to lie about this; if anything it's more impressive that their CPUs perform so well at such low clocks.

    From the start everyone was used to actually getting the speed written on the box, everyone was taken by surprise by AMD's marketing lies. No one expected some weasel-word mental gymnastics like what you're defending like it's not that bad. That's the problem and you know it.

    There's going to be a class action lawsuit against AMD if there already isn't one filed. Just like when they lied and called their garbage Bulldozer failure's crap "cores".
     
    Last edited: Jun 3, 2020
  17. Ricardo

    Ricardo Member Guru

    Messages:
    165
    Likes Received:
    113
    GPU:
    1050Ti 4GB
    I am, hence me posting links as needed. You're the one complaining here, not me. :rolleyes:

    Okay, so why do all the reviews say that OC'ing Ryzen frequency manually either doesn't give much benefit or actually performs worse? Did you figure something out that all the professional reviewers didn't? If so, please show it to us.

    Here's AnandTech testing a 3900x overclocked to 4.3GHz - worse single-core performance than stock, so stock is performing above a "sustained" 4.3GHz, which means either higher clocks or simply more efficient clocks. Either way, better raw performance.

    And here's a direct quote from our boss about the 3700x:
    "As you can observe, the tweaking results are all very modest. Imho, you are better off with these processors at default, as one or two cores can Turbo higher, which is far more beneficial in games opposed to having all cores at ~4400 MHz. So in that respect, the positives do not outweigh the negatives."

    And those tests were done before the 1.0.0.3 ABBA AGESA, which further improved stock behavior.

    Note that I'm not talking about IF/memory overclocks/adjustments, but raw CPU frequency - there's no benefit to setting it manually versus letting it run with its "borderline imaginary 3 nanosecond spike". Single-core performance with the former is within margin of error, or worse.

    Again, I agree with you on this point. AMD did lie about clocks. All I'm saying is that it's not nearly as bad as you said.
     
  18. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,450
    Likes Received:
    2,545
    GPU:
    TUF 6800XT OC
    1.0.0.4 works fine.

    I'm seeing 4.375 to 4.4 on my 3700X holding quite fine in real single-threaded work, which is exactly how the product was advertised... 4.4
    The only times it drops below is when Windows decides it needs to do "stuff" in the background and uses other cores.

    There are no lies, just bugs/issues. Everybody has those. Do I need to remind anyone how many security issues Intel has?
     
  19. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,780
    Likes Received:
    1,393
    GPU:
    黃仁勳 stole my 4090
    I didn't see that, interesting. It's within margin of error (2 points), but even merely matching stock single-core performance would mean that in their tests the single-core boost was holding high enough for a real clock of at least 4.3GHz, or more likely variable clocks that average out to something like that. That's better than some people are getting and definitely not 500MHz off from reality. So yeah, not as bad as I was saying, if that's normal. Their multi-core OC scores seem low though.

    As for myself and anyone I know with a 3900X, and I'd bet the majority in Derbauer's large sample averaging 4375MHz peaks... 3ABBA and later didn't do crap. And that's on relatively expensive boards. Mine was an MSI X470 Gaming Pro Carbon, their 2nd most expensive board at the time with an OP power system. And now I'm using an Asus TUF X570-Plus Wi-Fi which has the same beefy power system as insanely overpriced boards and is running 1.0.0.4. Same crap result. It gets nowhere near 4.6GHz 99% of the time, especially if I'm using a normal power plan. And even if it did, so what? It'd still be a big fat turbo lie since it doesn't hold that long enough for any actual performance. It doesn't hold the max reported number people get for even 1 literal second, not 1 second.

    Imagine if Intel did that. Advertised 5GHz but it only hits that for long enough for a program to detect it, but isn't actually at 5GHz for even 1 second. The internet would meme them into oblivion. The 3900X is what it is, but that's definitely not a 4.6GHz chip and I seriously doubt this XT revision will ever actually hold a single core turbo of 4.8GHz. I'd be really surprised if it held 4.6GHz for even a few seconds at a time.
     
  20. Ricardo

    Ricardo Member Guru

    Messages:
    165
    Likes Received:
    113
    GPU:
    1050Ti 4GB
    It does, yeah, and it seems to be consistent across most 3700x processors. But 3900x's are not reaching their 4.65 clocks most of the time, or only reach them for very brief moments, which is sketchy at best. So they're technically kinda not lying, but they have been misleading with the marketing, and in my opinion that is the same as lying. Let's not forget their ludicrous "boost behavior" video explanation; that was pure lying right there.

    I totally agree with you here. They should have sold the 3900x with a 4.5GHz turbo (or even a bit lower), which seems a much more reasonable and achievable clock than 4.65, and let the benchmarks do the talking. Yet another lesson for their marketing team to learn.
     
