Review: AMD Ryzen 7 3700X & Ryzen 9 3900X

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 7, 2019.

  1. platypus

    platypus Guest

    Messages:
    58
    Likes Received:
    9
    GPU:
    some asus 290x
    Have you seen Robert Hallock's post on Reddit?

    https://www.reddit.com/r/Amd/comments/cbls9g/the_final_word_on_idle_voltages_for_3rd_gen_ryzen/

    There is a problem with some monitoring apps polling the system too often and bumping the processor into high boost states when it should be idling.
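To get a feel for why polling rate matters: every sensor read can wake the CPU out of its idle state, so the number of wakeups scales directly with how often a tool polls. A quick back-of-the-envelope sketch (the poll rates below are illustrative, not measured from any specific monitoring app):

```python
# Illustrative sketch: wakeups caused by a monitoring tool's poll loop.
# Each poll can kick the CPU out of idle, so wakeups scale with poll rate.

def wakeups_per_day(polls_per_second):
    """Total sensor-read wakeups over 24 hours at a given poll rate."""
    return int(polls_per_second * 60 * 60 * 24)

aggressive = wakeups_per_day(20)    # a tool polling every 50 ms
relaxed = wakeups_per_day(1 / 5)    # a tool polling once every 5 s

print(aggressive)  # 1728000
print(relaxed)     # 17280
```

Two orders of magnitude fewer wakeups just from backing off the poll interval, which is roughly what Hallock's post asks monitoring vendors to do.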
     
  2. bballfreak6

    bballfreak6 Ancient Guru

    Messages:
    1,904
    Likes Received:
    462
    GPU:
    MSI RTX 4080
  3. mohiuddin

    mohiuddin Maha Guru

    Messages:
    1,007
    Likes Received:
    206
    GPU:
    GTX670 4gb ll RX480 8gb
    I am loving this gen of CPU releases from AMD a lot. It redefines AMD's position in the CPU market. It is just as revolutionary as the first-gen Ryzen was back in its day.
    But I am a bit disappointed to see almost no improvement in the maximum achievable all-core frequency over last-gen Ryzen. I am still searching all over the internet for any clocks above 4.4 GHz all-core within a safe voltage margin using mainstream CPU cooling solutions. :( :(
     
  4. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    987
    Likes Received:
    370
    GPU:
    KFA2 RTX 3090
    Corsair iCue uses 8-9% of my 9900K nonstop (lower with fewer peripherals, but I have the headphone-stand sound card, fan controller, RGB RAM, AIO, etc.).
    It's enough to force my CPU to its max clock 99.99% of the time; for that reason my idle temps are around 45-50°C instead of 35-37°C (I quit the app and my temps and clocks went down immediately).

    On the Ryzen 3000 clock speeds, I'm disappointed to say the least. We're used to getting more than the advertised boost clock, even if only by 100 MHz, but AMD oversold their CPUs, and the more reviews I read, the lower the clocks go. JayzTwoCents talks about some CPUs only managing 4.2 GHz or even 4.1, wth >< that's less than a 5960X on the X99 platform, and the voltages seem very high too.
    To give you an idea, here's what I currently have:
    9900K @ 5.0 GHz all cores, 1.26 V (stable for work; I recently did a 3-hour-long H.265 encode)
    9900K @ 5.1 GHz all cores, 1.36 V (unstable for real work like video compression, but games don't mind at all as they don't use 100% of the CPU; 3DMark works fine too)

    Something I may have already said: the supposed "temperature" advantage of AMD is misleading. Their chips are slower; that's why they aren't super hot. I did various benchmarks and tests to see how my favorite games and benchmarks would react at an AMD-like 4.3 GHz rather than 5.0 GHz.
    My CPU power went from 190 W to 100 W, and my temps in the AIDA64 stress test went from 85°C to 60°C @ 4.3 GHz, so yeah... I can run cool too, and with much lower voltage; this is a non-argument.

    I was ready to switch to Ryzen (the Z390 chipset's 16 lanes prevent me from using a Thunderbolt 3 card; I run out of lanes), but this is just not good enough. I expected at least 4.5 GHz "OC",
    and so far it's 4.4 "best case scenario" with a possible 4.2. No thanks, I'm not going back 10 years in CPU clock speed, not when I can game at 5.1 GHz with no problems.
    Do you realize how much a 500-800 MHz speed loss is? It's crazy.
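For what it's worth, the 190 W -> 100 W drop described above roughly matches the standard dynamic-power approximation P ∝ f·V². A quick sketch of that check (the 1.00 V figure at 4.3 GHz is an assumption for illustration; the post only gives 1.26 V for the 5.0 GHz point):

```python
# Back-of-the-envelope check using the dynamic-power approximation
# P ~ f * V^2 (frequency times voltage squared).
# The 1.00 V value at the lower clock is assumed, not taken from the post.

def scaled_power(p_watts, f1_ghz, v1, f2_ghz, v2):
    """Predict power after moving from (f1, v1) to (f2, v2)."""
    return p_watts * (f2_ghz / f1_ghz) * (v2 / v1) ** 2

predicted = scaled_power(190, 5.0, 1.26, 4.3, 1.00)
print(round(predicted))  # 103, in line with the reported ~100 W
```

So the numbers are self-consistent: most of the savings come from the voltage drop, since power scales with V squared but only linearly with frequency.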
     
    Roboionator likes this.

  5. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Smart argument. Intel's $500 chip can have reasonably good power draw too, if one downclocks it to the clocks of AMD's $330 CPU, where it delivers very comparable performance.
    But wait a moment, what are the actual TDP and power draw of the 3700X?
    In normal operation (out of the box), the 3700X will max out its power draw close to what your chip draws at idle with the OC and conditions you wrote above.
     
  6. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    I've had some time to play with my 3900X over the past few days. Love it. I was able to set DOCP, go straight to 3600 at the correct timings, and boot instantly; no issues. My previous 7820X had an issue with SC2: it would randomly unboost the single thread back to stock, and performance would tank about 10-15 minutes into a game; the 3900X has no such problems. Performance seems roughly the same, with the exception of working in DaVinci Resolve, where I see a pretty massive boost thanks to the extra 4 cores. But I expected mostly the same; I was just tired of all the issues with my 7820X.

    Haven't had an AMD processor in my main system since the 3200+ Clawhammer, but it felt like the right time to return. I honestly might end up getting a 3950X when it launches and sticking this one on an ITX board in my HTPC/server. Can't find any good ITX options for the 7820X.
     
    Last edited: Jul 15, 2019
  7. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Can you do something for us?
    Open Ryzen Master, use the Precision Boost Overdrive option, and test the PPT setting from something like 140 W down to the minimum allowable for the 3900X (I guess 70 W).
    Test in steps of, say, 10 W or 20 W, depending on your interest or on meaningful changes in the results.
    (Please give us scores from your usual CPU workloads at the different PPT limits.)

    Thanks.

    With the 2700X, I got CB.R14 scores of 1771 @ 105 W and 1606 @ 65 W. That was just a 9.3% loss of performance. (I already posted a table of results somewhere around here.)
    I wonder what kind of power-draw reduction you can get at under 10% loss of total performance.
    (In that test set I did not undervolt or do anything special, just reduced the PPT limit.)
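The 9.3% figure falls straight out of those two scores, and the same numbers show the efficiency side of the trade. A tiny sketch:

```python
# Reproducing the quoted 2700X numbers: benchmark score at the
# stock 105 W PPT vs. the reduced 65 W PPT.

def percent_loss(score_full, score_limited):
    """Performance lost when moving from the full to the limited score."""
    return (1 - score_limited / score_full) * 100

loss = percent_loss(1771, 1606)
print(f"{loss:.1f}% performance loss")     # 9.3% performance loss

# Efficiency view of the same numbers: points per watt of PPT budget.
print(f"{1771 / 105:.1f} pts/W at 105 W")  # 16.9 pts/W at 105 W
print(f"{1606 / 65:.1f} pts/W at 65 W")    # 24.7 pts/W at 65 W
```

In other words, a 38% cut in the power budget cost under 10% of the score, and points-per-watt went up by nearly half.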
     
  8. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    Well, my Ryzen Master keeps crashing about 10-15 seconds after loading, regardless of whether I change anything or not, so I'm not sure what's going on with that.

    But here is my result with PPT set to 140:

    https://valid.x86.fr/8216wi

    Default:

    https://valid.x86.fr/8216wi

    There doesn't really seem to be a performance difference. Also, whenever I run a benchmark, the EDC pegs at 100% before anything else even gets close to 100%. I'm not too familiar with Ryzen overclocking, though, so I don't know if I'm missing something or not.
     
  9. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    Welcome back! :) The last AMD processor I used in my main gaming system before Ryzen was the Athlon XP 3200+ Barton (I no longer have that chip, but I still have the one before it, the Athlon XP 1800+ Thoroughbred-B). I also still have an old Athlon II X4 system; it somehow keeps chugging along (with occasional BSODs :p).

    I've already fully migrated to Ryzen (one of the early adopters) and I'm waiting for the 3950X; September can't come soon enough. Hopefully, by that time, AMD and/or mobo makers will have figured out all the potential BIOS/firmware issues. Not that that'll prevent me from upgrading; all I want is the cores, man. ;)
     
  10. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    https://www.reddit.com/r/Amd/comments/cd7pqb/warning_samsung_nvme_ssds_also_subject_to_whea/

    FYI

    I'm seeing this on my system, FWIW. It appears not to affect Linux systems, though, so it's most likely a driver issue.
     

  11. SplashDown

    SplashDown Maha Guru

    Messages:
    1,136
    Likes Received:
    408
    GPU:
    EVGA 980ti Classy
    Yeah, I had the 3200+ Clawhammer too; it was the first PC build I did myself. It was a pretty good chip and clocked okay too.
     
  12. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Well, I had hoped for PPT going down. But since RM crashes on you...
    There's really not much reason to raise PPT a lot, but increasing EDC makes sense for higher boosting. Those currents, drawn for a fraction of a second, are the main limiters for clocks (at least on Zen+).

    I used TDC 100 A and EDC 200 A. Not that the CPU will go much above the default 140 A on EDC; I just liked round limits where I can take the % readout and know the actual current.
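The convenience of round limits is just mental arithmetic: with EDC set to 200 A, a percentage readout converts to amps directly. A trivial sketch:

```python
# Converting a monitoring tool's percentage readout back to actual
# current, given the configured limit. With a round limit like 200 A,
# this is easy to do in your head.

def amps_from_percent(limit_amps, percent):
    """Actual current corresponding to a % readout against a limit."""
    return limit_amps * percent / 100

print(amps_from_percent(200, 50))  # 100.0
print(amps_from_percent(200, 70))  # 140.0 (the stock EDC limit mentioned)
```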

    Btw, both links appear to be the same.
     
  13. CyberSparky

    CyberSparky Member Guru

    Messages:
    134
    Likes Received:
    104
    GPU:
    EVGA RTX 3080 XC3


    Thanks for this. I followed his instructions and was able to get the idle voltages down, though only on the Ryzen Power Saving plan. I've submitted the form as he requested. Over a little more than a 24-hour period, HWiNFO showed an average voltage of 1.47 V before I saw this post, with temps no higher than 64°C. I hope no damage was done to the longevity of this chip simply from leaving sensor software running in the background while gaming and AFK.
     
  14. xynxyn

    xynxyn Member

    Messages:
    23
    Likes Received:
    1
    GPU:
    RTX 3080
    Hey everyone, I only game single-player, with 4K@60FPS in mind. Right now I have an i5 4670K @ 4.2 GHz + RTX 2080. I'd like to finally move to a new platform with more cores, more threads, and DDR4 RAM. The boxed 3600 looks mighty juicy at ~200€, though I'm sceptical about buying only 6 cores in 2019. Not sure it's future-proof looking at the 8-core consoles coming in 2020 (and probably 16 threads?). The 3700X + mobo would be about 180€ more for almost identical gaming performance right now. What's the Gurus' take on it? :) Also, how are the Wraith coolers? I'm so used to Noctua by now... at idle I don't hear it, and while gaming I don't care that much, since I use headphones.
     
  15. sverek

    sverek Guest

    Messages:
    6,069
    Likes Received:
    2,975
    GPU:
    NOVIDIA -0.5GB
    For a 60 FPS target, I don't think you need a new CPU. Really. Consider one once you try to achieve 144 FPS for a 144 Hz panel.
     

  16. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    He's at 4K, so even when trying to achieve 144 FPS and the like, a new processor wouldn't matter much.

    Unless you are looking for better system performance rather than game performance, I wouldn't bother upgrading, since you game at 4K. That being said, if you do want better system/program performance, depending on what you're doing, I wouldn't upgrade to a 6-core processor; I'd definitely recommend 8 cores. If you want to save some money, get a 2700X instead of the 3700X. Again, your gaming performance will be about the same between the two, for the most part; only your PC/program performance will differ.
     
  17. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,547
    Likes Received:
    608
    GPU:
    6800 XT
    I guess I'll buy the 3950X for myself, slam it on this X370 board, and pray to all the gods.
     
  18. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    I plan on doing the same with my X470 board. I'm hoping all these teething issues (boost clocks, power usage, etc.) are resolved by then. I was one of the early adopters of Summit Ridge, and although my experience wasn't as scarring as some others', I would like a smooth, trouble-free transition to Zen 2.
     
  19. HWgeek

    HWgeek Guest

    Messages:
    441
    Likes Received:
    315
    GPU:
    Gigabyte 6200 Turbo Force @500/600 8x1p
    https://www.planet3dnow.de/cms/4898...zen-5-3500-und-weitere-ryzen-pro-prozessoren/
    More models are coming, and it looks like there will be a cheaper 3950 (non-X) model.
    AMD used the 3950 non-X for the OCing world records, while the top-binned 3950X is 100-000000051.
    This looks interesting, since we now know the 3900X is limited in OCing because of its one "bad" chiplet that cannot go above ~4.3-4.4 GHz, which is exactly the max OC shown on the 100-000000033.
    We also know that the second, better chiplet in the 3900X can OC to 4.6-4.7 GHz, so if the binned 3950X can be OCed to ~4.5-4.6 GHz all-core, that will be nice!
     
  20. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,011
    Likes Received:
    7,352
    GPU:
    GTX 1080ti
