Review: AMD Ryzen 7 3700X & Ryzen 9 3900X

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 7, 2019.

  1. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,152
    Likes Received:
    2,973
    GPU:
    5700XT+AW@240Hz
We did not, because "Moon" in the astronomical sense of the word takes a capital "M", as do all other major named objects in our solar system.
For example, the Sun is our local star, but people often call other stars in our galaxy "suns".
     
  2. moo100times

    moo100times Master Guru

    Messages:
    274
    Likes Received:
    121
    GPU:
    295x2 @ stock
    Wait what is happening?? Last pc?? Are you dying??
     
  3. foetopsyRus

    foetopsyRus Member

    Messages:
    29
    Likes Received:
    1
    GPU:
    MSI GTX 1080 X
Maybe I don't understand something?!
[screenshots]
    1.89.1
     
    Last edited: Jul 11, 2019
  4. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,764
    Likes Received:
    1,121
    GPU:
    EVGA 1080ti SC
    ^What's your point?
     

  5. insp1re2600

    insp1re2600 Maha Guru

    Messages:
    1,260
    Likes Received:
    472
    GPU:
    RTX 2080TI OC H20
I'm guessing he's just a geriatric, bud; plenty of users from all age ranges on here.
     
    moo100times likes this.
  6. CyberSparky

    CyberSparky Member Guru

    Messages:
    101
    Likes Received:
    58
    GPU:
    EVGA RTX 3080 XC3
Ok, so here are my results with a 3600X vs my 2600X, as promised.
All results on the left are the 2600X, all on the right are the 3600X.
Tests on the 2600X had a manual all-core overclock of 4,350 MHz, 3466 CL14 RAM, and all games set to the High preset with only Vsync turned off, with the exception of WWZ on the Ultra preset.
Tests on the 3600X ran at stock frequency with PBO/XFR turned on and the same 3466 CL14 RAM and timings.
I am GPU-limited at 1440p in all of these games, so I'm just showing my results.
Max CPU temp through all benches and tests was 64°C on the 3600X, slightly lower than my 2600X with a max temp of 72°C.

[benchmark results image]


The only thing I don't like about this CPU so far is that, using PBO/XFR, my voltages stay at 1.45-1.51 V most of the time, in games and on the desktop. I've tried manual overclocking and had the same results as most reviewers: it's definitely faster/better to leave PBO/XFR to do the work at the moment. I'll give a manual all-core overclock a try once a new BIOS releases.
Also, I hope Gigabyte releases a new BIOS for my board soon (X470 Aorus Gaming 7 WiFi Rev 1.1). It seems very buggy, because sometimes it boots as fast as my 2600X did, roughly 15-25 seconds, but other times it takes two minutes or more to get through boot to the Windows login.
I can't wait till my new EVGA 2070 Super arrives tomorrow. My wife just ordered it for my birthday tomorrow with next-day shipping. :D I was going to wait for the 2080 Super, but meh, I'm not going to complain one bit, lol. Maybe I still will, and then give her the 2070S. Or maybe that was her plan all along... Diabolical woman... ;)

    @Fox2232 You were correct, not much difference. :)
     
  7. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,506
    Likes Received:
    269
    GPU:
    GTX1070 @2050Mhz
You say you don't like the 1.45-1.51 V (and I don't blame you!) that you're seeing on your 3600X when leaving it at stock with PBO/XFR. Is it possible to still use PBO/XFR and do some undervolting? That could mean you can run the CPU at maximum efficiency, and undervolting to the max might mean more PBO/XFR boosting thanks to lower thermals and lower power demands. I don't really know much about AMD overclocking, so it was just an idea.
     
    CyberSparky likes this.
  8. platypus

    platypus Active Member

    Messages:
    58
    Likes Received:
    9
    GPU:
    some asus 290x
Have you seen Robert Hallock's post on Reddit?

    https://www.reddit.com/r/Amd/comments/cbls9g/the_final_word_on_idle_voltages_for_3rd_gen_ryzen/

There is a problem with some monitoring apps polling the system too frequently and bumping the processor into high boost states when it should be idling.
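As a back-of-the-envelope illustration of why the polling interval matters (the intervals below are assumed example values, not measured figures from iCue or any specific tool), every sensor poll is a wakeup that can keep the processor from settling into its idle state:

```python
# Illustrative sketch only: how often a monitoring app wakes the CPU
# at different polling intervals. The interval values are assumptions
# chosen for illustration, not measurements of any particular app.

def wakeups_per_minute(poll_interval_ms: float) -> float:
    """Sensor-poll wakeups per minute at a given polling interval."""
    return 60_000 / poll_interval_ms

print(f"{wakeups_per_minute(50):.0f} wakeups/min at 50 ms polling")
print(f"{wakeups_per_minute(1000):.0f} wakeups/min at 1 s polling")
```

With an aggressive 50 ms interval the CPU is prodded 1,200 times a minute, which is plenty to hold it in a boost state; relaxing the polling rate (where the app allows it) reduces that by an order of magnitude.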
     
  9. bballfreak6

    bballfreak6 Ancient Guru

    Messages:
    1,627
    Likes Received:
    131
    GPU:
    Gigabyte 2070S OC
  10. mohiuddin

    mohiuddin Master Guru

    Messages:
    858
    Likes Received:
    83
    GPU:
    GTX670 4gb ll RX480 8gb
I am loving this generation of CPU releases from AMD a lot. It redefines AMD's position in the CPU market; it is just as revolutionary as first-gen Ryzen was back in its time.
But I am a bit disappointed to see almost no improvement in maximum achievable all-core frequency over last-gen Ryzen. I am still searching all over the internet for any clocks over 4.4 GHz all-core within a safe voltage margin using mainstream CPU cooling solutions. :( :(
     

  11. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    500
    Likes Received:
    112
    GPU:
    MSI Gaming X 1080ti
Corsair iCUE uses 8-9% of my 9900K nonstop (lower with fewer peripherals, but I have the headphone-stand soundcard, fan controller, RGB RAM, AIO, etc.).
It's enough to force my CPU to max clock 99.99% of the time; for that reason my idle temps are around 45-50°C instead of 35-37°C (I quit the app and my temps and clocks went down immediately).

On the Ryzen 3000 clock speeds, I'm disappointed, to say the least. We are used to being able to get more than the advertised boost clock, even if only by 100 MHz, but AMD oversold their CPUs, and the more reviews I read, the lower the clocks go. JayzTwoCents talks about some CPUs only managing 4.2 GHz or even 4.1, wth >< that's less than a 5960X on the X99 platform, and the voltages seem very high too.
To give you an idea, here's what I currently have:
9900K 5.0 GHz all cores at 1.26 V (stable for work; I recently did a 3-hour-long H.265 encode)
9900K 5.1 GHz all cores at 1.36 V (unstable for real work like video compression, but games don't mind at all as they don't use 100% of the CPU; 3DMark works fine too)

Something I may have already said is that the supposed "temperature" advantage of AMD's chips is misleading: they are slower, which is why they aren't super hot. I did various benchmarks and tests to see how my favorite games and benchmarks would react at an AMD-like 4.3 GHz rather than 5.0 GHz.
My CPU wattage went from 190 to 100, and my temps in the AIDA64 stress test went from 85° to 60° at 4.3 GHz, so yeah... I can do cold too, and with much lower voltage; this is a non-argument.
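For what it's worth, a drop of roughly 190 W to 100 W is about what the first-order dynamic-power relation P ∝ f·V² predicts. A minimal sketch, assuming about 1.00 V at 4.3 GHz (only the 5.0 GHz / 1.26 V / 190 W figures come from the post; the lower voltage is an assumption):

```python
# First-order dynamic power estimate: P is proportional to f * V^2.
# 5.0 GHz @ 1.26 V and the 190 W figure come from the post above;
# the 1.00 V value at 4.3 GHz is an assumed illustration.

def relative_power(f1_ghz: float, v1: float, f2_ghz: float, v2: float) -> float:
    """Ratio of dynamic power at (f2, v2) relative to (f1, v1) on the same chip."""
    return (f2_ghz / f1_ghz) * (v2 / v1) ** 2

ratio = relative_power(5.0, 1.26, 4.3, 1.00)
print(f"Estimated package power at 4.3 GHz: {ratio * 190:.0f} W")
```

The estimate lands close to the ~100 W observed, which is consistent with the point being made: lowering both frequency and voltage cuts power superlinearly on any chip, AMD or Intel.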

I was ready to switch to Ryzen (the Z390 chipset's 16 lanes are preventing me from using a Thunderbolt 3 card; I ran out of lanes), but this is just not good enough. I expected at least 4.5 "OC",
and so far it's 4.4 "best-case scenario" with a possible 4.2. No thanks, I'm not going back 10 years in CPU clock speed, not when I can game at 5.1 GHz with no problems.
Do you realize how much 500-800 MHz of speed loss is? It's crazy.
     
    Roboionator likes this.
  12. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,152
    Likes Received:
    2,973
    GPU:
    5700XT+AW@240Hz
Smart argument. Intel's $500 chip can have reasonably good power draw too, if one downclocks it to the clocks of AMD's $330 CPU, where it delivers very comparable performance.
But wait a moment: what are the actual TDP and power draw of the 3700X?
In normal operation (out of the box), the 3700X will max out its power draw close to what your chip draws at idle with the OC and conditions you wrote above.
     
  13. Denial

    Denial Ancient Guru

    Messages:
    13,150
    Likes Received:
    2,647
    GPU:
    EVGA RTX 3080
I've had some time to play with my 3900X over the past few days. Love it. I was able to set DOCP, instantly go to 3600 at the correct timings, and boot right away - no issues. My previous 7820X had an issue with SC2 - it would randomly unboost the single thread back to stock and performance would tank about 10-15 minutes into a game; the 3900X has no such problems. Performance seems roughly the same, with the exception of working in DaVinci Resolve, where I see a pretty massive boost due to the extra 4 cores - but I expected mostly the same; I'm just tired of all the issues with my 7820X.

Haven't had an AMD processor in my main system since the 3200+ Clawhammer, but it felt like this was the right time to return. I honestly might end up getting a 3950X when it launches and sticking this one on an ITX board in my HTPC/server. I can't find any good ITX options for the 7820X.
     
    Last edited: Jul 15, 2019
  14. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,152
    Likes Received:
    2,973
    GPU:
    5700XT+AW@240Hz
Can you do something for us?
Open Ryzen Master, use the Precision Boost Overdrive option, and test the PPT setting from something like 140 W down to the minimum allowable for the 3900X (I guess 70 W).
Test in steps, like 10 W or 20 W, depending on your interest or on meaningful changes in the results.
(Please give us scores from your usual CPU workloads at the different PPT limits.)

    Thanks.

With my 2700X, I got Cinebench scores of 1771 @ 105 W and 1606 @ 65 W. That was just a 9.3% loss of performance. (I already posted a table of results somewhere around here.)
I wonder what kind of power-draw reduction you can get at under a 10% loss of total performance.
(In the given test set I did not undervolt or do anything special, just a reduction of the PPT limit.)
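The quoted figure is easy to verify; a one-liner using the scores from the post above:

```python
# Verify the ~9.3% loss figure from the 2700X PPT comparison above
# (1771 points at 105 W PPT vs 1606 points at 65 W PPT).

def percent_loss(score_high: float, score_low: float) -> float:
    """Performance lost when dropping from score_high to score_low, in percent."""
    return (score_high - score_low) / score_high * 100

print(f"{percent_loss(1771, 1606):.1f}% slower at 65 W than at 105 W")  # 9.3%
```

A 38% cut in the power budget (105 W to 65 W) costing only 9.3% of the score is the efficiency trade-off being described.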
     
  15. Denial

    Denial Ancient Guru

    Messages:
    13,150
    Likes Received:
    2,647
    GPU:
    EVGA RTX 3080
Well, my Ryzen Master keeps crashing about 10-15 seconds after loading it, regardless of whether I change anything or not, so I'm not sure what's going on with that.

    But here is my result with PPT set to 140:

    https://valid.x86.fr/8216wi

    Default:

    https://valid.x86.fr/8216wi

There doesn't really seem to be a performance difference. Also, whenever I run a benchmark, the EDC pegs at 100% before anything else even gets close to 100%. I'm not too familiar with Ryzen overclocking though, so I don't know if I'm missing something or not.
     

  16. D3M1G0D

    D3M1G0D Ancient Guru

    Messages:
    2,126
    Likes Received:
    1,367
    GPU:
    2 x GeForce 1080 Ti
    Welcome back! :) The last AMD processor I used on my main gaming/usage system before Ryzen was the Athlon XP 3200+ Barton (no longer have that chip but I still have the chip before it - the Athlon XP 1800+ Thoroughbred-B). I also still have an old Athlon II X4 system - it somehow still keeps chugging along (with occasional BSODs :p).

    I've already fully migrated over to Ryzen (one of the early adopters) and I'm waiting for the 3950X - September can't come soon enough. Hopefully, by that time, AMD and/or mobo makers will have figured out all the potential BIOS/firmware issues. Not that that'll prevent me from upgrading - all I want is the cores, man. ;)
     
  17. Denial

    Denial Ancient Guru

    Messages:
    13,150
    Likes Received:
    2,647
    GPU:
    EVGA RTX 3080
    https://www.reddit.com/r/Amd/comments/cd7pqb/warning_samsung_nvme_ssds_also_subject_to_whea/

    FYI

I'm seeing this on my system, FWIW. It appears not to affect Linux systems though, so it's most likely a driver issue.
     
  18. SplashDown

    SplashDown Master Guru

    Messages:
    709
    Likes Received:
    110
    GPU:
    EVGA 980ti Classy
Yeah, I had the 3200+ Clawhammer also. It was the first PC build I did myself; it was a pretty good chip and clocked OK too.
     
  19. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,152
    Likes Received:
    2,973
    GPU:
    5700XT+AW@240Hz
Well, I hoped for PPT going down, but as RM crashes on you...
There is really not much reason to increase PPT a lot, but increasing EDC makes sense for higher boosting. Those currents drawn for fractions of a second are the main limiters on clocks (at least for Zen+).

I used TDC 100 A and EDC 200 A. Not that the CPU will go much above the default 140 A on EDC; I just liked limits where I can take a % reading and know the actual current.
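The convenience of round limits can be shown in one line. A sketch (the 73% and 45% readings below are invented example values, not measurements):

```python
# With TDC/EDC set to round limits (100 A / 200 A), the percentage a
# monitoring tool reports converts directly to amps in your head.
# The example percentages below are made up for illustration.

def pct_to_amps(reading_pct: float, limit_amps: float) -> float:
    """Convert a reported utilization percentage into an actual current."""
    return reading_pct * limit_amps / 100

print(pct_to_amps(73, 200))  # EDC at 73% of a 200 A limit -> 146.0 A
print(pct_to_amps(45, 100))  # TDC at 45% of a 100 A limit -> 45.0 A
```

With the stock 140 A EDC limit the same conversion needs a multiply by 1.4, which is the awkwardness being avoided.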

Btw, both links appear to be the same.
     
  20. CyberSparky

    CyberSparky Member Guru

    Messages:
    101
    Likes Received:
    58
    GPU:
    EVGA RTX 3080 XC3


Thanks for this. I followed his instructions and was able to get the idle voltages down, though only on the Ryzen Power Savings plan. I've submitted a form as he requested. A little over a 24-hour period showed an average voltage of 1.47 V in HWiNFO prior to me seeing this post, with temps no higher than 64°C. I hope no damage was done to limit the longevity of this chip simply from leaving sensor software running in the background while gaming and AFK.
     
