Review: AMD Ryzen 7 3700X & Ryzen 9 3900X

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 7, 2019.

  1. Clouseau

    Clouseau Ancient Guru

    Messages:
    2,841
    Likes Received:
    508
    GPU:
    ZOTAC AMP RTX 3070
    The 3800X is now in stock at Newegg.
     
  2. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    If overclocking really is that poor, then the 3800X would be the best bet. I would rather pay the extra 80€, get the better stock frequencies, and be done with overclocking altogether.. lol

    3700x 350€
    3800x 430€
     
  3. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,856
    Likes Received:
    442
    GPU:
    RTX 3080
    Turning down the game details reduces load on the GPU, generally not the CPU, and the 720p benchmarks were already CPU limited (for most of the benchmark run, I'm guessing), so your argument about getting 200fps with Ryzen by turning down game details is misguided (wrong). Yes, I'm aware Ryzen got 160fps in BFV, but the 9900K got 180+ fps, and that gap would only grow with Intel overclocking. We were talking about high refresh rate gaming, so up to 240Hz: even a 160fps average on a 144Hz monitor probably means some drops below 144fps, so even for 144Hz monitors there would be value in a 9900K capable of a 180+ fps average, and besides, there are 165Hz, 180Hz and 240Hz screens. Ryzen still does not cut the mustard for high refresh rate gaming, not if that's your main criterion for using your PC; it's as simple as that.
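    The averages-versus-refresh point can be made concrete with a quick frame-time calculation; this is just a sketch using the fps figures quoted in the thread:

```python
# Convert fps/refresh rates into per-frame time budgets to show why a
# 160 fps *average* can still dip below a 144 Hz monitor's budget.

def frame_time_ms(fps: float) -> float:
    """Milliseconds available (or spent) per frame at a given rate."""
    return 1000.0 / fps

budget_144hz = frame_time_ms(144)  # ~6.94 ms per refresh window
avg_160fps = frame_time_ms(160)    # ~6.25 ms average frame time
avg_180fps = frame_time_ms(180)    # ~5.56 ms average frame time

# Headroom before an average-paced frame misses the 144 Hz window;
# frame-time variance eats into this slack, hence the dropped frames.
print(f"144 Hz budget: {budget_144hz:.2f} ms")
print(f"160 fps avg leaves {budget_144hz - avg_160fps:.2f} ms slack")
print(f"180 fps avg leaves {budget_144hz - avg_180fps:.2f} ms slack")
```

    With under 0.7 ms of slack at a 160 fps average, even modest frame-time spikes fall below 144 Hz, which is the drop the post describes.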
     
  4. Dazz

    Dazz Maha Guru

    Messages:
    1,010
    Likes Received:
    131
    GPU:
    ASUS STRIX RTX 2080
    Gamers Nexus has already done this, but they only tested the 6 and 12 core parts: the 6 core was best left with SMT on, while disabling SMT on the 3900X netted an all-core overclock bump from 4.3GHz to 4.4GHz and roughly 7% better game performance, the one exception being Assassin's Creed Origins, which preferred SMT enabled. Games don't seem to know what to do with 24 threads, but without SMT they also aren't saturating 2 threads on a single core. I already ordered my 3900X, but it seems they won't be in the UK till Friday. Kinda pissed off really, as Lisa said you could BUY them on launch day, not preorder and wait a week after launch.
     
    thesebastian and Fox2232 like this.

  5. Mesab67

    Mesab67 Guest

    Messages:
    244
    Likes Received:
    85
    GPU:
    GTX 1080
    ...a couple of things added and highlighted.
     
    Last edited: Jul 8, 2019
  6. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    987
    Likes Received:
    370
    GPU:
    KFA2 RTX 3090
    Hey guys, I ran a few tests just now after realizing how much going from 5.0 to 5.1GHz costs me in terms of heat and power (way too much):

    - i9 9900K @5.1GHz all cores, 1.36v: AIDA CPU package power 190 watts, CPU temp 95°C, Corsair HX850i output 290 watts
    - i9 9900K @4.4GHz all cores, 1.17v (auto): AIDA CPU package power 100 watts, CPU temp 60°C, Corsair HX850i output 200 watts (yes, it's lower than default; I wanted to see how games behaved at a Ryzen-like clockspeed)

    Saying that Ryzen uses less power is slightly misleading. It's true, but I believed it meant Ryzen was purely more efficient, when in fact its lower clocks matter too.
    It's something I never really thought about before today, since I've run max-OC builds for some 15 years (and Corsair iCUE keeps me out of CPU flex-ratio anyway with its constant 8% CPU usage).
    I'm not writing this to diminish the 3900X; it's in my webshop cart as I write this ;) I'm waiting for motherboard reviews.
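    The clock-versus-efficiency observation lines up with the classic dynamic-power relation P ≈ C·V²·f. A rough sanity check using the voltages and clocks measured above (this ignores leakage, which grows quickly at the top of the voltage/frequency curve, so the measured gap being larger than the model predicts is expected):

```python
# Rough dynamic-power scaling check: P_dyn is proportional to V^2 * f.
# Inputs are the two operating points measured in the post above.

def relative_dynamic_power(f1, v1, f2, v2):
    """Predicted dynamic-power ratio of state 1 over state 2."""
    return (f1 / f2) * (v1 / v2) ** 2

predicted = relative_dynamic_power(5.1, 1.36, 4.4, 1.17)
measured = 190 / 100  # AIDA package power readings from the post

print(f"predicted ratio: {predicted:.2f}x")  # ~1.57x from V^2*f alone
print(f"measured ratio:  {measured:.2f}x")   # 1.90x incl. leakage/temp
```

    The shortfall between ~1.57x predicted and 1.90x measured is roughly what leakage and temperature-dependent effects add at the high end of the curve.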

    edit: btw, for those wondering why we need 180fps in BF5 or any other game: when you're a hardcore gamer/geek/tweaker/modder you don't play at basic 1080p, you play at 150% or even 200% resolution scaling (you genuinely gain detail when rendering at 4K and downscaling to HD; it's very obvious on things like the rivets on vehicles in Battlefield games).
    In pretty much every game I play I have tweaked settings, external or internal post-processing filters, plus higher resolution scaling, and those 160 or 180 fps become 60-90fps, and then you're back to needing more "oomph" from your CPU and GPU ;)
    "Vanilla" game benchmark numbers aren't realistic for us.
    To give you an idea, my modded Skyrim uses 95-98% of a 1080Ti @1950MHz; I'm basically running FurMark for 2, 3, 4 hours straight when I play. I had to change the PC case because the GPU was heating everything else.
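    The render-scale cost described above is easy to quantify: the scale percentage applies per axis, so GPU load grows roughly with its square. A minimal sketch at a 1080p base resolution:

```python
# Pixel-count cost of in-game resolution scaling. The scale percentage
# applies to each axis, so the pixel count grows with its square.

def scaled_pixels(width: int, height: int, scale_pct: float) -> int:
    s = scale_pct / 100.0
    return int(width * s) * int(height * s)

base = scaled_pixels(1920, 1080, 100)  # 2,073,600 px
x150 = scaled_pixels(1920, 1080, 150)  # 2880x1620 = 4,665,600 px
x200 = scaled_pixels(1920, 1080, 200)  # 3840x2160 = 8,294,400 px

print(f"150% scale: {x150 / base:.2f}x the pixels")  # 2.25x
print(f"200% scale: {x200 / base:.2f}x the pixels")  # 4.00x
```

    Quadrupling the pixel load at 200% scaling is roughly why a "vanilla" 160-180 fps result can land in the 60-90 fps range once scaling and filters are applied.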
     
    Last edited: Jul 8, 2019
    thesebastian likes this.
  7. bballfreak6

    bballfreak6 Ancient Guru

    Messages:
    1,904
    Likes Received:
    462
    GPU:
    MSI RTX 4080
    That was a head-scratcher comment for me too. Genuine question from someone who's still using an old reliable 60Hz Dell screen: I understand there's a real difference going from 60fps to 100fps+, but beyond that, is there any actually noticeable difference in gameplay or smoothness? As it is, how many of us actually have the graphical power to run new games at those crazy frame rates? I like my games to look pretty, so I turn my settings up where I can afford to, and while I understand there are pro/competitive players requiring high refresh/frame rates, is there really that noticeable an advantage going from, say, 100fps to 200fps? There is a difference between "not as good" and "not good enough".
     
  8. Dazz

    Dazz Maha Guru

    Messages:
    1,010
    Likes Received:
    131
    GPU:
    ASUS STRIX RTX 2080
    Yeah, once Coffee Lake hits the 5GHz mark, power leakage kinda gets out of control. Gamers Nexus tests measuring the EPS lines indicated the overclocked 9900K consumed 3x more than stock, or 2x more than a 3900X at stock or overclocked. Nearly 300w for a consumer CPU is getting a bit crazy, and is close to the 384w maximum that an 8-pin EPS can provide. Move over, AMD FX 8350: Intel Coffee Lake is the new space heater. Still, Intel have done really well to push their architecture and 14nm to this level.
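    The 384w figure for an 8-pin EPS connector follows from its four 12V supply circuits. A back-of-envelope check, assuming the commonly cited 8 A per-contact rating (actual ratings depend on the terminal type used):

```python
# Back-of-envelope check of the 8-pin EPS12V power ceiling cited above.
# An 8-pin EPS carries four 12 V supply contacts (plus four grounds).

CIRCUITS = 4          # 12 V / ground pairs in the connector
AMPS_PER_CIRCUIT = 8  # assumed per-contact rating; varies by terminal
VOLTS = 12

max_watts = CIRCUITS * AMPS_PER_CIRCUIT * VOLTS
print(f"8-pin EPS ceiling: {max_watts} W")  # 384 W
```

    Which is why a ~300w overclocked CPU fed from a single 8-pin EPS is running uncomfortably close to the connector's rated limit.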
     
    kakiharaFRS likes this.
  9. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    987
    Likes Received:
    370
    GPU:
    KFA2 RTX 3090
    Thanks for the explanation, Dazz. I can confirm it's hot at 5.1GHz; I feel a "woosh" of hot air coming from the full tower under the desk when I benchmark lol
     
    Dazz likes this.
  10. MonstroMart

    MonstroMart Maha Guru

    Messages:
    1,397
    Likes Received:
    878
    GPU:
    RX 6800 Red Dragon
    You can still tell the difference between 144Hz and 240Hz if you know what to look for, but let's just say the difference is definitely not as big as 60Hz versus 144Hz (where it's night and day, imo). For most gamers out there (I would say easily 90%) it is simply not worth reducing image quality and resolution to get over 144 fps (as long as the fps is consistent and the lows stay close to the average).

    If you strictly care about gaming and nothing else, the 9900K is still the king. But it's not the dominant king it used to be. And the fact of the matter is that in 2019 there are a lot more people who care about productivity than 10 years ago.

    I started doing freelance work in the evenings and on weekends 6-7 years ago to make more money. Back then I was using a Core i5 because it was good enough for gaming. I upgraded to a Ryzen 1800X 2 years ago, and if I were to upgrade again now it would be a Ryzen again. I have a 2k 144Hz monitor, but it's not worth reducing my productivity while working for a 5-10% increase in fps. And I'm certainly not putting a 2nd computer in my work room just for gaming.

    Anyway, I play at 2k where the GPU is the bottleneck, and as long as that stays true at this resolution and my 1800X stays good enough for my work, I'll likely not upgrade. The 3800X is tempting though, I won't lie, especially if my 370 mobo supports it.
     
    bballfreak6 likes this.

  11. bballfreak6

    bballfreak6 Ancient Guru

    Messages:
    1,904
    Likes Received:
    462
    GPU:
    MSI RTX 4080
    What monitor are you currently using? I've been tempted to get a new monitor for a while, just to see what the fuss is about with high refresh rate gaming and G-Sync/adaptive sync, but I'm running a 1070 Ti @1440p, so I'm not really sure I'd actually see any benefit. That, and my 6+ year old wide-colour-gamut 27" Dell is still going strong and still looks great today, especially for photo editing (the only thing I don't like is maybe a bit too much anti-glare coating). The other thing that worries me is the supposedly poor QC on the 144Hz IPS panels, with things like bad uniformity, light bleed and even a weird colour cast! Not sure if it's an exaggerated issue or it really is a thing.

    Ps sorry for the off topic
     
  12. MonstroMart

    MonstroMart Maha Guru

    Messages:
    1,397
    Likes Received:
    878
    GPU:
    RX 6800 Red Dragon
    I have an MSI Optix MAG27CQ. To be fair, when I bought it I was looking for a 2k monitor mostly for my work. I decided to buy this one over a 60Hz model because, well, I liked the monitor a lot and it was relatively cheap. Even if it's not IPS the colors are very good, and I still have a secondary calibrated non-curved IPS 1080p monitor for when I need color and pixel accuracy. I would not upgrade a 2k 60Hz monitor to a 2k 144Hz one; I don't think it's worth it. Maybe when you upgrade to 4k, take a good look at the 4k 144Hz ones. 144Hz is nice and it definitely feels more fluid, but it's not the end of the world if you're not a very competitive online gamer.

    My main problem right now is that my 1070 is not really a proper 2k card. It also doesn't support FreeSync with my current monitor (nVidia :mad:). I mean, it's fine in most games, but I'm more in the 80-90 fps range, which is not much better than 60. When I'm at a stable 144 fps, though, it feels a lot more fluid and clearer. I still play WoW (I know, I know...), and for Mythic raiding and Mythic+ I do like the fluidity and clarity of 144Hz. I would probably get more fps with a better CPU than the 1800X, but upgrading the GPU would give me more improvement at 2k than upgrading the CPU, and that CPU is still perfectly fine for my work.
     
    bballfreak6 likes this.
  13. oxidized

    oxidized Master Guru

    Messages:
    234
    Likes Received:
    35
    GPU:
    GTX 1060 6G
    ...As expected, about as overclockable as previous Ryzen generations, but this time it's actually a bit weirder, because the turbo clocks are better than what you can reach with a manual OC, so I guess it's better to leave it at stock settings. Power consumption I guess is just okay, and value is pretty good; performance in games got better, but it's still not as solid as Intel's. Overall I'd say a decent upgrade over the second generation, but nothing incredible, or that would make anyone sane scream miracle. I was interested in the 3800X; I guess I'll just wait a bit longer to see if this all actually gets better with microcode fixes, and I honestly hope AMD clarifies the X570 issues der8auer shed light on.
     
  14. bballfreak6

    bballfreak6 Ancient Guru

    Messages:
    1,904
    Likes Received:
    462
    GPU:
    MSI RTX 4080
    Thanks for the info! Guess I'll be holding on to my Dell just that bit longer then, until the day I get a gfx card that can drive 1440p at high frame rates lol.
     
    Last edited: Jul 9, 2019
  15. jwb1

    jwb1 Guest

    Messages:
    725
    Likes Received:
    157
    GPU:
    MSI GTX 2080 Ti
    Where I am this is pricing:

    9900k $639
    9700k $509
    3900X $689
    3800X $539

    I mean, this pricing is not that great for AMD vs Intel. And the rumor is Intel is dropping prices soon. With the extra OC headroom on Intel, it really doesn't make sense in my mind. How many people really need 8+ cores? I went with a 6 core and have been just fine for gaming and my workload.

    The only X570 mobo I would buy is the Gigabyte Xtreme, to get the fanless design. $929 for that. Absurd. That is more expensive than their Z390 Xtreme board. What is going on?!
     
    Last edited: Jul 9, 2019

  16. oxidized

    oxidized Master Guru

    Messages:
    234
    Likes Received:
    35
    GPU:
    GTX 1060 6G
    Get what you think fits your needs; don't mind any fanboy's perspective. Value is on AMD's side, no doubt, considering what you get: maybe not for everyone, but surely for most.
     
  17. jwb1

    jwb1 Guest

    Messages:
    725
    Likes Received:
    157
    GPU:
    MSI GTX 2080 Ti
    Well, where I am, I don't see the AMD value, even leaving the high-end Xtreme board out of the comparison. The Z390 Master is $374; the X570 Master is $479. Everything on Intel's side is cheaper right now, and this is before the official Intel price cut.

    AMD, retailers, and the mobo makers are milking these products, as the prices are actually higher.
     
  18. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    Oh great, more Intel employee replies. I was hoping you were gone permanently, as Intel employees shouldn't be able to masquerade as normal users.

    The idea that a 6-core system is "just fine for workload" means you don't have a workload. Sure, can someone get stuff done? Absolutely. But what's the point of comparing a 6-core to a 12-core that's exceptionally cheap? Right. There is no point.

    Then there's the whole idea that, where you are, a 3900X for $689 is somehow "not that great" in comparison to the 9900K... ... ... hahaha...ha? That was a joke, right? Oh wait, you're an Intel employee; that wasn't a joke, just misinformation.


    The list could go on, but let's not bother ignoring the fact that, according to your own figures, you can get 50% more performance for far less than 50% more cost.
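    Taking the local prices quoted earlier in the thread at face value, the core-count-versus-cost gap is easy to put in numbers (performance obviously doesn't scale perfectly with core count, so this only shows the rough shape of the argument):

```python
# Price vs core count, using the local prices posted earlier in the thread.
cpus = {
    "9900K": {"price": 639, "cores": 8},
    "3900X": {"price": 689, "cores": 12},
}

extra_cores = (cpus["3900X"]["cores"] / cpus["9900K"]["cores"] - 1) * 100
extra_cost = (cpus["3900X"]["price"] / cpus["9900K"]["price"] - 1) * 100

print(f"3900X: {extra_cores:.0f}% more cores "
      f"for {extra_cost:.1f}% more money")  # 50% more cores, ~7.8% more
```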

    And then you move on to X570. You know, it's so easy to spot an Intel troll, employee, or fanboy: they constantly come in with "BUT BUT BUT X570 IS SO EXPENSIVE!", as though PCI-Express 4.0 is even ON THE INTEL SYSTEM. Along with the genius idea that there aren't much, much cheaper boards out there: B350, X370, B450, X470, and even better yet A320 boards, as a possibility. But no, it's the same tune time and time again: "X570 boards are too expensive, guess AMD isn't cheap anymore! MUAHAHAHAHAHAHA!!!!"


    I'm relatively certain you were put in time out, given your sudden disappearing act yesterday; I just hoped it was longer. These forums are a lot nicer without the trolls. Another one for the ignore list.
     
    Venix, carnivore, HandR and 4 others like this.
  19. jwb1

    jwb1 Guest

    Messages:
    725
    Likes Received:
    157
    GPU:
    MSI GTX 2080 Ti
    My life does not revolve around this board or these products, as yours does. My workload consists of gaming plus a Plex server and Handbrake, which my 6 core handles just fine. I'm simply pointing out the fact that Intel is actually the better value now, and it's interesting that people are now prepared to pay MORE for AMD than for Intel. Interesting how people's points of view change. You can argue Intel has older chipsets, but I am comparing like for like: Intel's latest against AMD's latest. They released X570 for a reason: to sell it with the new CPUs, especially to attract new customers.

    Yes, if you need 12 or more cores, go with AMD; it makes sense. But I highly doubt most people require that many cores.
     
    Last edited: Jul 9, 2019
  20. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,016
    Likes Received:
    7,355
    GPU:
    GTX 1080ti
    Intel IPC decreases every time Windows updates.
     
    Venix, Undying, carnivore and 4 others like this.
