AMD Ryzen 9 3950X: 16 cores and 32 threads - A Gaming CPU to be released in September

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 11, 2019.

  1. angelgraves13

    angelgraves13 Maha Guru

    Messages:
    1,090
    Likes Received:
    205
    GPU:
    RTX 2080 Ti FE
    When is Zen 3 coming? Are we getting Zen 2+ next year or moving straight to Zen 3? I can't wait to see what comes next.

    Navi, to me, looks like their attempt to do for their GPU lineup what Ryzen did for their CPUs. This is basically their first-gen Ryzen on the GPU front. I expect we'll be getting yearly improvements to IPC, memory speed, and overall architecture going forward. This was a step in the right direction, but overall massively disappointing, much like first-gen Ryzen. I don't expect AMD will have anything to truly compete with the RTX lineup until 2020 or 2021, when they move on to RDNA 2.
     
  2. sverek

    sverek Ancient Guru

    Messages:
    4,988
    Likes Received:
    1,707
    GPU:
    NOVIDIA -0.5GB
    Really? You sure it's not mods or SLI that messes things up?
     
  3. Koniakki

    Koniakki Ancient Guru

    Messages:
    2,819
    Likes Received:
    433
    GPU:
    ZOTAC GTX 1080Ti FE
    It was too powerful! Even for Dr. Su! :D

     
    Undying likes this.
  4. las

    las Master Guru

    Messages:
    205
    Likes Received:
    24
    GPU:
    1080 Ti @ 2+ GHz
    Plenty of games can be CPU-bound at 4K/UHD using low details, especially less demanding esports titles. Higher fps = more CPU; it's not rocket science, really. At 100+ fps, and especially 120+, you'll be limited hard by your CPU. This is when Ryzen chokes. Yeah, I have tried the Acer X27, a 2160p/144 Hz monitor. I have tried pretty much every high refresh rate monitor worth buying. I know exactly how hard it is to obtain 144-240 fps.

    The reason your performance is crap in Fallout is because you're using an AMD CPU. I bet you're far from 100% GPU usage on your 1080 Ti's. That game loves Intel, like many, many others. Multi-GPU is pretty much dead at this point, though, outside of chasing benching records. Game support is terrible these days; 0.00000001% of users run multi-GPU, so devs don't care about it at all. Minimum fps is most often lower with SLI compared to a single card of the same model, and crappy frametimes and microstutter are present in many games, if they even support multi-GPU in the first place.
     
    Last edited: Jun 13, 2019 at 2:18 PM

  5. D3M1G0D

    D3M1G0D Ancient Guru

    Messages:
    1,676
    Likes Received:
    986
    GPU:
    2 x GeForce 1080 Ti
    I regularly game at 4K; I don't know how much 4K gaming you've done, but I'm speaking from experience here. Whether it's Fallout 4, The Witcher 3, Doom, Metro Exodus or most other AAA games, I can barely maintain 60 FPS at 4K at the lowest settings, with the GPU fully maxed out. I don't game on low settings because I like it - it's what I have to do for decent frame rates.

    Right, my GPU reading 100% usage in HWMonitor somehow means my CPU is at fault. :p Also, I don't use SLI (I use both GPUs for computing, but only use one for gaming).

    As I said, the idea that you can make yourself CPU-bound at 4K simply by lowering settings is a myth. I don't know where this myth came from, but it needs to go away.
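    The GPU-usage check described above can be sketched in a few lines (a hedged illustration only; `classify_bottleneck`, the 95% threshold, and the sample numbers are all made up for the example, not taken from HWMonitor or any tool mentioned in this thread):

```python
# Classify a run as GPU-bound or CPU-bound from per-sample GPU
# utilization readings (e.g. exported from a monitoring tool).
def classify_bottleneck(gpu_util_samples, gpu_bound_threshold=95.0):
    """If the GPU sits near 100% most of the time, the GPU is the
    limiter; sustained headroom suggests a CPU (or engine) cap."""
    if not gpu_util_samples:
        raise ValueError("need at least one sample")
    avg = sum(gpu_util_samples) / len(gpu_util_samples)
    return "GPU-bound" if avg >= gpu_bound_threshold else "CPU-bound"

# A 4K low-settings run pegging the GPU, as described above:
print(classify_bottleneck([99, 100, 98, 100, 99]))   # GPU-bound
# A high-refresh run with plenty of GPU headroom:
print(classify_bottleneck([62, 58, 65, 60, 59]))     # CPU-bound
```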
     
  6. Jayp

    Jayp Member Guru

    Messages:
    120
    Likes Received:
    49
    GPU:
    2080 Ti
    Skylake-X is not Intel's best gaming offering. Coffee Lake, and especially the 9900K, is the best Intel has to offer for gaming. Skylake-X can game well considering its core counts, but no model from Intel is better for gaming than the 9900K. With the 9900K already costing around $500, it is hard for Intel to pitch $1200+ CPUs as gaming CPUs when they game worse than their 115x parts. As core counts go up, clocks decrease, and Skylake-X performance in things like games decreases with them. The whole platform isn't really marketed for gaming: it is extremely expensive and doesn't offer anything better to gamers. The same goes for Threadripper; it has workarounds for gaming but isn't a gaming CPU. AMD gets away with offering the 3950X as a gaming CPU because it is a mainstream AM4 part priced at $750 and will very likely be good at gaming, maybe not the best, but the best compromise. AMD said at Computex that the 3800X is their best CPU for gamers.

    One thing we don't know yet is how Zen 2 will boost. It is possible that a 3950X running only gaming loads could sit much closer to max boost than when fully loaded on 32 threads. We also don't know how the separate chiplets will behave in gaming. It is possible that for gaming the 3950X operates on one chiplet. We really just have to wait for independent reviews to put this all together. It will all be very interesting. I'm also curious how expensive the memory will need to be for Zen 2 to perform ideally.
     
  7. Denial

    Denial Ancient Guru

    Messages:
    12,117
    Likes Received:
    1,261
    GPU:
    EVGA 1080Ti
    Due to the new thread grouping I'd expect most games to stay on one chiplet, but I don't see why it wouldn't cross to the second if the game or OS requires it. The latency, while slightly higher than Zen+'s, is still lower than Zen's, and should be far more consistent between cores.
     
  8. Jayp

    Jayp Member Guru

    Messages:
    120
    Likes Received:
    49
    GPU:
    2080 Ti
    There are instances where certain CPUs can bottleneck at 4K, but that won't be in the games you said you play. If anything you'd see it in BFV Conquest or ACO or some other very CPU-demanding game. You're going to need a lot of GPU first off, and secondly, low settings usually mean fewer draw calls, which makes a bottleneck harder, but not impossible, to hit. Zen and Zen+ in my personal experience bottlenecked easily at 1440p using a 1080 Ti, and especially a 2080 Ti. I could see these CPUs challenged in isolated 4K cases, but overall I agree with you that CPU bottlenecks are much less likely to occur at 4K. This could all change in the future, of course, but currently, nah. Additionally, 4K gaming at low settings seems like a waste, at least at this time. Yea, yea, esports, but still, nah. Esports still lean 240 Hz, so there's that.
     
  9. Jayp

    Jayp Member Guru

    Messages:
    120
    Likes Received:
    49
    GPU:
    2080 Ti
    Yea, I would expect them to cross if needed, but I was thinking more of a "game mode" that isolates the chiplets. It is really hard to say. If we look back at AMD's E3 slides for gaming performance versus Intel, the 3900X did a better job gaming than their own 3800X (I just went back to check). It would seem that the way the I/O die handles the two chiplets could be a non-issue for gaming. I am really curious how the CPU is handling these loads. The 3900X is, interestingly, the better gaming CPU, at stock anyway. Based on the limited information we have, it seems the two-chiplet design doesn't hurt gaming.

    What really interests me, though, is how well the 8-core parts will overclock compared to the 12- and 16-core ones, and whether the 3800X could end up being a better overclocker and thus AMD's gaming champ. The 12- and 16-core parts may be better binned but are probably limited overall by power and thermals when overclocking. I also can't wait to see what kind of overclocking options will exist. Seems like the 3900X is the best overall deal. The 3950X will be cool, but unless you have a use for all those cores, the extra $250 may not be worth it. Given that the 3900X showed better results than the 3800X in gaming and has 50% more cores, it seems like the best buy. Anyway, just a bunch of thoughts.
     
  10. las

    las Master Guru

    Messages:
    205
    Likes Received:
    24
    GPU:
    1080 Ti @ 2+ GHz
    No it's not. I have tried both Asus's and Acer's 4K/UHD 144 Hz monitors, and I'm not talking about demanding AAA games, obviously. I'm talking about esports games. These are less demanding and more CPU-intensive.

    Personally, I couldn't care less about 4K for PC atm. 1440p is the perfect blend of IQ and performance for me. I'll take a 100 fps minimum over image quality any day.
    GPUs are simply too weak for 4K/UHD at 100+ fps with good IQ. Maybe Ampere at 7nm will fix this. Not even the 2080 Ti will do it properly today.
     
    Last edited: Jun 17, 2019 at 9:44 AM

  11. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    4,007
    Likes Received:
    1,024
    GPU:
    HIS R9 290
    There is some truth to what both of you are saying. It is possible to game at 4K with modern hardware without the GPU being the bottleneck, though that is pretty uncommon.
    I generally agree that if high frame rates are what you seek, 4K should be a low priority (as of today), and that 1440p is a good middle ground of decent resolution and decent performance.

    Just curious, but do you have AA on? Because I don't see how you, with a pair of 1080 Tis, could struggle to get 60 FPS in those games. To my understanding, AA becomes dramatically more GPU-intensive as resolution goes up, so I could see how it could be very punishing at 4K. For that very reason, whenever I switch to 4K, I have no intention of using AA at all (maybe SMAA). To me personally, the performance loss and extra heat generated just aren't worth the visual difference, especially at such a high resolution.
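    The raw arithmetic behind AA cost at 4K is easy to check (a simplified model of my own, not a benchmark: MSAA stores N samples per pixel, so sample count, and roughly bandwidth, scales with pixels times samples):

```python
# Pixel/sample counts for the resolutions discussed in this thread.
def samples(width, height, msaa=1):
    """Total coverage samples: pixels * MSAA samples per pixel."""
    return width * height * msaa

p1080  = samples(1920, 1080)           # 2,073,600 pixels
p4k    = samples(3840, 2160)           # 8,294,400 pixels
p4k_4x = samples(3840, 2160, msaa=4)   # 33,177,600 samples

print(p4k // p1080)     # 4  -> 4K is 4x the pixels of 1080p
print(p4k_4x // p1080)  # 16 -> 4x MSAA at 4K is ~16x 1080p's samples
```

    Post-process AA like SMAA sidesteps most of that multiplier, which is why it stays cheap at 4K.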

    It was implied that this was referring to Skylake X when it was released, since the context is about how the CPUs were marketed when they were first being sold.
     
