AMD Ryzen 9 3950X: 16 cores and 32 threads - A Gaming CPU to be released in September

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 11, 2019.

  1. Arbie

    Arbie Guest

    Messages:
    169
    Likes Received:
    58
    GPU:
    GTX 1060 6GB
    Youtuber "Moore's Law Is Dead" makes this point in his latest video: with 16 cores we can simply disable SMT and still have a top chip for almost everything including gaming. I'm not saying this is the best value for a gamer, just that it will be available at a fair price to those who want it. And at this point I'm planning on it.
     
  2. airbud7

    airbud7 Guest

    Messages:
    7,833
    Likes Received:
    4,797
    GPU:
    pny gtx 1060 xlr8
    hahaha....This^.....I'll post screenshots of that sh*t just for bragging rights I tell ya!

    :D
     
    Keitosha likes this.
  3. Hog54

    Hog54 Maha Guru

    Messages:
    1,247
    Likes Received:
    68
    GPU:
    Asus Tuf RTX 3070
    I never had a problem playing anything with 6 cores. I don't know why the hell you would need 12 or 16?
     
  4. Exodite

    Exodite Guest

    Messages:
    2,087
    Likes Received:
    276
    GPU:
    Sapphire Vega 56
    So back in 2011 I got this i7 2600K that I'm still using. The choice was between that and the i5 2500K, which was notably cheaper and offered the same, and occasionally even better, performance in most tasks. Games at the time were a mixed bag with SMT; some showed negative scaling.

    In the end I decided to go for the i7, partially because SMT was useful in some of my use cases (compiling) but mostly because I wanted the best.

    That turned out prescient, as while I'm only occasionally CPU-bound with the 2600K now, the 2500K would already have been insufficient.

    The same applies here.

    It might be hard to imagine why you'd need a 12- or 16-core CPU today, but I intend to stay on my upgraded platform for the best part of a decade, meaning all the way through the next console generation and beyond.

    By the time I upgrade next I wouldn't be surprised if 8 cores are struggling.
     
    Aura89 and MonstroMart like this.

  5. sverek

    sverek Guest

    Messages:
    6,069
    Likes Received:
    2,975
    GPU:
    NOVIDIA -0.5GB
    You make your point, but wouldn't it be more efficient to buy a CPU for $400 now and a modern CPU for $400 five years later?
    The issue is, there's no way to properly utilize that many cores for gaming now, and five years from now there may be faster CPUs for a lower price.

    It doesn't make sense to invest in a 3950X for gaming right now, unless you're streaming 4K content or something.
     
    las likes this.
  6. Exodite

    Exodite Guest

    Messages:
    2,087
    Likes Received:
    276
    GPU:
    Sapphire Vega 56
    Some good points, I'll try and address them.

    First things first: I don't only game. If I did, I might have done fine with 8 cores as an upgrade path. As things stand 12 is probably the sweet spot, but much like how I came around to buying the 2600K over the 2500K, I'm willing to spend slightly more, from a full-system perspective, to get the best at the time of purchase.

    As for the upgrade path, that's only slightly true. If you upgrade on a two-year cadence then sure, that might be wise. On a longer timescale it doesn't make as much sense. Let's say AMD's next processor line is also available on AM4; that leaves me open to upgrades through 2020 as well, maybe with a 10-15% performance uplift, some new security hardening and one or two neat features on top. Though 2020 would be rather early to upgrade unless I commit to something subpar now.

    After that it's doubtful how easy an upgrade path I'll have. Whether it's AMD still on top or Intel figuring out their issues, I'm likely going to need a new motherboard and new memory to go with my new processor, all to put myself just slightly beyond my needs again. PCs are modular, but once you start replacing the motherboard that's rather less true than when you're talking about other parts.

    The graphics card point I agree with; it's just that graphics is on a different cadence than processors. Or it has been historically, with performance being pushed further and further and each new generation being well beyond the reach of the last, until 2016 or so. Since the multi-core revolution this has been less true of processors. We've gotten more cores, but per-core performance hasn't increased in the same way and looks to be stagnating as well.

    I got the 2600K and 8 gigs of memory back in 2011 and those served me well until now, or at least last fall when I got an additional 16 gigs. I've replaced my graphics card twice though, and never with the highest-end model.

    "Future-proofing" doesn't make sense in a vacuum but we live in times of unclear upgrade paths and diminishing returns on new technology. If there's a technological breakthrough within the next decade, regarding processors/motherboards/memory, it won't be available as an upgrade path for me but as a replacement for what I'm already using. And while I'm waiting on that be I'd be stuck using a system that's less than what it could have been.

    There's the rub: I can't buy a cheaper processor today and upgrade it 3/4/5 years down the line without replacing my motherboard and memory as well. That kind of upgrade path is an illusion and has been for a while. If we were talking about a platform releasing on AM5 with DDR5, perhaps PCIe gen 5, then such an argument would have more merit, as AMD has generally been good about sticking with their platforms for a couple of generations. As things stand, looking at spending the same amount of money, I'd essentially be waiting to upgrade to the same hardware I could have gotten from the start, in the hope it'd drop enough in price to make it worth it. A sketchy proposition to begin with, and one that leaves me using lesser hardware and forced to do more work for the possibility of a slightly lower cost.

    Maybe it's that my personal experience differs, more likely it's just middle age speaking, but I can't be arsed to deal with that.

    A good analogy would be my processor cooling situation; perhaps I could have gotten slightly better temps from an AIO (doubtful), but over 8 years the service needs of my Noctua C14 have been met by the application of a compressed air can - twice!

    While I appreciate the sentiment I just don't think the upgrade argument makes much sense, not at this point in time and not for me personally. YMMV.
     
  7. sverek

    sverek Guest

    Messages:
    6,069
    Likes Received:
    2,975
    GPU:
    NOVIDIA -0.5GB
    The 2600K lasted you that long because Intel stalled progress for 6 years, until the Ryzen release.
    For 5+ years it didn't make sense to upgrade over the 2600K.

    There will be much better choices 5 years from now, and I hope Intel finally brings something new to the table. But we don't know that yet.
     
    carnivore likes this.
  8. Exodite

    Exodite Guest

    Messages:
    2,087
    Likes Received:
    276
    GPU:
    Sapphire Vega 56
    While the points about Intel sandbagging are largely true, I'm not that optimistic about processor advances going forward. It's not an accident that generational leaps in performance are few and far between. Every node shrink is going to make it more difficult to hit the clocks the last one did. Cores might continue to advance in number, but diminishing returns are significant aside from a small number of use cases.

    That's why I said that if there's a big leap in performance during the next decade it's likely to represent a break from everything we're using now, hence the upgrade option isn't really on the table.
    It's even worse for me, as my local currency has dropped significantly vs. the US dollar since 2011.

    But it's also beside the point, as I don't decide the prices.

    Today I couldn't even buy into the performance tier that I've been using the last 8 years for the same money I paid in 2011.

    I'd love to choose between the 3900X at $200 and the 3950X at $300, but that's not on the table.

    Ultimately the choice is the same as the one I had back in 2011 though.
    Look, I could do a point-for-point rebuttal of each of these points (my 1992 A1200 still works fine / I've had motherboards break down in 6 months. | You can't get good prices for used hardware, period, and I have better things to do than haggling with randoms over nickels. | My 2600K is just starting to show its age after 8 years.) but we're well into red herring territory now.

    You've both alluded to the point I'm making without addressing it directly:

    The 2600K has served me as well as it did because it, at the time, marked a true generational leap in performance that only very rarely shows up.

    As I mentioned in a previous post, I've been lucky to ride that wave for a long time (A64 1800+ -> A64 X2 3800+ -> C2D E6600 -> i7 2600K), and I've decided to put down the money for the 3950X because I believe it represents the next true generational leap within my upgrade window. I find it unlikely that Zen 3, or whatever Intel puts out over the next couple of years, would represent a large enough leap over the 3950X to make waiting worth it.

    I could be wrong about that but, for previously stated reasons, I don't think I am and there's been no point raised in contention that I haven't already considered.

    Obviously I'm not presuming to speak for everyone - I'm not saying we should all get the 3950X - but this is exactly what I've been waiting for as my next upgrade.
     
  9. Exodite

    Exodite Guest

    Messages:
    2,087
    Likes Received:
    276
    GPU:
    Sapphire Vega 56
    Yeah, I don't miss the active cooling. :(

    It'll be interesting to see what the reviews say about that, but I'm eyeing the top-end Gigabyte board just in case, as that one is passive. Funnily enough, the cost of the X570 boards makes it easier for me to accept the cost of the new processors.
     
  10. BReal85

    BReal85 Master Guru

    Messages:
    487
    Likes Received:
    180
    GPU:
    Sapph RX 570 4G ITX
    1. It's very funny to read your comment, as a 9900K is around 15-20% faster on average than a 2700X in FHD when combined with a 2080 Ti. And you claim that there is a 20-50% difference at 1440p (where the 9900K's advantage drops to 7-8% against a 2700X with a 2080 Ti) with a 1080 Ti (2080). Sorry, but I do not believe you.
    2. Have you checked the graphs saying that in FHD Zen 2 performs the same as Intel CPUs in games?

    Well, if you check which GPUs lead the sales charts, it's the mid-range and upper-mid-range ones. So yes, the few percent who buy a 1080 Ti/2080/2080 Ti were disappointed by the RX 5700. Those who want a card roughly 10% faster than a 2060/2070 for a lower price (in the case of the RTX 2070) can be happy. Yes, the RX 5700 is $30 more expensive than the RTX 2060, but that's 8-9% more money for a 10% performance boost, and don't forget that the RX 5700 has 2GB more VRAM (and as you saw in Doom and other games, you can easily run out of 4GB at FHD ultra, not to speak of 1440p or 4K), so all in all the RX 5700 is a better value card than the RTX 2060.

    And that $100 price reduction is still not confirmed, plus AMD can also lower their prices afterwards. And why do you think this rumour appeared just before the release of Navi? :D NV's GPU sales dropped 50% thanks to the pathetic RTX pricing - they could have made that move earlier.
     
    Last edited: Jun 12, 2019

  11. illrigger

    illrigger Master Guru

    Messages:
    340
    Likes Received:
    120
    GPU:
    Gigabyte RTX 3080
    Literally everything you just said is based on the assumption that developers WON'T code their games to take advantage of wider chips, despite their current state and future continued prevalence in development. This is like saying that electric cars will never take off because everyone drives gas ones now, despite seeing every manufacturer producing electrics already.

    You can go ahead and bet on steady state. As a person who has been doing this for 30 years, I know when to bet on the future at this point.
     
  12. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    This myth again? No, you cannot just lower graphics settings and run into a CPU cap at 4K in modern games - I need to drop the settings to the minimum to even reach 60 FPS in Fallout 4 on my 4K monitor (at 1440p, I can crank the details up to max). Do you have any idea how taxing 4K graphics is?
     
  13. sverek

    sverek Guest

    Messages:
    6,069
    Likes Received:
    2,975
    GPU:
    NOVIDIA -0.5GB
    Really? You sure it's not mods or SLI that messes things up?
     
  14. Koniakki

    Koniakki Guest

    Messages:
    2,843
    Likes Received:
    452
    GPU:
    ZOTAC GTX 1080Ti FE
    Undying likes this.
  15. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    I regularly game at 4K; I don't know how much 4K gaming you've done, but I'm speaking from experience here. Whether it's Fallout 4, The Witcher 3, Doom, Metro Exodus or most other AAA games, I can barely maintain 60 FPS at 4K at the lowest settings, with the GPU fully maxed out. I don't game on low settings because I like it - it's what I have to do for decent frame rates.

    Right, my GPU reading 100% usage in HWMonitor somehow means my CPU is at fault. :p Also, I don't use SLI (I use both GPUs for computing, but only use one for gaming).

    As I said, the idea that you can hit a CPU cap at 4K simply by lowering settings is a myth - I don't know where it came from, but it needs to go away.
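    (A quick way to sanity-check which side is limiting you is to log GPU utilisation while you play: if the GPU sits pegged near 100% while the frame rate drops, you're GPU-bound; if it idles, the CPU is the more likely culprit. A rough sketch of such a logger - NVIDIA-only, and it assumes nvidia-smi is on your PATH:)

    import subprocess
    import time

    def gpu_utilisation() -> int:
        # Ask nvidia-smi for GPU utilisation as a bare number (percent)
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        # One line per GPU; take the first (the one used for gaming here)
        return int(out.splitlines()[0].strip())

    if __name__ == "__main__":
        # Log once a second for ten seconds while the game is running
        for _ in range(10):
            print(f"GPU utilisation: {gpu_utilisation()}%")
            time.sleep(1)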
     
    Fox2232 likes this.

  16. Jayp

    Jayp Member Guru

    Messages:
    151
    Likes Received:
    61
    GPU:
    2080 Ti
    Skylake-X is not Intel's best gaming offering. Coffee Lake, and especially the 9900K, is the best Intel has to offer for gaming. Skylake-X can game well considering its core counts, but no Intel model is better for gaming than the 9900K. With the 9900K already costing around $500, it is hard for Intel to call $1200+ CPUs gaming CPUs when they game worse than their 115x parts, especially since as core counts go up, clocks decrease and Skylake-X performance in things like games decreases with them. The whole platform isn't really marketed for gaming, as it is extremely expensive and doesn't offer anything better to gamers; the same goes for Threadripper, which has workarounds for gaming but isn't a gaming CPU. AMD gets away with offering the 3950X as a gaming CPU because it is a mainstream AM4 part priced at $750 and will very likely be good at gaming - maybe not the best, but the best compromise. AMD said at Computex that the 3800X is the best CPU for gamers.

    One thing we don't know yet is how Zen 2 will boost. It is possible that a 3950X running only gaming loads could sit much closer to max boost than when it is fully loaded across 32 threads. We also don't know how the separate chiplets will behave in gaming. It is possible that for gaming the 3950X operates on one chiplet. We really just have to wait for independent reviews to put this all together. It will all be very interesting. I'm also curious how expensive the memory will need to be for Zen 2 to perform at its best.
     
  17. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Due to the new thread grouping I'd expect most games to stay on one chiplet, but I don't see why they wouldn't cross to the second if the game or OS requires it. The latency, while slightly higher than Zen+, is still lower than Zen and should be far more consistent between cores.
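    (You can already approximate "keep the game on one chiplet" by hand with CPU affinity if the scheduler doesn't cooperate. A rough sketch using the third-party psutil package - the PID and the 0-7 range are placeholders, since the actual logical-CPU-to-chiplet mapping depends on the SKU and how the OS enumerates cores:)

    import psutil  # third-party: pip install psutil

    def pin_to_first_chiplet(pid: int, logical_cpus=tuple(range(8))) -> None:
        # Restrict the process (and all its threads) to the given logical CPUs,
        # e.g. the eight cores of one chiplet. The numbering is a placeholder:
        # check your real topology first (lscpu on Linux, Task Manager on Windows).
        proc = psutil.Process(pid)
        proc.cpu_affinity(list(logical_cpus))
        print(f"{proc.name()} limited to CPUs {list(logical_cpus)}")

    # Hypothetical usage, with a made-up PID for the game process:
    # pin_to_first_chiplet(12345)

    (Whether that actually helps would depend on how the Windows scheduler and the new thread grouping behave in practice; reviews will tell.)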
     
  18. Jayp

    Jayp Member Guru

    Messages:
    151
    Likes Received:
    61
    GPU:
    2080 Ti
    There are instances where certain CPUs can bottleneck at 4K, but that won't be in the games you said you play. If anything you'd see it in BFV Conquest or ACO or some other very CPU-demanding game. You're going to need a lot of GPU first off, and secondly low settings usually mean fewer draw calls, which makes a bottleneck harder to hit but not impossible. Zen and Zen+ in my personal experience bottlenecked easily at 1440p using a 1080 Ti and especially a 2080 Ti. I could see these CPUs challenged in isolated 4K cases, but overall I agree with you that CPU bottlenecks are much less likely to occur at 4K. This could all change in the future of course, but currently, nah. Additionally, 4K gaming at low settings seems like a waste, at least at this time. Yea yea, esports, but still, nah. Esports still lean toward 240Hz, so there's that.
     
  19. Jayp

    Jayp Member Guru

    Messages:
    151
    Likes Received:
    61
    GPU:
    2080 Ti
    Yea, I would expect them to cross if needed, but I was thinking more of a "game mode" that isolated the chiplets. It is really hard to say. If we reflect on AMD's E3 slides for gaming performance versus Intel, we can see that the 3900X did a better job in gaming than their own 3800X (I just went back to check). It would seem that the way the I/O die handles the two chiplets could be a non-issue for gaming. I am really curious how the CPU handles these loads. The 3900X is, interestingly, the better gaming CPU - at stock, anyway. Based on the limited information we have, it seems the two-chiplet design doesn't hurt gaming.

    What really interests me, though, is how well the 8-core parts will overclock compared to the 12- and 16-core ones - could the 3800X end up being a better overclocker and thus AMD's gaming champ? The 12- and 16-core parts may be better binned but are probably limited overall by power and thermals when overclocking. I also can't wait to see what kind of overclocking options will exist. Seems like the 3900X is the best overall deal. The 3950X will be cool, but unless you have a use for all those cores its extra $250 may not be worth it. Given that the 3900X showed better results than the 3800X in gaming and has 50% more cores, it seems like the best buy. Anyway, just a bunch of thoughts.
     
  20. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
    There is some truth to what both of you are saying. It is possible to game at 4K with modern hardware without the GPU being the bottleneck, though that is pretty uncommon.
    I generally agree that if high frame rates are what you seek, 4K should be a low priority (as of today) and that 1440p is a good middle ground of decent resolution and decent performance.

    Just curious, but do you have AA on? Because I don't see how you, with a pair of 1080 Tis, could struggle to get 60 FPS in those games. To my understanding, AA becomes dramatically more GPU-intensive as resolution goes up, so I could see how it could be very punishing at 4K. For that very reason, whenever I switch to 4K, I have no intention of using AA at all (maybe SMAA). To me personally, the performance loss and extra heat generated just isn't worth the visual difference, especially at such a high resolution.

    It was implied that this was referring to Skylake X when it was released, since the context is about how the CPUs were marketed when they were first being sold.
     
