
AMD Ryzen 9 3950X: 16 cores and 32 threads - A Gaming CPU to be released in September

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 11, 2019.

  1. D3M1G0D

    D3M1G0D Ancient Guru

    Messages:
    1,676
    Likes Received:
    986
    GPU:
    2 x GeForce 1080 Ti
    Well, Intel also marketed the X-series Core i9 CPUs to gamers. I think any CPU which offers fast single-core performance could be considered a gaming CPU, although that doesn't mean that they can't be used for anything else. Note that many gamers are also now content creators, either streaming or uploading videos, so a CPU with a lot of cores might fit the bill.
     
  2. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    4,007
    Likes Received:
    1,024
    GPU:
    HIS R9 290
    Good points, though I was about to write about Skylake-X to Japy. Looking back at some of the previews and announcements for that series, they didn't mention gaming all that often; I couldn't find anything from Intel that headlined it purely as a gaming CPU, which is why I'm rolling my eyes at AMD here. This is despite the fact that Skylake-X (last time I checked) offers the best gaming performance you can currently buy.

    For the record, this isn't really that big of a deal to me; people just keep bringing it up, so it seems like I care more than I really do.
     
  3. Borys

    Borys Active Member

    Messages:
    63
    Likes Received:
    8
    GPU:
    Intel X3100 256MB
    So... the Ryzen 3900X is already better than the 9900++++++++++! Period! Even without a premium mobo, cooler, or DDR4-4000-and-above memory...
    The 3950X is the BEST CPU on the market, at half the price of the closest thing Intel has to offer. Any more questions?
    Congratulations AMD! The new KING CPU in ALL WAYS!
     
    Last edited: Jun 11, 2019
  4. las

    las Master Guru

    Messages:
    205
    Likes Received:
    24
    GPU:
    1080 Ti @ 2+ GHz
    There are tons of games that still barely use 4 cores... especially esports titles, where people want extremely high fps and are 100% CPU bound. Ask any 120-240 Hz gamer.

    Coding software, especially games, for massively multithreaded work is not an easy task. Very few do it right, and 6C/12T is generally more than enough even when it is done right with a high-end GPU.
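
    (To illustrate why, here's a minimal, hypothetical C++ sketch - not from any real engine - of a frame where only part of the work parallelizes. The serial part, think render submission or gameplay logic with cross-entity dependencies, caps what extra cores can buy you:)

        // Hypothetical frame: a parallel portion split across N threads,
        // plus a serial portion that cannot be split no matter the core count.
        #include <chrono>
        #include <iostream>
        #include <thread>
        #include <vector>

        // Stand-in for independent per-entity work (animation, particles, ...).
        void update_entities(int begin, int end) {
            volatile double sink = 0.0;
            for (int i = begin; i < end; ++i)
                for (int j = 0; j < 2000; ++j) sink += i * 0.5 + j;
        }

        int main() {
            const int entities = 10000;
            for (int threads : {1, 2, 4, 8, 16}) {
                auto t0 = std::chrono::steady_clock::now();

                // Parallel portion: entities split evenly across worker threads.
                std::vector<std::thread> pool;
                const int chunk = entities / threads;
                for (int t = 0; t < threads; ++t)
                    pool.emplace_back(update_entities, t * chunk, (t + 1) * chunk);
                for (auto& th : pool) th.join();

                // Serial portion: order-dependent work, single-threaded by nature.
                update_entities(0, entities / 4);

                auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                              std::chrono::steady_clock::now() - t0).count();
                std::cout << threads << " threads: " << ms << " ms\n";
            }
        }

    (Past 4-8 threads the timings barely move, because the serial quarter starts to dominate - that's the "very few do it right" problem in a nutshell.)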

    You're still fine with 4C/8T in pretty much all AAA games released today, using a mid-range GPU (as in a 1070 Ti, 1080, RTX 2060, 2070, or those two new Navi GPUs - that's mid-range to me).

    If you crank the graphical settings in AAA games up to "Ultra" you're going to be GPU bound, not CPU bound, anyway. You simply need a CPU that can feed the GPU, which is pretty easy. GPU-bound gaming is not CPU intensive at all.

    In 5 years 8C/16T will still easily max everything. Next-gen consoles will get low-clocked Zen 2 cores, with only 6-7 of them reserved for the actual games, and they will probably not even get SMT.
    You won't see any benefit from going above 8C for gaming. Going 12-16C is a complete waste of money for gaming; you should invest the extra money in a better GPU instead - 2080 -> 2080 Ti, for example.
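
    (The math behind that claim is basically Amdahl's law. Purely as an illustration - the 60% parallel fraction below is an assumed number, not a measurement of any real engine:

        \[ S(n) = \frac{1}{(1 - p) + p/n}, \qquad
           S_{p=0.6}(8) = \frac{1}{0.4 + 0.075} \approx 2.11, \qquad
           S_{p=0.6}(16) = \frac{1}{0.4 + 0.0375} \approx 2.29 \]

    Doubling from 8 to 16 cores buys about 8% under those assumptions, which is the same reason for putting the money into a better GPU instead.)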

    The 3950X is not a gaming chip. It will play games, sure, but even the 8700K will beat it in gaming when CPU bound. They will be pretty even in GPU-bound gaming, with most of the cores doing nothing. I have already tried gaming on an i9-9960X and that was exactly what happened: tons of cores and threads just idling, or the workload jumping across cores. If you do content creation or any CPU-demanding work outside of gaming, it's a fine choice. But I seriously hope nobody picks up the 3900X or 3950X purely for gaming.

    Going wider is "easy" - yeah, but wider is simply not automatically better for gaming. This is why next-gen consoles will stay at 8 cores like the current gen, while clock speed and IPC improve a lot.
     
    Last edited: Jun 12, 2019
    alanm and schmidtbag like this.

  5. Arbie

    Arbie Member Guru

    Messages:
    158
    Likes Received:
    52
    GPU:
    GTX 1060 6GB
    Youtuber "Moore's Law Is Dead" makes this point in his latest video: with 16 cores we can simply disable SMT and still have a top chip for almost everything including gaming. I'm not saying this is the best value for a gamer, just that it will be available at a fair price to those who want it. And at this point I'm planning on it.
     
  6. airbud7

    airbud7 Ancient Guru

    Messages:
    6,876
    Likes Received:
    3,324
    GPU:
    pny gtx 1060 xlr8
    hahaha....This^.....I'll post screenshots of that sh*t just for bragging rights I tell ya!

    :D
     
    Keitosha likes this.
  7. Hog54

    Hog54 Maha Guru

    Messages:
    1,042
    Likes Received:
    2
    GPU:
    Asus GTX 1050
    I never had a problem playing anything with 6 cores. I don't know why the hell you would need 12 or 16.
     
  8. Exodite

    Exodite Ancient Guru

    Messages:
    1,843
    Likes Received:
    128
    GPU:
    Sapphire Vega 56
    So back in 2011 I got this i7 2600K that I'm still using. The choice was between that and the i5 2500K, which was notably cheaper and offered the same, and occasionally even better, performance in most tasks. Games at the time were a mixed bag with SMT; some showed negative scaling.

    In the end I decided to go for the i7, partially because SMT was useful in some of my use cases (compiling) but mostly because I wanted the best.

    That turned out to be prescient: while I'm only occasionally CPU-bound with the 2600K now, the 2500K would already have been insufficient.

    The same applies here.

    It might be hard to imagine why you'd need 12 or 16 cores today, but I intend to stay on my upgraded platform for the best part of a decade, meaning all the way through the next console generation and beyond.

    By the time I upgrade next I wouldn't be surprised if 8 cores are struggling.
     
    Aura89 and MonstroMart like this.
  9. sverek

    sverek Ancient Guru

    Messages:
    4,988
    Likes Received:
    1,707
    GPU:
    NOVIDIA -0.5GB
    You make your point, but wouldn't it be more efficient to buy a $400 CPU now and a modern $400 CPU 5 years later?
    The issue is that there's no way to properly utilize that many cores for gaming now, and 5 years from now there may be faster CPUs at a lower price.

    It doesn't make sense to invest in a 3950X for gaming right now, unless you're streaming 4K content or something.
     
    las likes this.
  10. las

    las Master Guru

    Messages:
    205
    Likes Received:
    24
    GPU:
    1080 Ti @ 2+ GHz
    You know, buying very expensive parts just to ensure they will last a decade is not the right way to do it.

    You can spend half that money on a nice rig that will last 4-5 years, then upgrade and see a huge performance increase. Even the best hardware will be severely outdated after 10 years, and you will have paid a very high premium for it.


    Going 12 or even 16 cores will do absolutely nothing for gaming going forward, since next-gen consoles will get 8 cores with 1-2 of them reserved for the system. You are not going to need more than 8C/16T; even 6C/12T will probably do just fine.

    Don't try to future-proof like that. I've seen it go wrong too many times.
    By the time more than 8 cores are actually needed for some games, Zen 2 will be very outdated and deliver subpar performance anyway. Remember the FX's 8 cores!? ;)
    You buy 12-16 cores if you actually need them outside of gaming - which is why this isn't a gaming chip in the first place.

    It's like buying a Titan GPU just for the VRAM, hoping it will last 10 years. It won't. In 2-3 generations you'll see mid-range GPUs beating it at 1/4 of the price; after 4-5 years it will be mediocre.

    Upgrading more often with "cheaper parts" is the answer for pretty much everyone. Why treat your PC like a console, on a 10-year hardware cycle? The awesome thing about a PC is that you can upgrade it whenever you want.
     
    Last edited: Jun 12, 2019

  11. Exodite

    Exodite Ancient Guru

    Messages:
    1,843
    Likes Received:
    128
    GPU:
    Sapphire Vega 56
    Some good points, I'll try and address them.

    First things first: I don't only game. If I did, I might have done fine with 8 cores as an upgrade path. As things stand, 12 is probably the sweet spot, but much like when I came around to buying the 2600K over the 2500K, I'm willing to spend slightly more, from a full-system perspective, to get the best at the time of purchase.

    As for the upgrade path, that's only partially true. If you upgrade on a two-year cadence then sure, that might be wise; over a longer horizon it doesn't make as much sense. Let's say AMD's next processor line is also available on AM4 - that leaves me open to upgrades through 2020 as well, maybe with a 10-15% performance uplift, some new security hardening, and one or two neat features on top. Though 2020 would be rather early to upgrade unless I commit to something subpar now.

    After that, it's doubtful how easy an upgrade path I'll have. Whether it's AMD still on top or Intel figuring out its issues, I'm likely going to need a new motherboard and new memory to go with my new processor, all to put myself just slightly beyond my needs again. PCs are modular, but once you start replacing the motherboard that's rather less true than it is for other parts.

    The graphics card point I agree with; it's just that graphics cards are on a different cadence than processors. Or they have been historically, with performance pushed further and further and each new generation well beyond the reach of the last, until 2016 or so. Since the multi-core revolution this has been less true of processors: we've gotten more cores, but per-core performance hasn't increased in the same way and looks to be stagnating as well.

    I got the 2600K and 8 gigs of memory back in 2011 and they served me well until now - or at least until last fall, when I added another 16 gigs. I've replaced my graphics card twice though, and never with the highest-end model.

    "Future-proofing" doesn't make sense in a vacuum but we live in times of unclear upgrade paths and diminishing returns on new technology. If there's a technological breakthrough within the next decade, regarding processors/motherboards/memory, it won't be available as an upgrade path for me but as a replacement for what I'm already using. And while I'm waiting on that be I'd be stuck using a system that's less than what it could have been.

    There's the rub: I can't buy a cheaper processor today and upgrade it 3/4/5 years down the line without replacing my motherboard and memory as well. That kind of upgrade path is an illusion and has been for a while. If we were talking about a platform releasing on AM5 with DDR5, perhaps PCIe gen 5, then such an argument would have more merit, as AMD has generally been good about sticking to its platforms for a couple of generations. As things stand, spending the same amount of money, I'd essentially be waiting to upgrade to the same hardware I could have gotten from the start, in the hope it'd drop enough in price to make it worth it. A sketchy proposition to begin with, and one that leaves me using lesser hardware and doing more work for the possibility of a slightly lower cost.

    Maybe it's that my personal experience differs, or more likely it's just middle age speaking, but I can't be arsed to deal with that.

    A good analogy would be my processor cooling situation: perhaps I could have gotten slightly better temps from an AIO (doubtful), but over 8 years the service needs of my Noctua C14 have been met by a can of compressed air - twice!

    While I appreciate the sentiment I just don't think the upgrade argument makes much sense, not at this point in time and not for me personally. YMMV.
     
  12. sverek

    sverek Ancient Guru

    Messages:
    4,988
    Likes Received:
    1,707
    GPU:
    NOVIDIA -0.5GB
    The 2600K lasted you that long because Intel stalled progress for 6 years, until the Ryzen release.
    For 5+ years, it didn't make sense to upgrade from the 2600K.

    There will be much better choices 5 years from now, and I hope Intel finally brings something new to the table. But we don't know that yet.
     
    carnivore likes this.
  13. las

    las Master Guru

    Messages:
    205
    Likes Received:
    24
    GPU:
    1080 Ti @ 2+ GHz
    Well, I'm still not sure how you can compare the 2600K, which was around $300, to the 3950X at $750 - that's HEDT pricing, even accounting for inflation.
    The 2600K was only 100 bucks more than the 2500K, and yes, it was a wise decision (and the only right decision if you knew back then that you would keep it for 8+ years).

    If you do CPU-demanding stuff outside of gaming though, how could you have been satisfied with the 2600K until now?

    Also, hardware really doesn't last forever. I've seen tons of motherboards wear out past the 5-year mark. CPUs generally last, unless overvolted like crazy, but motherboards have tons of components that can fail over time. I would not bet on a motherboard lasting 10 years, that's for sure, unless it's enterprise-grade.

    Building every 4-5 years also means you can get a decent price for your used parts. After 8-10 years, all you're going to get is peanuts for your "premium parts"; nobody pays well for hardware that old.

    And for the last half of that time, your performance won't be great. No matter what you buy, it will be pretty much outdated after 5 years. This is especially true NOW, when the CPU war is back on - there was pretty much zero competition in the CPU market while you had your 2600K. Remember that.
     
    Last edited: Jun 12, 2019
  14. Exodite

    Exodite Ancient Guru

    Messages:
    1,843
    Likes Received:
    128
    GPU:
    Sapphire Vega 56
    While the points about Intel sandbagging are largely true, I'm not that optimistic about processor advances going forward. It's not an accident that generational leaps in performance are few and far between: every node shrink makes it more difficult to hit the clocks the last one did. Core counts might continue to climb, but outside a small number of use cases the returns diminish significantly.

    That's why I said that if there's a big leap in performance during the next decade, it's likely to represent a break from everything we're using now, hence the upgrade option isn't really on the table.
    It's even worse for me, as my local currency has dropped significantly against the US dollar since 2011.

    But that's also beside the point, as I don't decide the prices.

    Today I couldn't even buy into the performance tier that I've been using the last 8 years for the same money I paid in 2011.

    I'd love to choose between 3900X at $200 and the 3950X at $300 but that's not on the table.

    Ultimately the choice is the same as the one I had back in 2011 though.
    Look, I could do a point-for-point rebuttal of each of these points (my 1992 A1200 still works fine/I've had motherboards break down in 6 months. | You can't get good prices for used hardware, period. And I have better things to do than haggle with randoms over nickels. | My 2600K is only just starting to show its age after 8 years.) but we're well into red herring territory now.

    You've both alluded to the point I'm making without addressing it directly;

    The 2600K has served me as well as it has because, at the time, it marked the kind of true generational leap in performance that only very rarely shows up.

    As I mentioned in a previous post, I've been lucky to ride that wave for a long time (A64 1800+ -> A64 X2 3800+ -> C2D E6600 -> i7 2600K), and I've decided to put down the money for the 3950X because I believe it represents the next true generational leap within my upgrade window. I find it unlikely that Zen 3, or whatever Intel puts out over the next couple of years, would represent a large enough leap over the 3950X to make waiting worth it.

    I could be wrong about that but, for previously stated reasons, I don't think I am and there's been no point raised in contention that I haven't already considered.

    Obviously I'm not presuming to speak for everyone - I'm not saying we should all get the 3950X - but this is exactly what I've been waiting for as my next upgrade.
     
  15. las

    las Master Guru

    Messages:
    205
    Likes Received:
    24
    GPU:
    1080 Ti @ 2+ GHz
    Yeah, well, there was the i7-3930K; that was my build in 2012, along with 16 GB of RAM.
    It's still being used by a friend of mine and holds up very well overclocked to 4.4 GHz. It was also about 500 bucks IIRC, so mainstream was not the only option back then.

    Lately we are just seeing HEDT-class CPUs entering the mainstream segment (in price and performance). The 9900K/9900KS eats into HEDT as well, beating most of Intel's 10C HEDT CPUs in raw power and especially in games. 5 GHz on all cores is very easy: I've tried 3 retail ones and all did 5 GHz on all cores using a cheap 240 AIO or dual-tower air. 750 bucks is a new record for mainstream though. Let's see - release is November, I keep reading, and that's not even close. I hope it's not just a halo product.

    Intel was sandbagging because there was no competition. AMD would have done the same in that position - AMD has done exactly this numerous times before when it had a lead. Intel and Nvidia always caught up though (much bigger R&D budgets), and it will happen again. AMD has been lucky with the timing of Zen because of Intel's 10nm issues, but it won't be long before we see a new Intel arch on 7nm/7nm+, and that's when the real war begins. After all, AMD is competing against an older Intel arch while holding a node advantage on top of it; Zen is actually not that impressive when you keep this in mind. I'm not biased though - I own a Ryzen 1700 myself, in my server, underclocked and undervolted. I like competition.

    A new arch is where the true generational leaps happen, not simply more cores and threads, so I'm not sure why you think CPUs won't improve much over the next 10 years. It has been many years since Intel released something truly new, and they will do so in 2021. We might or might not see desktop-class chips on Intel's 10nm node in 2020; I don't think we will. I expect to see a 10C/20T on 14nm later this year, or maybe more if they drop the GPU part. I think only small mobile chips will use the 10nm node - it will be a very short-lived node, Intel's own words. 7nm is on track. And let's not forget that Intel's 7nm node is more advanced than TSMC's 7nm node, which is why it's not ready yet; it will be more like 7nm+ EUV.

    I'm not sure why you think you can't get a decent amount of money for 4-5 year old hardware. I always can. But after 8-10 years, no one wants it; it's very outdated by then. I just sold a 4-year-old 980 Ti for 150 bucks. Better than nothing.


    I would like to see AMD focus more on the GPU and mobile CPU markets, tbh. Intel pretty much owns mobile and enterprise - most companies won't even consider AMD servers (I sell hardware B2B for a living / build servers) - and this is where the real money lies. There's really not much money in the desktop consumer market. It's mostly gamers here, and Intel still offers better gaming performance, which is also why Intel hasn't seen much of a loss here. Prices are high, and gamers, especially competitive ones, do not want AMD. Intel + Nvidia simply delivers the best possible gaming experience across the board; AMD is very often hit or miss. This is also true in applications.

    So you went 1C -> 2C -> 4C and now you want 16C? :p

    What's scary about X570 is the active cooling part, and maybe pricing. It's been many years since I've had active cooling on my chipset o_O

    The 3950X is still many months away, but I'm looking forward to seeing real-world benches instead of Cinebench numbers.
     
    Last edited: Jun 12, 2019

  16. Exodite

    Exodite Ancient Guru

    Messages:
    1,843
    Likes Received:
    128
    GPU:
    Sapphire Vega 56
    Yeah, I don't miss the active cooling. :(

    It'll be interesting to see what the reviews say about that, but I'm eyeing the top-end Gigabyte board just in case, as that one is passive. Funnily enough, the cost of the X570 boards makes it easier for me to accept the cost of the new processors.
     
  17. BReal85

    BReal85 Master Guru

    Messages:
    263
    Likes Received:
    66
    GPU:
    ASUS 270X DC2 TOP
    1. It's very funny to read your comment, as a 9900K is around 15-20% faster on average than a 2700X in FHD when combined with a 2080 Ti. And yet you claim there is a 20-50% difference at 1440p (where the 9900K's advantage shrinks to 7-8% against a 2700X with a 2080 Ti) with a 1080 Ti (2080). Sorry, but I do not believe you.
    2. Have you checked the graphs showing that in FHD, Zen 2 performs the same as Intel CPUs in games?

    Well, if you check which GPUs lead the sales charts, it's the mid-range and upper-mid-range ones. So yes, the few percent who buy a 1080 Ti/2080/2080 Ti were disappointed by the RX 5700. Those who want a 2060/2070+10% card for less money (in the case of the RTX 2070) can be happy. Yes, the RX 5700 is $30 more expensive than the RTX 2060, but that's 8-9% more money for a 10% performance boost, and don't forget that the RX 5700 has 2 GB more VRAM (and as you saw in Doom and other games, you can easily run out of 4 GB at FHD ultra, not to speak of 1440p or 4K), so all in all the RX 5700 is a better-value card than the RTX 2060.
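
    (For the arithmetic: against the RTX 2060's $349 MSRP, $30 / $349 ≈ 8.6% - roughly the 8-9% more money quoted, for ~10% more performance.)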

    And that $100 price reduction is still not confirmed; plus, AMD can also lower its prices afterwards. And why do you think this rumour appeared just before the release of Navi? :D NV's GPU sales dropped 50% thanks to the pathetic RTX pricing - they could have made that move earlier.
     
    Last edited: Jun 12, 2019
  18. illrigger

    illrigger Member Guru

    Messages:
    120
    Likes Received:
    32
    GPU:
    GTX 1080 Ti
    Literally everything you just said is based on the assumption that developers WON'T code their games to take advantage of wider chips, despite those chips being where the market is now and where development is heading. This is like saying that electric cars will never take off because everyone drives gas ones now, despite every manufacturer already producing electrics.
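
    (As a concrete, minimal example of what "coding for wider chips" can look like - this sketch uses standard C++17 parallel algorithms rather than any particular engine's job system; on GCC/Clang it needs TBB linked in:)

        // Independent per-entity updates scale with whatever core count the
        // machine has: the same binary uses 4 cores on a quad, 16 on a 3950X.
        #include <algorithm>
        #include <execution>
        #include <vector>

        struct Entity { float x = 0.0f, vx = 1.0f; };

        int main() {
            std::vector<Entity> world(1'000'000);

            // The runtime spreads this loop across all available hardware threads.
            std::for_each(std::execution::par, world.begin(), world.end(),
                          [](Entity& e) { e.x += e.vx * (1.0f / 60.0f); });
        }

    (The hard part, as noted upthread, is the work that isn't independent like this - the bet here is that engines keep moving more of each frame into that shape.)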

    You can go ahead and bet on things staying as they are. As someone who has been doing this for 30 years, I know when to bet on the future.
     
  19. las

    las Master Guru

    Messages:
    205
    Likes Received:
    24
    GPU:
    1080 Ti @ 2+ GHz
    What FHD testing are you talking about? Going up to 1440p makes zero difference when CPU bound, and you can be CPU bound even at 2160p if you want - simply lower graphics to low or medium settings (which is what pretty much every competitive gamer does). Above 120 fps the CPU starts to matter a lot more, and this is where Ryzen shows its weaknesses. This is why 120-240 Hz gamers still go Intel; they won't even consider AMD because its performance is simply lower. I highly doubt Zen 2 is going to change that, though I'm looking forward to seeing official benches. I would be very surprised if Zen 2, with 8-16 cores and fully clocked, can even match a stock 8700K in CPU-bound gaming.

    Well, the 2060 Super is coming in a few weeks with 8 GB and a 256-bit bus; Nvidia pretty much countered Navi before it even launched, and AMD's only choice is to lower prices. I couldn't care less what sells the most - I need way more power than that in 2019. My 1080 Ti beats them all and is more than 2 years old, closing in on the 2½-year mark. Ampere on Samsung's 7nm EUV node is going to be my next GPU. It's going to be a slaughter. AMD, even with a node advantage (and the use of HBM2), can't beat Nvidia's current (and older) offerings. That is just weak. My 1080 Ti at 2+ GHz slams the Radeon VII with ease in games, and the Radeon VII costs more now than I paid for my 1080 Ti back then.

    Why should they? Business 101 - they own the market as it is, and AMD is sleeping. The 7970 was the last good AMD GPU, and also my last AMD GPU; before that I had many ATi and AMD GPUs. AMD still can't match 1080 Ti performance after 2½ years. I'm starting to miss ATi.

    They won't. The PS5 and next Xbox will have 6-7 cores available for games, and look at the Steam HW survey: quad cores are the majority, and many are still on dual cores. There's simply no way developers will start coding their games to use more than 8 cores before the next-next generation of consoles - IF those get more than 8 cores.

    Should game devs change everything just because 0.00001% of gamers have 12-16 cores? Not going to happen. Devs code for the masses, meaning 6 cores will be enough for years to come, and with 8 cores you'll be 100% safe for the entirety of the next console generation.
     
    Last edited: Jun 12, 2019
  20. D3M1G0D

    D3M1G0D Ancient Guru

    Messages:
    1,676
    Likes Received:
    986
    GPU:
    2 x GeForce 1080 Ti
    This myth again? No, you cannot just lower the graphics settings and hit a CPU cap at 4K in modern games - I have to drop the settings to the lowest just to reach 60 FPS in Fallout 4 on my 4K monitor (at 1440p I can crank the details to max). Do you have any idea how taxing 4K graphics are?
     
