Review: Ryzen 9 7950X processor (with ASRock X670E Taichi motherboard)

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 26, 2022.

  1. Valken

    Valken Ancient Guru

    Messages:
    2,431
    Likes Received:
    632
    GPU:
    Forsa 1060 3GB Temp GPU
    Very true!

    I got a company laptop and IF I need MOAR power, I run a secured VMWare session with VPN back to the company on my faster desktop.

    Go Guru GAMERS! :D
     
  2. moab600

    moab600 Ancient Guru

    Messages:
    6,497
    Likes Received:
    408
    GPU:
    Galax 3080 SC
Zen 4 with 3D V-cache might improve gaming performance in a meaningful way; I don't think RPL will change it much.

So far, anyone with Ryzen 5000 or Alder Lake doesn't really need to upgrade to any new gen for now.
     
    barbacot likes this.
  3. bobnewels

    bobnewels Master Guru

    Messages:
    856
    Likes Received:
    571
    GPU:
    RTX 3080
Awesome review, I understand now why the price on the 7950X was $100 lower than the 5950X. For me, the AMD lineup this time is not even a thought. I really like AMD CPUs and have bought around 8-10 of them for my personal PC gaming rig, but this time AMD is a no-buy. Come on, GPUs, be better.
     
  4. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,044
    Likes Received:
    3,470
    GPU:
    HIS R9 290
    But that's my point... what miracles were you all expecting? This generation is more expensive and the vast majority of games aren't CPU bound anymore. Even before this CPU was officially announced, I would have told you it'd be a poor choice for gamers. With V-cache, it would be better but still a poor value proposition for gamers.

Expecting Ryzen 7000 to be worth upgrading to for maximum FPS in gaming would be like upgrading from a Tesla Model 3 to a Porsche Taycan and expecting your everyday commute to work to go by faster. It's just not going to happen, so why even consider it? But suppose you're one of the few people who plays CPU-bottlenecked games - you're either better off with Intel or waiting for the V-cache models. So I just really don't get why anybody here is underwhelmed about something that was bound to be underwhelming. What next - you step in a puddle and get irritated that your foot is wet?


    Or y'know, just look at the graphs in this article... It would take you less time to do that than it would take for you to write your comment.
     
    AlmondMan, Ivrogne and yasamoka like this.

  5. nizzen

    nizzen Ancient Guru

    Messages:
    2,244
    Likes Received:
    995
    GPU:
    3x3090/3060ti/2080t
Running the 7950X delidded is 20°C cooler than non-delidded. Nice job AMD :rolleyes:

Even worse than the pigeon poop on the old Intel CPUs :p
     
    Fediuld likes this.
  6. Agonist

    Agonist Ancient Guru

    Messages:
    3,983
    Likes Received:
    1,095
    GPU:
    Dell 6800XT 16GB
Anyone who creates content and doesn't want shitty quality doesn't use GPU encoding. It's awful.

It's why I have friends who send me video to be rendered and encoded on my Threadripper. It takes longer but looks way better. Artifacts happen big time with GPU encoding.
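The CPU vs GPU encode trade-off above comes down to which encoder flags you pass. A minimal sketch of the two approaches, assuming an ffmpeg build with both libx264 and NVENC support (the file names and quality values are illustrative placeholders, not from the post):

```python
import shlex

def encode_cmd(src: str, dst: str, use_gpu: bool = False) -> list:
    """Build an ffmpeg command line for a software (libx264) or
    hardware (NVENC) H.264 encode at roughly comparable quality targets."""
    if use_gpu:
        # NVENC: much faster, but typically more artifacts at the same bitrate
        codec = ["-c:v", "h264_nvenc", "-preset", "p7", "-cq", "18"]
    else:
        # libx264: slower, but generally better quality per bit
        codec = ["-c:v", "libx264", "-preset", "slow", "-crf", "18"]
    return ["ffmpeg", "-i", src, *codec, "-c:a", "copy", dst]

print(shlex.join(encode_cmd("input.mkv", "out_cpu.mkv")))
print(shlex.join(encode_cmd("input.mkv", "out_gpu.mkv", use_gpu=True)))
```

The slower libx264 presets spend more CPU time on motion estimation and rate control, which is where the quality gap the post describes comes from.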



I find it ironic that most people are acting like this is a crap launch with terrible performance; frankly, I think some of you are high or hype-drunk.


I never expected anyone with a 5900X, 5950X, or 5800X3D to upgrade.

But anyone with a 5600X or a Ryzen 3xxx who didn't want to stay on AM4 would.
     
  7. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,220
    Likes Received:
    2,031
    GPU:
    Sapphire 390
The 5800X3D remains the gaming king. I believe quite a few people here predicted that, and consequently were already looking forward to the 7000-series 3D models, not these vanilla ones.
     
    chispy, mackintosh, Fediuld and 2 others like this.
  8. barbacot

    barbacot Master Guru

    Messages:
    787
    Likes Received:
    742
    GPU:
    Asus 3080 Strix OC
From what I saw in other benchmarks, the 5950X is more power-efficient than the 7950X, and with prices for the 5950X decreasing, now is the time to buy. It's the same advice as with the Nvidia RTX 4xxx - you want one? Now is the time to buy an RTX 3xxx at a decent price.

I am curious about the temperatures: what was the room temperature during the test? I don't think it was specified in the review...

     
    cucaulay malkin likes this.
  9. Valken

    Valken Ancient Guru

    Messages:
    2,431
    Likes Received:
    632
    GPU:
    Forsa 1060 3GB Temp GPU
    Well, for ME and some of us on REALLY old systems, WE are totally CPU bottlenecked.

So we will buy NEW systems, not upgrade AM4. Of course we will look at this CPU. It is great actually, but the OVERALL cost = MB + DDR5 + required AIO cooler is HIGHER than buying what I've listed.

    That is the problem.

If the MB and DDR5 RAM cost the SAME as current gen, then I would consider Zen 4 or still wait for the X3D. Why not? The total cost of the platform right now is MORE than Z690 or AM4. It still needs an AIO to hit peak boost!

Come on, the 5800X3D does not even OC well, yet it hangs up there neck and neck with the top CPUs, on AIR!

I would expect the Zen 4 CPUs not to need AIO coolers to hit peak boost... but alas, no...

    Why be cheeky about it or try to hinder our freedom of expression? We didn't insult you. Just pointing out facts based on the current economy and spending abilities of most.
     
  10. NewTRUMP Order

    NewTRUMP Order Master Guru

    Messages:
    649
    Likes Received:
    264
    GPU:
    rtx 3080
I am glad we have you to tell us what we are looking for. I mean, most of us just sit around drooling on ourselves and waiting for the nanny to change our diapers. How did you get so magnificent? Forum: a place, meeting, or medium where ideas and views on a particular issue can be exchanged. Not dictated to.
     

  11. WhiteLightning

    WhiteLightning Don Illuminati Staff Member

    Messages:
    29,794
    Likes Received:
    2,644
    GPU:
    Inno3d RTX3070
This 7950X gets way too hot in my book, and it's not even overclocked. The fact that AMD recommends a liquid cooler says it all.

I'm looking forward to what Intel brings to the table now. Will it get as hot as well?
     
    Why_Me likes this.
  12. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    6,686
    Likes Received:
    3,822
    GPU:
    RTX 3060 Ti
IMO it will be just as bad, meaning a CLC will be required for the 13900 for sure.
     
  13. Lebon30

    Lebon30 Member Guru

    Messages:
    137
    Likes Received:
    77
    GPU:
    EVGA RTX 3070Ti 8GB
    Let me requote this from the conclusion of the 7950X review:
    ...
I don't like it. I discovered this with the GamersNexus video, and good god... why the heck do we need excessive temperatures and electricity draw to achieve better performance?
Now, I understand that it's not 100% of the time and that your CPU will most likely never run at 100% utilisation on all cores unless you benchmark it.

And so, personally, I'd still choose a 5900X or 5950X over this just because they are THE KINGS of efficiency and run cool to boot! Performance-wise, sure, the 7950X looks AMAZING. But its efficiency is twice as bad...

    I'm still happy with my 5900X and 3070 Ti. Will stay with this until it becomes unbearably slow.
     
    Maddness and schmidtbag like this.
  14. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,044
    Likes Received:
    3,470
    GPU:
    HIS R9 290
I myself am on one such system. My system is overall worse than yours. I'm not saying there's anything wrong with considering the Ryzen 7000 series to upgrade to, because there are plenty of benefits to the platform as a whole. What I'm saying is that the 7950X is a stupid choice to upgrade to if gaming is all you'll do, no matter what you have now. If you want the 7000 series, the 7700X is much more sensible: it's barely any slower, it's a lot cheaper (though to me, still overpriced), and its power draw isn't anything to scowl at. I wouldn't be surprised if the 7700 or 7800X3D turns out to be an even better choice, depending on whether you want more performance or a lower price.
    If your #1 priority is gaming on a budget, this article would not have been worth reading past the 1st page. Hence me asking again: what miracle were you expecting? The only interesting thing about this particular CPU is its power draw and productivity performance. Gaming performance was a given, so why gripe over the obvious?

    And it is my view that people's expectations are misplaced. So riddle me this:
    You buy some bread, knowing very well what bread is, and complain there's too much carbs and gluten in it. Are you justified in your complaint? Sure, you have the right to complain and you are entitled to your opinion, but it's not the baker's fault you were expecting the impossible. It just makes you sound insatiable.
     
  15. Fediuld

    Fediuld Master Guru

    Messages:
    767
    Likes Received:
    450
    GPU:
    AMD 5700XT AE
Heh. A 13900 and RTX 4090 will need a 1400W ATX 3.0 PSU, when the CPU & GPU need 1100W combined at full blast.
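The jump from 1100W of component draw to a 1400W PSU follows from the usual headroom rule of thumb. A back-of-envelope sketch (the 25% headroom figure is a common guideline for transient spikes and the rest of the system, not a number from the post):

```python
def recommended_psu_watts(combined_draw_w: float, headroom: float = 0.25) -> int:
    """Add headroom for transient spikes and non-CPU/GPU components,
    then round up to the next 50 W PSU size."""
    needed = combined_draw_w * (1 + headroom)
    return int(-(-needed // 50) * 50)  # ceiling to a multiple of 50

print(recommended_psu_watts(1100))  # 1100 W * 1.25 = 1375 W -> 1400 W class PSU
```

ATX 3.0 also tightens the transient-excursion requirements, which is exactly why high-draw CPU+GPU pairings push buyers toward these larger units.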
     

  16. Fediuld

    Fediuld Master Guru

    Messages:
    767
    Likes Received:
    450
    GPU:
    AMD 5700XT AE
Reading the reviews, I'll say just two things:
a) Why is Guru3D still using a 3090 and not a 6900XT/6950XT? Using the faster card was the argument for years, yet here we are: the faster GPU is not the 3090.

b) I think I'm going to stick with my 5900X for the foreseeable future. Maybe side-grade to a 5800X3D. Zen 4 is meh at the moment, and it shows from the ~3% IPC increase; the rest of the performance comes from higher clock speeds and more energy consumption & heat generation. I bet my 5900X with a -12 PBO2 offset on all cores is faster too.
     
  17. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    45,200
    Likes Received:
    11,949
    GPU:
    AMD | NVIDIA
We can't switch every time there's a faster GPU. It's plenty fast ...

The reason is simple: ~20 processors × three resolutions × 6 titles. That's 360 benchmark runs, let alone that each processor needs its own platform with a fresh Windows installation + updates.
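The workload quoted above multiplies out quickly. A quick sketch of the run count, with a hypothetical per-run duration (the 5-minutes figure is a placeholder, not from the article):

```python
# ~20 processors, three resolutions, six titles -> total benchmark runs
processors, resolutions, titles = 20, 3, 6
runs = processors * resolutions * titles
print(runs)  # 360

minutes_per_run = 5  # hypothetical placeholder
hours = runs * minutes_per_run / 60
print(hours)  # 30.0 hours of pure benchmarking, before OS installs and updates
```

Even under those optimistic assumptions, swapping the test GPU would mean redoing the whole matrix, which is the point being made.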

    You're welcome to drop by our lab and do that for us though :)
     
  18. Valken

    Valken Ancient Guru

    Messages:
    2,431
    Likes Received:
    632
    GPU:
    Forsa 1060 3GB Temp GPU
You have a lesser GPU, but you're still on AM4, just with old RAM. If you can pull it off, a 5800X3D will use your existing RAM and still be a better buy than a 12900F or 12400F + DDR4 RAM + MSI Mortar WiFi MAX DDR4.

You will need a better GPU like all of us, though.

Well, we look at the BEST to see if the 7950X can pull extra gaming performance like the 12900K/F can. If you say stock is nothing, you are correct, but the 12900K, KF, or KS can be OC'ed like crazy for extra performance and HEAT.

So that is why we would look at it.

I play CPU-limited games, mainly ARMA (yeah, it's my poker game), and my squad plays it, so we look at pure CPU performance first but are price-conscious about the platform.


It is a sign of the times with everything going on now. I think most people have to look at bang per $ or euro now because buying power has dropped.

Not because we were unwilling to spend the money, but prices inflated beyond incomes in general, so we have to pull back somewhat.

    But as you said, X3D is probably the better buy and most of us who wanted the BEST CPU for gaming said we will wait for that version and compare it to the super heater 13xxx series from Intel.
     
  19. haste

    haste Ancient Guru

    Messages:
    1,965
    Likes Received:
    876
    GPU:
    GTX 1080 @ 2.1GHz
    This is the way to go: 7950x + eco mode + undervolt

     
    Embra and pegasus1 like this.
  20. MonstroMart

    MonstroMart Maha Guru

    Messages:
    1,261
    Likes Received:
    701
    GPU:
    RX 6800 Red Dragon
Considering the price of upgrading to a DDR5 platform, yeah, I agree there's no reason to upgrade this gen. I'll definitely skip it, do only a GPU upgrade in early 2023, and my next full platform upgrade somewhere in 2024.
     
    The Goose likes this.
