AMD Ryzen 5000 (ZEN3) announcement thread

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 8, 2020.

  1. metagamer

    metagamer Ancient Guru

    Messages:
    1,866
    Likes Received:
    731
    GPU:
    Palit GameRock 2080
    This is such a nonsense post.

    I can guarantee you that you'd do much better spending your money on Nvidia if you want value per fps.

    You will not prove me wrong on this.

    If you think a 3080 will only give you 15-20 fps over your 2080S, fine, believe that. But you're wrong. It will in fact give you a lot more than that: some 50-80% more performance in games. It will always be more than 15-20 fps.

    But go ahead, make a bad purchase.
     
    Last edited: Oct 9, 2020
  2. toyo

    toyo Master Guru

    Messages:
    379
    Likes Received:
    200
    GPU:
    Gigabyte 1070Ti 8G
    Both the 16- and 12-core CPUs have no direct competition, and the 16-core has limited use on a desktop you don't use for VMs, rendering and the like, so having it on desktop rather than HEDT is kind of a vanity move.
    I would say the 5900X for $380. Obviously, that means the 10900K should be cheaper: $360, since it's old now.
    If Intel was able to add 2 cores on 14nm going from the 7700K to the 8700K for only ~$20, AMD can AT LEAST keep the same number of cores at the same price for the 3000-to-5000-series jump (see the rough comparison below).
    And don't forget, Intel is the greedy, horrible, big bad corporation that bullies the righteous AMD. So one would assume AMD would be far, far more careful to provide better value for its customers than Intel...
    ... right?
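
    A rough tier-by-tier comparison, assuming AMD's announced launch MSRPs for both generations (street prices varied, so treat the figures as illustrative only):

    ```python
    # Launch MSRP per core-count tier, Zen 2 vs Zen 3 (AMD's announced prices).
    zen2 = {6: ("3600X", 249), 8: ("3800X", 399), 12: ("3900X", 499), 16: ("3950X", 749)}
    zen3 = {6: ("5600X", 299), 8: ("5800X", 449), 12: ("5900X", 549), 16: ("5950X", 799)}

    for cores in sorted(zen2):
        old_name, old_price = zen2[cores]
        new_name, new_price = zen3[cores]
        print(f"{cores:2d} cores: {old_name} ${old_price} -> {new_name} ${new_price} "
              f"(+${new_price - old_price})")
    # Every tier comes in $50 higher at the same core count.
    ```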
     
  3. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,561
    Likes Received:
    224
    GPU:
    RX 580 8GB
    They need to fit around the already well-established 3000 series. Don't forget that. Zen 2 CPUs won't just vanish off the shelves. Your pricing does not work.
     
  4. toyo

    toyo Master Guru

    Messages:
    379
    Likes Received:
    200
    GPU:
    Gigabyte 1070Ti 8G
    In normal times, a normal corporation would basically sell its previous gen cheap to get it off those shelves. AMD did exactly that just months ago with both the Ryzen 1000 and 2000 series, which at times were priced quite low and offered great value.
    They aren't doing it now. You know why?
    They don't have to: Intel lost gaming and single-core performance, so AMD is on top in everything. Why pass up the extra profit?
     

  5. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,561
    Likes Received:
    224
    GPU:
    RX 580 8GB
    People are also still buying 2nd-gen motherboards. They have saturated the market. There's still stock of the X and XT models, which will be even better value for some in the coming months.
     
    Valken likes this.
  6. nizzen

    nizzen Maha Guru

    Messages:
    1,264
    Likes Received:
    327
    GPU:
    3x2080ti/5700x/1060
  7. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,640
    Likes Received:
    2,122
    GPU:
    HIS R9 290
    He's a troll; that's all the input you need.
     
    JAMVA, AlmondMan and Aura89 like this.
  8. leszy

    leszy Master Guru

    Messages:
    325
    Likes Received:
    17
    GPU:
    Sapphire V64 LC
    In my opinion, it's nice that AMD has left some space for Intel's weaker processors. Otherwise, Intel would have had big problems. Intel did the same for AMD a few years ago.
     
  9. jbscotchman

    jbscotchman Ancient Guru

    Messages:
    5,454
    Likes Received:
    4,208
    GPU:
    MSI 1660 Ti Ventus
    Forgive me for not reading through this entire thread, but what's the word on B450 support for Zen 3? Is it official, or is it up to the motherboard manufacturers?

    Edit: Never mind, just saw it in HH's article. Pretty much what I was expecting.
     
  10. Biffo

    Biffo Active Member

    Messages:
    59
    Likes Received:
    3
    GPU:
    ati
    When is Threadripper due?
     

  11. nosirrahx

    nosirrahx Master Guru

    Messages:
    323
    Likes Received:
    94
    GPU:
    HD7700
    With how good the current offerings are and how delayed the X299 successors are, they hardly need to rush these out.
     
  12. nosirrahx

    nosirrahx Master Guru

    Messages:
    323
    Likes Received:
    94
    GPU:
    HD7700

    Completely agree with this. Team '1080p with medium settings' is completely set for a very long time. Toss in a new mid-range GFX card every 2 or so years and you will be very happy with your setup.
     
  13. tty8k

    tty8k Master Guru

    Messages:
    238
    Likes Received:
    63
    GPU:
    Ati 5850
    If I were a gamer on an Intel 8th- to 10th-gen or AMD 3000-series CPU and still cared about the few hundred I invest in my machine, this 5000-series Ryzen is a step to skip.
    Your fps-per-dollar ratio is much better if you put the money into an Nvidia card (or an AMD one, if it proves to be at the same level).
    If you're gaming at 2K resolution, that bar is even lower.

    This series only makes sense for those into 3D rendering, where render times are usually long enough to justify it.
    Not even video encoding benefits significantly; most productions run up to 10 minutes, and even small commercials/ads are a matter of a few minutes of video.

    None of the above matters if you have a very good income and $800 (CPU + motherboard) is not a lot to you.
     
  14. nosirrahx

    nosirrahx Master Guru

    Messages:
    323
    Likes Received:
    94
    GPU:
    HD7700
    You are probably right and certainly right if we count 2022 mid-range GPUs.
     
  15. Valken

    Valken Ancient Guru

    Messages:
    1,695
    Likes Received:
    171
    GPU:
    Forsa 1060 3GB Temp GPU
    Exciting times, and like many Haswell owners here, we are itching to get up to 8 cores or more.

    But the 5800X plus motherboard pricing pushes it into Intel territory in terms of value. There needs to be a combo price drop. As many have said, we may as well get the 5900X...

    Price per core, the 5900X is best, then the 5600X / 5950X, then the 5800X... WTF...
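
    A quick sanity check, assuming the launch MSRPs AMD announced ($299 / $449 / $549 / $799):

    ```python
    # Dollars per core at the announced launch MSRPs, cheapest first.
    lineup = {"5600X": (6, 299), "5800X": (8, 449), "5900X": (12, 549), "5950X": (16, 799)}

    for name, (cores, price) in sorted(lineup.items(), key=lambda kv: kv[1][1] / kv[1][0]):
        print(f"{name}: ${price / cores:.2f}/core")
    # 5900X $45.75, 5600X $49.83, 5950X $49.94, 5800X $56.13 -- the 5800X is the worst deal.
    ```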

    Maybe that was the game all along... the Amazon marketing way of nudging you to just "buy" a tier higher for better "value".

    Edit --- I forgot about the "binning" process, so maybe the 5900 is really a good 5800 plus a crap 4-core CCD that failed binning? We have to wait for reviews....

    Also, this is the LAST AM4 DDR4 platform, so the upgrade path is going to close.

    I am unsure if DDR5 motherboards will allow older CPUs to work later.

    But I will wait for the CPU and 6xxx-series GPU benchmarks before I plunge in. If RTRT performance is good, an AMD GPU will be my next buy.

    I might skip the CPU and just get the 6900 GPU plus a 4K OLED or VR goggles instead, since that is where all the performance will be held back.

    Unless ARMA 4 comes out and needs a 12-16 core beast to run it...
     
    Last edited: Oct 9, 2020

  16. MonstroMart

    MonstroMart Master Guru

    Messages:
    867
    Likes Received:
    347
    GPU:
    GB 5700 XT GOC 8G
    About the same level as Moore's Law is Dead.
     
    Last edited: Oct 9, 2020
  17. Supertribble

    Supertribble Master Guru

    Messages:
    868
    Likes Received:
    115
    GPU:
    1080Ti/RadVII/2070S
    Looking more closely at the VRM performance on my crappy board, the temps, after moving from an open test bench into a case, are about 80-81°C with the 3900X at stock, which seems OK? With the 5900X supposedly having the same TDP and overall power characteristics as the 3900X, I might take a chance on the 5900X. One thing is for sure: once the CPU goes in, I won't want to take it out, because I've never successfully removed a large HSF from a PGA socket without ripping out the CPU, and not for lack of being careful.

    [Attachment: crappy board.PNG]

    Edit:

    As an aside: what do people see in these utterly infuriating YouTubers, with their ridiculous facial expressions, frenetic editing style and pure clickbait "headlines"? I will never understand their appeal. :/
     
    Last edited: Oct 9, 2020
  18. Causese

    Causese New Member

    Messages:
    2
    Likes Received:
    0
    GPU:
    2070 Super
    AMD claiming the best single-core-performing CPU is a bit wild, considering it's only the case while boosting. The thing is, you'd want an all-core OC for most games and boost for the specific games that are optimized for a single core, like World of Warcraft. Intel makes it less awkward to have both. AMD needs to improve Ryzen Master to adjust the OC based on the game / the game's optimization: turbo for single-core games and all-core OC for the others...
     
  19. asturur

    asturur Maha Guru

    Messages:
    1,018
    Likes Received:
    299
    GPU:
    Geforce Gtx 1080TI
    The claim of best single-core performance is benchmark-based bullshit: CPU-Z or Cinebench.

    For games you can't really start optimizing per executable. Games load the processor with what they need in the moment, not the other way around. If your game, OS and background software combined are using all the cores, you will get whatever the CPU can give you at that load level.
     
  20. suty455

    suty455 Master Guru

    Messages:
    440
    Likes Received:
    164
    GPU:
    Nvidia 3090
    I don't get it: these CPUs are years ahead of Intel, yet you're complaining that they're at the same price point!
    Why not? It's a better product in every way possible. If Intel had produced this CPU, no doubt it would be hailed as a breakthrough of EPYC proportions (see what I did there), yet because it's AMD, folks expect it to be the "value" platform. It's obvious from the comments that many will stick with the old Intel stuff they have until hell freezes over, or until Intel sorts out its CPUs and drags itself into the modern day. Just remember the 4 cores you had as a maximum 3 years ago, until Zen launched, and then look at the current CPUs available. I am typing this on a 16-core CPU. Think about that: 16 cores. In what universe do you think Intel would have made that available to the consumer market if AMD had not forced their hand? So we have an AMD platform with the highest single-core and multi-core performance, PCIe 4.0, and backwards compatibility with previous-gen motherboards, all in a 105 W TDP, yet it should still be cheaper than Intel?
    C'mon
     
