AMD Might Release and Add Ryzen 5 5600X3D, Ryzen 9 5900X3D (X3D) procs

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 4, 2022.

  1. moab600

    moab600 Ancient Guru

    Messages:
    6,660
    Likes Received:
    557
    GPU:
    PNY 4090 XLR8 24GB
I'm a bit tempted to get the DDR4 3800 CL14 kit instead of Zen 4 and wait till prices are much more affordable...

    Though I'm playing at 4K, so IDK how much my 5950X will benefit from that kit (I'm on 3200 CL16 now, overclocked to 3400).
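    A rough sketch of the "true latency" arithmetic behind comparing these kits (first-word latency only; secondary timings also matter, so this is only an approximation):

```python
# First-word latency (ns) ~= CAS latency / real clock (MHz) * 1000,
# where the real clock is half the DDR transfer rate.

def true_latency_ns(mt_per_s: int, cas: int) -> float:
    """Approximate first-word latency in nanoseconds."""
    real_clock_mhz = mt_per_s / 2
    return cas / real_clock_mhz * 1000

kits = {
    "DDR4-3200 CL16 (current)": (3200, 16),
    "DDR4-3400 CL16 (OC)": (3400, 16),
    "DDR4-3800 CL14": (3800, 14),
}

for name, (rate, cl) in kits.items():
    print(f"{name}: {true_latency_ns(rate, cl):.2f} ns")
# -> 10.00 ns, 9.41 ns, 7.37 ns: the 3800 CL14 kit is meaningfully quicker.
```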
     
    cucaulay malkin and Valken like this.
  2. GamerNerves

    GamerNerves Master Guru

    Messages:
    354
    Likes Received:
    102
    GPU:
    RX 5700 XT Nitro+
This always depends on the current and upcoming requirements of games and the performance of the part itself. Sorry to say, but this is basic comprehension that an active tech forum poster should understand. The 2700X was perhaps a better option than the 2600X, or the 1700 better than the 1600, though I'm not claiming anything definite with those examples; I'm just pointing at the overall performance of the part, the scalability of games, and the near future of game development. If you buy now, going higher than eight cores for gaming is a waste, especially considering pricing, and for today's games it is worthless. The possible 5600X3D would serve one well, but of course it is smarter to get a 5600 or an i5-12400F and just buy something newer after two to four years.
     
  3. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
It's never changed: 6-8 cores is the sweet spot, and it will stay that way.
     
  4. blkspade

    blkspade Master Guru

    Messages:
    646
    Likes Received:
    33
    GPU:
    Leadtek Nvidia Geforce 6800 GT 256MB
Consumer APUs can only be good enough to cannibalize what currently counts as the low end; you just move up where the bottom actually is. Something else will always be better simply by being allotted more power. It's like the Core i3 going from 4 threads to 8, or six-core CPUs displacing quads, particularly for frame consistency in gaming. AMD would benefit either way, in the market-share increase alone. It's unfortunate that Intel has the largest overall GPU market share with garbage iGPUs; they sell the most CPUs, which include them by default, but no one really wants to use or optimize them for gaming. Nvidia would be hurt most in that equation. I wasn't talking about company budget, but the ultimate cost to buy such a product. There is no reason AMD can't make a far more massive APU from an engineering POV, but the cost to the consumer would be less than appealing if they were going to integrate the VRAM. They actually have Epyc/Threadripper APU parts planned. Even without integrating VRAM into the package, they can leverage up to 12 channels of DDR5. Obviously not a consumer gaming product.
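    For scale, a back-of-envelope sketch of what 12 memory channels buys, assuming 64-bit channels and DDR5-4800 (actual server parts may differ):

```python
# Theoretical peak bandwidth = channels * bytes per channel * transfer rate.

def peak_bw_gbs(channels: int, bus_bits: int, mt_per_s: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return channels * (bus_bits / 8) * mt_per_s / 1000

print(f"12-channel DDR5-4800:  {peak_bw_gbs(12, 64, 4800):.1f} GB/s")  # ~460.8
print(f"dual-channel DDR5-4800: {peak_bw_gbs(2, 64, 4800):.1f} GB/s")  # ~76.8, for contrast
```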
     

  5. blkspade

    blkspade Master Guru

    Messages:
    646
    Likes Received:
    33
    GPU:
    Leadtek Nvidia Geforce 6800 GT 256MB
Ultimately, minimum CPU requirements for gaming are going to be dictated by the lowest common denominator for most games. That's basically console ports, with the maximum potential use of an 8C/16T Zen 2 CPU locked to 3.5 GHz. Theoretically, any CPU with similar processing capacity should hold up until the end of the console generation. That might exclude whatever small number of PC-centric titles end up competing for the "Can it play?" crown, if there ever is such a thing again.
     
  6. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
I think a recent 8C/16T part is about as good as it gets for the next few years, especially when talking about the 3D V-Cache chips. Personally, I think that's where it's at when it comes to gaming... I can't see any other kind of CPU I would buy for my next build.
     
  7. Supertribble

    Supertribble Master Guru

    Messages:
    978
    Likes Received:
    174
    GPU:
    Noctua 3070/3080 FE
Going back to the RAM discussion on the previous page and how cheap it is: well, there seems to be a bit of a shortage of Crucial DDR4 3600 C16 here in the UK, and I wasn't able to find matching modules at an affordable price, so I ended up having to pay £153 for an additional 16GB of Ballistix Elite. It would have been cheaper to shop for a whole other 32GB kit TBH. :/
     
  8. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
Ouch. I paid less than that for regular Ballistix 3200 4x8GB, but that was six years ago.
     
  9. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,691
    Likes Received:
    962
    GPU:
    GTX 1070
I wish we had a "remind me in 3 years" option so we could see how this turns out. I understand what you are saying. The reasons you state are exactly why APUs will take over most of the volume sales over the next decade. They will be cheaper than a CPU plus graphics card for similar performance, at least up to mid-range levels. People will buy them because they are good enough for gaming and are cheap. They will come with enough on-package memory at least for the GPU, while the lower-end ones for laptops might have on-package memory for the whole system. The big-box guys will love them because they won't have to add another card to call it a "gaming" PC. Even with the Prime Day price drops, an RTX 3060 is $400-500 and a Ryzen 7 5700X is $250. This level of performance could easily be an APU in a couple of die shrinks, and it could be sold for considerably less than $650-750 and still be very profitable.
     
    tunejunky likes this.
  10. Dribble

    Dribble Master Guru

    Messages:
    369
    Likes Received:
    140
    GPU:
    Geforce 1070
People have been claiming exactly this would happen ever since AMD announced the concept, a very long time ago. It never happens, and more on-board cache isn't the magic bullet that's going to change that. The only market that puts higher-end GPUs in an APU is consoles. Consoles have already tried large on-board caches: the Xbox One had it with its embedded ESRAM. Not only did it still need very fast GPU memory on top of that, it clearly wasn't very successful, as Sony never copied the idea.
     

  11. Pryme

    Pryme Master Guru

    Messages:
    284
    Likes Received:
    206
    GPU:
    XFX RX6800 SWFT
I learned that lesson with the FX series of CPUs: I had the 6300 and upgraded to the 8320E. They said the same thing back then: it lacked IPC, but it had the threads, so it would be useful in the future. By the time games were utilizing the 8 threads, it still lacked the necessary IPC and the platform was already underperforming. And the upgrade from the 6300 to the 8320E, which gave me two more threads, did almost nothing.

    I had friends buying the 1700X/2700X because of the threads; I told them it was a waste of money. I did the upgrade too, but I went with the 2600: it was very well priced, with OK IPC and a proper overclock. From what I'm seeing in the current market, the 5600, or maybe even the 5500 (50% cheaper than the 5600 today in my country), will be my final step on the AM4 platform.
     
    cucaulay malkin likes this.
  12. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
There's something telling me to just say frack it and get the 7600X3D.
    I hope they make it, even though I will likely purchase something under 200 EUR, not over 300.
     
    tunejunky likes this.
  13. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,691
    Likes Received:
    962
    GPU:
    GTX 1070
I am aware; I have followed this for 25 years at a rather deep level. The reason it never happened is that there were never enough transistors, or enough thermal headroom, to pull it off. Once we are on TSMC's 2nm or Intel's 18A, they'll have enough transistors to make this possible at a reasonable price. We are talking 3-4 times the transistor density of today's chips. Those nodes also use GAA, which will be very efficient, so thermal headroom should be much better. It will absolutely happen, so it's more a question of when than if at this point.

The Xbox 360 and Xbox One did use some EDRAM and ESRAM, but it was special-purpose and not what I meant. I was suggesting moving the GDDR memory that normally sits off-die on a graphics card into the package (possibly via a chiplet). This could easily be done, since GPUs only need 12-16GB of memory, and in this case maybe less, because bandwidth could be increased.
    Memory Subsystem - The Xbox One: Hardware Analysis & Comparison to PlayStation 4 (anandtech.com)

If you look at Apple's M1 / M1 Ultra, they have 100% on-package memory, up to 128GB. This is what everyone will copy in APUs. AMD and Intel will have to as well if they want to compete, especially if Apple's approach pushes ARM-based Windows laptops forward. The M1 Ultra has 800 GB/s of memory bandwidth, which puts it right in line with today's highest-end GPUs like the RTX 3090, at 936 GB/s. Now imagine having 4x the transistor density, with most of that going to the GPU. It's really not a big jump from today's M1 Ultra.
     
    tunejunky likes this.
  14. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    8,731
    Likes Received:
    10,818
    GPU:
    RX 6800 XT
But the M1 Ultra has a different approach. Apple decided to go wide instead of using higher-clocked memory.
    The M1 Ultra has a 1024-bit memory bus, resulting in an impressive bandwidth for an SoC.
    But it also has an impressively high memory latency of almost 100ns, so when there is a cache miss, it hurts a lot.
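    The quoted figures line up with the usual peak-bandwidth arithmetic. A sketch, assuming LPDDR5-6400 on the M1 Ultra and 19.5 Gbps GDDR6X on the RTX 3090 (inferred from the numbers above, not stated in the posts):

```python
# Peak bandwidth = bus width in bytes * transfer rate.

def peak_bw_gbs(bus_bits: int, mt_per_s: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return (bus_bits / 8) * mt_per_s / 1000

print(f"M1 Ultra, 1024-bit LPDDR5-6400:   {peak_bw_gbs(1024, 6400):.0f} GB/s")   # ~819, marketed as 800
print(f"RTX 3090, 384-bit GDDR6X @ 19.5G: {peak_bw_gbs(384, 19500):.0f} GB/s")   # 936
```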
     
    tunejunky likes this.
  15. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,691
    Likes Received:
    962
    GPU:
    GTX 1070
Yeah, that's how they solved the bandwidth issue to get unified memory, which was my point way back: it's doable. That approach works for laptop chips, but as you pointed out, latency is higher. For high-performance desktop parts I really only see using GDDR chips/chiplets and leaving system memory as it is today with DDR5. That seems like a very simple way to get high-powered GPUs onto an APU without any major architectural changes. At some point they could do a unified memory approach, but it would need to have no negative performance tradeoffs, so that neither the CPU nor the GPU suffers.

When you hit around 500 million transistors per mm², it starts to open up possibilities. For example, it would take only about 56 mm² to match the transistor count of a 3090. That is about the size of an iGPU today.
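    A quick sanity check of that 56 mm² figure, using GA102's roughly 28.3 billion transistors (the die in the RTX 3090):

```python
# Die area needed = transistor count / transistor density.

transistors_ga102 = 28.3e9   # RTX 3090's GPU die, public spec
density_future = 500e6       # transistors per mm^2, per the post

area_mm2 = transistors_ga102 / density_future
print(f"Die area needed: {area_mm2:.1f} mm^2")  # ~56.6 mm^2
```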
     
    tunejunky likes this.

  16. gwynbleidd58

    gwynbleidd58 Guest

    Messages:
    1
    Likes Received:
    1
    GPU:
    1060 6GB
I'd love to have a 5900X3D as a later upgrade to my 5600X on B550. Plenty of cores for virtual machines and 3D cache for games; it's a win-win.
     
    tunejunky likes this.
  17. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,460
    Likes Received:
    3,085
    GPU:
    7900xtx/7900xt

Nailed it on the head.
     
  18. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,460
    Likes Received:
    3,085
    GPU:
    7900xtx/7900xt
Please let's remember that we are enthusiasts, and "acceptable" levels of performance are in the eye of the beholder.

    My other point is that with the rise of browser gaming (incredibly popular, even if it's not our cup o' tea), GPU requirements are almost non-existent, and low-eye-candy gaming makes up for it in FPS (Overwatch).

    We like more of everything at max quality, so we should keep the regular folks in mind.
     
  19. user1

    user1 Ancient Guru

    Messages:
    2,785
    Likes Received:
    1,305
    GPU:
    Mi25/IGP
AFAIK Crucial discontinued the DDR4 Ballistix line earlier this year; they only sell bare JEDEC DIMMs now. Anything still available is basically old stock.
     
    tunejunky likes this.
