AMD Might Release Ryzen 5 5600X3D and Ryzen 9 5900X3D (X3D) Processors

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 4, 2022.

  1. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    tunejunky likes this.
  2. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,346
    Likes Received:
    2,988
    GPU:
    7900xtx/7900xt
    I agree... mostly.

    The heat that JamesSneed is talking about is created under a 3D cache blanket.

    Also...
    Any 3D V-Cache APU (it will eventually be made) is not going to have great thermal headroom, but at that point it won't matter, as the gaming performance obviates any real need to OC.

    AMD has been crazy good at low(er)-power uArch.
     
  3. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,690
    Likes Received:
    960
    GPU:
    GTX 1070
    Yeah, they are getting OK. They just need a couple of die shrinks before we see high-end desktop SKUs. Apple's M2 beats the 6800U in Tomb Raider, which is another example of an APU/SoC.
     
  4. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,690
    Likes Received:
    960
    GPU:
    GTX 1070
    Apple solved the memory bandwidth issue with the M1 and M2. I suspect we will see some copying of the wide memory bus and on-package memory. Once we get a few more die shrinks, that should net enough transistors to put this concept into desktop parts. The M1 Max has 64GB of shared memory on a 512-bit bus with LPDDR5-6400, which would be enough for nearly all of us gamers if AMD or Intel made a similar x86 part. A couple more die shrinks down, on TSMC's 2nm, that will likely double to 128GB, which would satisfy 99% of the desktop market except true workstations.
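    The math checks out, too. A rough back-of-the-envelope sketch (Python; the 512-bit bus and LPDDR5-6400 figures are the ones quoted above, and the dual-channel DDR5 comparison point is my own assumption):

    Code:
    # Peak theoretical DRAM bandwidth = bus width (bytes) x transfer rate (MT/s)
    def peak_bandwidth_gbps(bus_width_bits: int, mtps: int) -> float:
        """Peak theoretical bandwidth in GB/s for a DRAM interface."""
        bytes_per_transfer = bus_width_bits / 8
        return bytes_per_transfer * mtps * 1e6 / 1e9

    # M1 Max as described above: 512-bit LPDDR5-6400
    print(peak_bandwidth_gbps(512, 6400))  # ~409.6 GB/s (Apple quotes ~400 GB/s)

    # Typical dual-channel desktop DDR5-6000 (128-bit), for comparison
    print(peak_bandwidth_gbps(128, 6000))  # ~96 GB/s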
     

  5. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    8,638
    Likes Received:
    10,691
    GPU:
    RX 6800 XT
    Another big reason why the M1 and M2 GPUs perform so well for their power use is that they use a tile-based deferred rendering (TBDR) architecture.
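    Very roughly, TBDR splits the screen into small tiles, resolves visibility per tile in on-chip memory, and only then shades, so hidden fragments never cost shading work or DRAM bandwidth. A toy sketch of that control flow (Python; real GPUs do this in fixed-function hardware, and the per-tile rather than per-pixel depth resolve here is a simplification):

    Code:
    # Toy tile-based deferred rendering: bin, resolve visibility, shade once.
    TILE = 32  # tile size in pixels; real TBDR tiles are typically 16-32 px

    def bin_triangles(triangles):
        """Assign each triangle to every tile its bounding box touches."""
        bins = {}
        for tri in triangles:
            xs = [v[0] for v in tri["verts"]]
            ys = [v[1] for v in tri["verts"]]
            for ty in range(int(min(ys)) // TILE, int(max(ys)) // TILE + 1):
                for tx in range(int(min(xs)) // TILE, int(max(xs)) // TILE + 1):
                    bins.setdefault((tx, ty), []).append(tri)
        return bins

    def render(triangles):
        framebuffer = {}
        for tile, tris in bin_triangles(triangles).items():
            # Depth test happens in fast on-chip tile memory *before* shading,
            # so occluded fragments cost no shading and no external bandwidth.
            nearest = min(tris, key=lambda t: t["depth"])
            framebuffer[tile] = nearest["color"]  # one external write per tile
        return framebuffer

    tris = [
        {"verts": [(0, 0), (60, 0), (0, 60)], "depth": 1.0, "color": "red"},
        {"verts": [(0, 0), (60, 0), (0, 60)], "depth": 0.5, "color": "blue"},
    ]
    print(render(tris))  # the nearer (blue) triangle wins every covered tile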
     
  6. Kool64

    Kool64 Ancient Guru

    Messages:
    1,658
    Likes Received:
    784
    GPU:
    Gigabyte 4070
    By the time these come out, it won't really matter, except for a handful of specific-use holdouts.
     
  7. blkspade

    blkspade Master Guru

    Messages:
    646
    Likes Received:
    33
    GPU:
    Leadtek Nvidia Geforce 6800 GT 256MB
    I wouldn't say they "solved" it, because it was never a mystery. Apple devices are their own market of products. They took their existing design philosophy of not allowing aftermarket upgrades and worked in meaningful engineering. The bulk of people who prefer the PC platform don't want the options for expansion taken away. When I debate with people over what Apple charges to bump up RAM and storage, they say other OEMs charge markups too, ignoring the fact that it's not soldered on, so you can choose from components in a competitive market. Apple's "luxury" pricing will always limit their market share. The power efficiency advantage is mostly only meaningful in mobile devices, because they haven't run away with raw performance. Their method runs counter to what the mass market wants and can afford. To even approach that for the PC desktop, while still allowing some freedom, the CPUs would be Threadripper-sized packages.
     
  8. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,690
    Likes Received:
    960
    GPU:
    GTX 1070
    They solved the memory bandwidth issue, which is what I was replying to when I said that. Nobody else has anywhere near that bandwidth in an APU. Sure, every chip maker knows they can do it, but alas, they have not. I also disagree: the bulk of people won't give a flying flip about options being taken away if the base option is extremely future-proof, like, say, 128GB of on-package DDR5. How many people gripe about not being able to upgrade GPU memory, and who would balk at that being on-package and non-upgradeable? Yeah, like 1%, and an APU obviously isn't for them. Now, soldering the chip to the board might cause a bigger uproar, but then AMD and Intel also have soldered chips in the lower-end segments, so it's nothing new, though I don't expect to see it in the desktop space from AMD or Intel. Anyhow, give it about 3-4 years, once chips come out on TSMC's 2nm / Intel's 18A, and you will see high-end APUs from both Intel and AMD, possibly even Nvidia via ARM chips.
     
    Last edited: Jul 7, 2022
  9. blkspade

    blkspade Master Guru

    Messages:
    646
    Likes Received:
    33
    GPU:
    Leadtek Nvidia Geforce 6800 GT 256MB
    What I'm saying is that the option of higher-bandwidth memory solves the problem, not Apple. Apple's method is not a commodity solution. The best APU implementation AMD has to offer is in consoles, but those fill basically a single purpose. Consoles generally sell at a loss initially and make up the difference in software and accessory margins until production costs come down enough for the hardware to be profitable. Consoles are guaranteed sales because of a large, dedicated fan base, regardless of the hardware that's in them. Look at what Apple charges for that 128GB option. They only cater to those willing and able to spend stupid amounts of money. Apple's computer market share rarely exceeds 16% because their TAM is smaller. One of the reasons Windows continues to hold such a high share of the market compared to Mac OS is that there is a wide range of more affordable devices available for everyone else. Other OEMs succeed by volume, whereas Apple succeeds by fleecing the same small group of people repeatedly. It's not like AMD/Intel couldn't have done it at all; it's just harder to sell it in volume.

    I personally service and upgrade hundreds, if not 1,000+, devices a year, because most people don't have maximum money to spend on the initial purchase. AMD literally went from single-digit to 25%+ market share by offering a very upgradable and cost-effective platform.

    GPUs and consoles work on similar principles, in that it's accepted that they are what they are and that there would be no benefit from being able to upgrade memory. It's easy to run an entire general-purpose computer out of memory and benefit from aftermarket expansion. A console has all its software written with a single target configuration in mind. A GPU, having a smaller number of potential functions, can run out of processing ability before the frame buffer is the problem. A GPU is a single component of a whole system, which makes your comparison asinine. That would be like expecting people to complain about not being able to add bandwidth to the built-in NIC on their motherboard. You just get another, better one, at a fraction of the cost of a whole new board or computer.
     
    Kaarme likes this.
  10. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    Side note: The GPD WIN Max 2 has 4x the storage and 4x the RAM for only £1,099 right now vs the £1,549 M2 Air (8GB RAM, 512GB storage). It's a steal and could be a desktop replacement for some users.
     

  11. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    Oh, how the narrative has changed from people buying a 2700X/3900X for "futureproofing"; now those more-than-six-core chips are worthless.
     
  12. Undying

    Undying Ancient Guru

    Messages:
    25,343
    Likes Received:
    12,754
    GPU:
    XFX RX6800XT 16GB
    More CPU headroom is always better. I would never buy a six-core in 2022.
     
    Robbo9999 and cucaulay malkin like this.
  13. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    I'd buy one, no problem, price being the determining factor.
    Have you seen what the 12400F (170 EUR) can do with this B660 Max board?
    It would smoke my 10700, outperform a 5800X, and get close to a 12700K (380 EUR).
    But if the 5600X3D starts at 300 and sells at 350-400, just like the 5800X3D sells for the price of a 5950X now, then no way.

    Also, what you describe as headroom is mostly just performance. An older 8/12-core isn't magically going to get faster with age.
     
    Last edited: Jul 8, 2022
  14. Undying

    Undying Ancient Guru

    Messages:
    25,343
    Likes Received:
    12,754
    GPU:
    XFX RX6800XT 16GB
    I don't believe that. It's already at 90% usage in a few games, and if you do anything else while gaming (watching a YouTube video), there goes your FPS...

    More headroom is always better, just like VRAM :D
     
  15. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    It's not a matter of belief; watching YT is not going to tank your FPS.
    You may run out of RAM at some point, but that's it.

    Buying an 8-12 core for multitasking while gaming is a myth; it's only useful for streaming.

    I'm considering 32GB of RAM now. Honestly, with all the prices exploding, I'm wondering why I'm still sticking with 16GB. It's fine, but if you keep the PC on all the time, it will require a reboot at some point because system RAM usage always creeps up.
     
    Last edited: Jul 8, 2022

  16. Undying

    Undying Ancient Guru

    Messages:
    25,343
    Likes Received:
    12,754
    GPU:
    XFX RX6800XT 16GB
    I'm surprised you don't have 32GB of system RAM already. I have a 2x16GB dual-rank kit, but 3600 CL16 is the best it can do because of that.
     
  17. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    Frankly, I don't know either.
    It won't help me much, but on the other hand, it's so cheap that why not just have it?
     
  18. Undying

    Undying Ancient Guru

    Messages:
    25,343
    Likes Received:
    12,754
    GPU:
    XFX RX6800XT 16GB
    Why not? You don't plan on going DDR5 soon, so go for it.
     
  19. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,513
    Likes Received:
    2,355
    GPU:
    Nvidia 4070 FE
    I agree on that. While Apple does sport an unlimited budget, for sure, AMD has highly versatile technological know-how. AMD was an original developer of HBM (and consequently also the interposer tech), which would seem like quite an obvious way of giving an iGPU a lot of high-bandwidth memory if you didn't need to worry about cost at all. AMD was also the first to release MCM CPUs to the mass market, which avoids the problem of giant, monolithic chips, further helping with making any kind of CPU+iGPU monster. So, yeah, it's more of a market thing than a technology thing for AMD. There's also the problem of too-good APUs eating into AMD's own discrete GPU sales. Entry-level graphics cards might disappear entirely soon enough, with even the lowest mainstream tier possibly doing the same. Perhaps dragging up the prices of graphics cards is, partially, an early method to compensate for the future lost money from the lower end. AMD itself aside, it'd hurt the board partners even more.
     
  20. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    No way in hell.
    It just got even more expensive, and it'll take a lot of time for it to come down. I don't expect normal prices for DDR5 this year, or early next year, tbh.
    I'm really fine with my B-die kit; dunno if I need 32GB.
     
