AMD Might Release Ryzen 5 5600X3D and Ryzen 9 5900X3D (X3D) Processors

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 4, 2022.

  1. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,635
    Likes Received:
    345
    GPU:
    GTX1070 @2050Mhz
I'm not sure why they'd bother releasing more 3D V-Cache chips for the 5000 series. I'd rather see them release the 7000 series sooner and then go to town with various 3D V-Cache options on that series. (I'm not going to consider anything less than an eight-core CPU for my next purchase, though, so I wouldn't be interested in 3D V-Cache options with, say, 6 cores.)
     
    tunejunky likes this.
  2. Silva

    Silva Ancient Guru

    Messages:
    1,872
    Likes Received:
    1,011
    GPU:
    Asus Dual RX580 O4G
    If they do come out, quantities will be small and just a filler until Ryzen 7000 comes.
     
  3. GamerNerves

    GamerNerves Master Guru

    Messages:
    283
    Likes Received:
    64
    GPU:
    RX 5700 XT Nitro+
If they release these products, they need to come out some time ahead of the Zen 4 launch, because they are not going to be very cost effective for AMD. Once they coexist with the Zen 4 family, they would need to sell at a low price, assuming Zen 4 is not ludicrously expensive. Still, if there are many people on Zen 1 or 2, I guess slotting one of these into an existing AM4 board is a somewhat appealing option even after Zen 4 is out, but that depends on next-gen motherboard and DDR5 pricing.
I guess there is more room in the market for the six-core part, because if you only game you don't want to waste money on worthless extra cores. If the design of these 3D V-Cache processors makes the cache stack so hot that overclocking isn't allowed, I don't think the six-core part will clock any higher than the current 5800X3D, as someone wished in the comments. AMD has offered the highest clocks on the best silicon anyway, which practically always ends up in the top models with many cores.
     
  4. Aura89

    Aura89 Ancient Guru

    Messages:
    8,325
    Likes Received:
    1,416
    GPU:
    -
You do realize that not all the chips may be capable of being used as a 5800X3D, due to defects? Where would those chips go, if not into a 5600X3D?

I'm not saying it will or won't happen, but I believe it's naive to think that every single 5800X3D die that comes off the wafer is either fully working or completely dead.

If all it is, is rejected 5800X3Ds that work perfectly fine as a 6-core but not as an 8-core, then the assumption that it'd be too expensive isn't exactly realistic. No one knows what the price would be, and anything beyond the packaging/distribution costs would be profit compared to just throwing the chips away.
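That salvage argument can be sketched with a toy binomial yield model. This is only an illustration: the per-core defect probability is a made-up number, not a real TSMC yield figure.

```python
# Toy yield model: how many dies with a defective core could still be
# salvaged as 6-core parts. p_core_bad is an assumed, illustrative value.
from math import comb

def core_yield(cores_total, cores_needed, p_core_bad):
    """Probability that at least `cores_needed` of `cores_total` cores work,
    assuming independent per-core defects."""
    p_good = 1.0 - p_core_bad
    return sum(
        comb(cores_total, k) * p_good**k * p_core_bad**(cores_total - k)
        for k in range(cores_needed, cores_total + 1)
    )

p_bad = 0.03  # assumed 3% chance any one core is defective (hypothetical)
full_8 = core_yield(8, 8, p_bad)      # dies sellable as an 8-core 5800X3D
at_least_6 = core_yield(8, 6, p_bad)  # dies salvageable as a 6-core part

print(f"fully working 8-core dies: {full_8:.1%}")
print(f"dies with at least 6 good cores: {at_least_6:.1%}")
```

Even with a modest defect rate, the gap between "all 8 cores work" and "at least 6 work" is a pool of dies that only a 6-core SKU can monetize.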

Again, I honestly have no opinion on whether they will come out, or whether they'd just be the rejects.
     

  5. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    9,464
    Likes Received:
    1,769
    GPU:
    6800XT/3080Ti
    This could be a nice send off for AM4.

However, it could also be a way for AMD to justify hiking Zen 4 prices...... :/
     
    Horus-Anhur likes this.
  6. FookDat

    FookDat Master Guru

    Messages:
    476
    Likes Received:
    34
    GPU:
    6950XT RD 2889/2498
Looking for a reason to upgrade the 3950X on my MSI X570S EK Carbon, so a 5950X3D would be greatly appreciated. Just DO NOT make it ridiculously priced. Hoping AM5 will relieve demand for whatever AM4 products get released and spread out availability.
     
  7. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,812
    Likes Received:
    410
    GPU:
    RTX3080ti Founders
I'm mega curious about how cache scales. How fast could a CPU with 512MB be? 1GB? :p
     
  8. Undying

    Undying Ancient Guru

    Messages:
    19,842
    Likes Received:
    8,118
    GPU:
    RTX 2080S AMP
    We'll probably find out with zen5. :)
     
  9. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    879
    Likes Received:
    291
    GPU:
    KFA2 RTX 3090
Seeing my atrocious Z690/12900K/DDR5/PCIe 5.0 experience, where every day I want to switch back to my 5950X on X570, which had four working memory slots, more rear USBs and usable PCIe slots ><
I'm probably hoping for too much, but a non-OC 5950X3D would be great for me.

So now it's either AMD fixes the unstable, incompatible mess that DDR5 is and fixes the "lost in the void" PCIe/chipset bandwidth problem of 5.0 (which might be why they went dual chipset), or I'm definitely going back to a 5950X on X570. That setup hit 29,000 in Cinebench R23 at 72°C, and now that I've had to render lots of videos at the same time on the Intel, those e-cores are just useless for a desktop computer.

So, to be clear, DDR5 and PCIe 5.0 are two of the main reasons I hate Z690. If AMD can't fix those, I absolutely believe they are working on re-releases of Ryzen 5000. Maybe this time that scandal will be properly covered, unlike currently, where influencers say "I decided to stay with AMD for reasons" rather than "omg it's crap, don't buy".
     
    Last edited: Jul 5, 2022
  10. Catspaw

    Catspaw Active Member

    Messages:
    96
    Likes Received:
    52
    GPU:
    Nvidia 1080/8GB
I currently have a 5900X, which I bought after Ryzen 5000 support came to X370.
Now we have the 5800X3D as an "upgrade", and we might get something else, but... most enthusiasts are probably fine with their CPUs now and are waiting for the next 50-80% increase in performance (over more than one generation, that is).

    We will see.
     

  11. tunejunky

    tunejunky Ancient Guru

    Messages:
    2,652
    Likes Received:
    1,312
    GPU:
    RX6900XT, 2070


this would be true IF the 5800X (about to be 3D) wasn't binned.
but they were binned, as this process is incredibly complicated and expensive.
so yes, there were no "rejected CCD" 5800X3Ds.
this 3D cache process is a step after wafer fabrication.
     
    cucaulay malkin likes this.
  12. H83

    H83 Ancient Guru

    Messages:
    4,058
    Likes Received:
    1,449
    GPU:
    MSI Duke GTX1080Ti
    I think the same.

    Too late for AM4, bring us AM5.
     
    tunejunky likes this.
  13. pirlampas

    pirlampas Member

    Messages:
    12
    Likes Received:
    17
    GPU:
    RTX 2080
The 5600X3D is the most probable of the bunch: it's a single CCD just like the 5800X3D, but it doesn't need to have 8 good cores.
Seeing how hot the 5800X3D gets already, I don't know if having two CCDs under the cache blanket is going to be manageable for a lot of cooling solutions.
     
    CPC_RedDawn likes this.
  14. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,544
    Likes Received:
    803
    GPU:
    GTX 1070
The cost side tells me this is unlikely. The APUs we will see when AMD gets on TSMC's 2nm will be sick. It's been a density and heat issue up until this point. On that node you can get enough CPU for most people and enough GPU for some (likely mid-range), all in an APU. They do need to get on-die memory like Apple is doing with the M1/M2 chips, even if they do it via chiplets.
     
    tunejunky likes this.
  15. tunejunky

    tunejunky Ancient Guru

    Messages:
    2,652
    Likes Received:
    1,312
    GPU:
    RX6900XT, 2070
it's almost a guarantee that there's going to be a TR3D, as it's almost the same as Milan-X.
there's a high likelihood of a 7900X3D, but i doubt there's going to be a 7950X3D (hope i'm wrong 'cuz that's my preference), simply because of heat management issues. both the 7900X and 7800X are very suitable, as there are empty CCDs and that equals less heat. but this is gen 2 and they may have something up their sleeve (possibly lower clocks that are still faster than the 5950X).
     

  16. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    6,824
    Likes Received:
    3,200
    GPU:
    HIS R9 290
I don't recall temperatures being that much of a problem for existing APUs. Seems to me memory bandwidth has always been the major issue. In memory-intensive workloads, a Vega 8 isn't much slower than a Vega 11. For pretty much all APUs, overclocking the RAM has yielded a proportionate increase in performance, no matter how high you go. That tells me these chips are so starved for memory bandwidth that a stacked cache would make a huge improvement.
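The starvation point holds up to some back-of-the-envelope arithmetic. A minimal sketch using standard JEDEC-style DDR4 transfer rates; the discrete-card comparison figure in the comment is only there for scale:

```python
# Back-of-the-envelope sketch of why desktop APUs are bandwidth-starved.
# DDR4 transfer rates are standard figures; the comparison point is rough.

def ddr_bandwidth_gbs(mt_per_s, channels, bus_width_bits=64):
    """Peak theoretical bandwidth in GB/s for a DDR memory configuration."""
    return mt_per_s * channels * (bus_width_bits / 8) / 1000

dual_ddr4_3200 = ddr_bandwidth_gbs(3200, channels=2)  # typical AM4 setup
dual_ddr4_4000 = ddr_bandwidth_gbs(4000, channels=2)  # heavy RAM overclock

print(f"DDR4-3200 dual channel: {dual_ddr4_3200:.1f} GB/s")
print(f"DDR4-4000 dual channel: {dual_ddr4_4000:.1f} GB/s")
# Even the overclock yields ~64 GB/s, shared between CPU and iGPU, while
# even low-end discrete cards keep ~100+ GB/s of GDDR all to themselves.
```

Every extra MT/s translates linearly into peak bandwidth, which is consistent with the observation that APU performance scales almost proportionally with RAM overclocks.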

    Ironically, unleashing this untapped performance via faster memory would make these chips run a little too hot. As far as I'm concerned, the thermals are only acceptable because the chips are bottlenecked.

As far as I understand (and I may be mistaken), the V-cache is a separate die. So it's basically just a 5800X with what I would consider an L4 cache (though functionally it's an L3) and lower clock speeds.

The 5600X was already a bit overpriced for what it was. The 5600X3D would likely need lower clock speeds, and with fewer threads it could really use those extra MHz. So while it might be better overall than a 5600X, I don't see it being good enough to justify the extra premium of the V-cache.
     
    Last edited: Jul 5, 2022
    GamerNerves and tunejunky like this.
  17. blkspade

    blkspade Master Guru

    Messages:
    635
    Likes Received:
    29
    GPU:
    Leadtek Nvidia Geforce 6800 GT 256MB
AMD has been planning to implement 3D V-Cache since Zen 2, which is when they started building the through-silicon vias (TSVs) into the chiplets. That basically allows them to implement it on any spare chiplet. The tech was first and foremost designed for the server space, so it's unlikely the TSVs are implemented at all in the APU dies, as those are just repurposed monolithic mobile dies. A 5600X3D makes sense in that any spare would-be 5600 can be packaged at any time as an X3D. Looking at the limitations, I wasn't surprised they didn't do a 5950X3D. The target demographic for the 16-core is productivity; asking $800 for it to be slower at what it's really meant for is a bit ridiculous. The 1.35V Vcore limit would hit the 5950X harder, and single-thread boost clocks are lower because of that, not heat.
     
  18. user1

    user1 Ancient Guru

    Messages:
    2,171
    Likes Received:
    861
    GPU:
    hd 6870
I could see it if there are rejects where the cache failed to bond properly, if they can disable the defective part of the cache, providing, say, a 32+32MB L3 instead of the usual 32+64MB.
     
    BlindBison likes this.
  19. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,544
    Likes Received:
    803
    GPU:
    GTX 1070
You don't recall temperatures being an issue because nobody made an APU high-end enough, precisely due to temperatures. It was a big roadblock because they simply didn't attempt it. We will see some APUs that are really competent at gaming in just a couple of years. Having 8 cores at 2nm will eat up very little die space and heat budget, so you can pair them with a large 100-150W GPU die and still cool it. Of course, on-die memory must happen to get that kind of performance, and it will. Getting, say, 16GB of GDDR on-die at 2nm will be very easy, possibly even just one or two chiplets, or built into the IO die.

The 3D cache is a true L3. They stacked it using copper vias that run down the middle of the CPU die, so there is virtually no latency difference between the on-die L3 and the stacked L3. There is a tiny latency hit, but that is primarily due to the larger cache, not the stacking, as we are talking sub-mm lengths of copper for the vias.
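The sub-mm point survives a quick sanity check. A minimal arithmetic sketch, assuming a ballpark on-chip signal speed of roughly half the speed of light and a ~10 ns L3 hit latency (illustrative figures, not measured Zen 3 numbers):

```python
# Rough sketch of why a sub-mm TSV adds negligible latency. The signal
# speed and L3 latency below are assumed ballpark values for illustration.

SPEED_OF_LIGHT_M_S = 3.0e8
signal_speed = 0.5 * SPEED_OF_LIGHT_M_S  # assume ~0.5c in copper interconnect

tsv_length_m = 0.5e-3                    # ~0.5 mm through-silicon via
propagation_ns = tsv_length_m / signal_speed * 1e9

l3_latency_ns = 10.0                     # assumed ballpark L3 hit latency

print(f"TSV propagation delay: {propagation_ns:.4f} ns")
print(f"as a fraction of L3 latency: {propagation_ns / l3_latency_ns:.2%}")
```

The wire delay works out to thousandths of a nanosecond, orders of magnitude below the cache access time itself, so any measurable latency increase has to come from the larger cache array, not the stacking.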
     
    Last edited: Jul 6, 2022
  20. blkspade

    blkspade Master Guru

    Messages:
    635
    Likes Received:
    29
    GPU:
    Leadtek Nvidia Geforce 6800 GT 256MB
The roadblock for APUs has always been memory bandwidth, first and foremost. Besides Vega not being all that great, there was never a point in scaling it up, because dual-channel DDR4 would never cut it. Infinity Cache in RDNA2 wouldn't be able to make up the deficit of DDR4, which is why they waited for DDR5. GDDR has pretty much always had higher latency than standard DDR, so the fact that initial DDR5 has higher latency than DDR4 is a non-issue for the GPU portion of the APU. Heat is only a potential issue because the APUs are mobile dies that later get repurposed for desktop. Any APU made explicitly for desktop would have way more thermal headroom.
     

Share This Page