AMD Ryzen 7 5700G Cezanne benchmark leaks

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Apr 5, 2021.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    40,626
    Likes Received:
    8,991
    GPU:
    AMD | NVIDIA
    And it seems to be substantially faster than its predecessor. A Ryzen 7 5700G (Cezanne) was spotted and tested, the successor to the Ryzen 4000G (Renoir), as a benchmark of the 8-core APU leaked...

    AMD Ryzen 7 5700G Cezanne benchmark leaks
     
    Embra likes this.
  2. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    8,493
    Likes Received:
    849
    GPU:
    6800XT Nitro+ SE
    God damn it, I just got a PRO 4650G off eBay for a little ITX build. Ah well, I doubt the iGPU is much different; it seems to be just the CPU that has changed, from Zen 2 to Zen 3.
     
  3. heffeque

    heffeque Ancient Guru

    Messages:
    4,108
    Likes Received:
    79
    GPU:
    nVidia MX150
    Quite a bit faster than an i9... nice!
    Hopefully next gen ditches Vega once and for all.

    It would be nice to see if this 5700G has an updated HW video decoder with AV1 support (AV1 != AVI) so that it can run quietly when playing those videos.
    My i7 "U" can't keep up with 1080p 10-bit AV1 on complex scenes, so even though this 5700G probably can, I'm fairly certain that 4K HDR 10-bit is not possible, and if it is, it will surely make the CPU fan work like crazy.
     
    Last edited: Apr 5, 2021
  4. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,748
    Likes Received:
    2,193
    GPU:
    HIS R9 290
    Yeah, I don't think you're really going to be missing out on much. If you can wait for the 5000G series, you might as well wait until they transition to either RDNA2 or DDR5 (likely both). Previous-generation APUs don't seem to go any higher than Vega 11: the GPU is too starved for memory bandwidth, so it just doesn't make sense to add more processing power when you're not going to see any difference.
    Zen 3 is better with caching and seems to be less memory intensive, but I suspect it won't make that big of a difference for the iGPU.
    Lowering your texture detail would likely have a bigger impact. Anything you can do to reduce VRAM usage will make a noticeable difference in iGPU performance.
     
    HandR and CPC_RedDawn like this.

  5. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    8,493
    Likes Received:
    849
    GPU:
    6800XT Nitro+ SE
    Yeah, I was just joking really; the 4650G is perfect for what I have it doing at the moment.

    I'm just in the process of putting the system inside an InWin Chopin ITX case with a Noctua low-profile cooler. I have it all set up and running outside the case on a makeshift test bench. I was testing whether Resizable BAR would work on the iGPU, but it seems to give weird results with random driver crashes; once it's disabled, those issues go away.

    Also, I have it paired with 16GB of 3200MHz CL14 memory, so it's got decent bandwidth, but this isn't really for demanding games and I only need to allocate 2-4GB for VRAM anyway.
     
  6. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,748
    Likes Received:
    2,193
    GPU:
    HIS R9 290
    BAR wouldn't really do anything for an iGPU, because you're just using system memory anyway. The whole point of BAR is to expose more VRAM to the CPU, but it's all exposed anyway with an iGPU. That being said, you also don't need to allocate any more memory to the GPU than 64MB, or whatever the lowest is that your motherboard allows. Some games and applications might freak out about there not being enough, but otherwise your OS will use system memory as needed, and there's no performance loss because it's all coming from the same source and memory controller anyway.

    If you can, see if you can OC your RAM a little. 3200MHz is plenty for the CPU, but the GPU demands more bandwidth than that.
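    To put rough numbers on that bandwidth gap, here's a back-of-the-envelope sketch. The DDR4 figures are theoretical dual-channel peaks (and shared between CPU and iGPU); the RX 570 number is just an illustrative discrete-GPU spec for comparison:

    ```python
    # Theoretical peak bandwidth for DDR4: each channel is 64 bits (8 bytes) wide,
    # so peak GB/s = transfer rate (MT/s) * 8 bytes * channels / 1000.
    def ddr4_bandwidth_gbps(mt_per_s: int, channels: int = 2) -> float:
        """Theoretical peak bandwidth in GB/s for a DDR4 configuration."""
        return mt_per_s * 8 * channels / 1000

    print(f"DDR4-3200 dual channel: {ddr4_bandwidth_gbps(3200):.1f} GB/s")  # 51.2
    print(f"DDR4-3600 dual channel: {ddr4_bandwidth_gbps(3600):.1f} GB/s")  # 57.6

    # For comparison, a midrange discrete card like the RX 570 has ~224 GB/s
    # of dedicated GDDR5 bandwidth, none of it shared with the CPU.
    ```

    Even a healthy memory overclock only moves the shared pool from ~51 to ~58 GB/s, which is why extra CUs on these APUs tend to run into the same wall.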
     
    CPC_RedDawn likes this.
  7. Agonist

    Agonist Ancient Guru

    Messages:
    3,088
    Likes Received:
    360
    GPU:
    XFX 5700XT Raw II
    God, this spanks my 2700X...
     
    Undying likes this.
  8. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    7,039
    Likes Received:
    1,253
    GPU:
    NVIDIA GTX 1070
    Maybe I'll give AMD another chance; pray for no WHEAs.

    I will sure as hell go with Gigabyte again; ASUS seems terrible, with a lot of broken BIOS functions.

    Gonna have to wait for a review really, to see how it compares to a 5800X in terms of temperatures, cuz damn that thing runs hot.
     
  9. BLEH!

    BLEH! Ancient Guru

    Messages:
    6,067
    Likes Received:
    149
    GPU:
    Sapphire Fury
    These still using 8CU iGPUs?
     
  10. Mundosold

    Mundosold Master Guru

    Messages:
    224
    Likes Received:
    85
    GPU:
    GTX 1070 Ti
    Sounds like it, but there is little point in upgrading the iGPU on these when bandwidth is so constrained by DDR4. I think the next big step up will be the first APUs that support DDR5.
     

  11. bobblunderton

    bobblunderton Master Guru

    Messages:
    404
    Likes Received:
    190
    GPU:
    EVGA 2070 Super 8gb
    Some people were saying 'see if you can overclock your RAM a bit'. I saw CL14 3200MHz and shook my head a bit at what others said; that speed is just fine for 99% of folks. You don't need to overclock it. Sure, overclocking the memory could increase performance, but at the cost of possible system instability, which isn't worth it. There are quite a few motherboards - especially budget 300-series boards - that won't go any tighter on the timings or higher on the MHz than that. B450 and later boards are where memory support started getting nicer, as the channel trace length was shortened with each revision of the chipset. So worry not, your PC is fine, and the 4650G is a great CPU (with unified L3 at that); you'll get many years out of it. Of course, if you transported me back to 1996 and all I had was a Pentium clone or 486 clone, a copy of Duke3D, the registered versions of Doom 1 and 2, and some level editors, I'd still be extremely happy.

    On-topic, am I the only person around here who wouldn't mind a 16-core 32-thread chip with a pretty decent iGPU?
     
    CPC_RedDawn likes this.
  12. Undying

    Undying Ancient Guru

    Messages:
    15,240
    Likes Received:
    4,281
    GPU:
    Aorus RX580 XTR 8GB
    I wouldn't mind seeing such a high-core-count APU with some RDNA iGPU magic and DDR5. Now that would be something.
     
    CPC_RedDawn likes this.
  13. BLEH!

    BLEH! Ancient Guru

    Messages:
    6,067
    Likes Received:
    149
    GPU:
    Sapphire Fury
    That's a shame. These will still be graphically slower than the 3400G then. I was hoping AMD would do a 5950X style CPU with an 8-core die and a GPU die, but oh well, a man can dream.
     
  14. jura11

    jura11 Ancient Guru

    Messages:
    2,491
    Likes Received:
    556
    GPU:
    GTX1080 Ti 1080 FE
    I'm on an Asus ROG Crosshair VIII Hero X570 with no issues to date, running a 3900X with an all-core OC at 4.35GHz; temperatures won't break 70°C in rendering, and I'm running G.SKILL TridentZ Neo 3600MHz with an 1800MHz FCLK, i.e. 1:1.

    Personally I would go with Asus. I haven't tried any Gigabyte X570 boards yet, so I really can't comment on whether they're good or bad.

    Hope this helps

    Thanks, Jura
     
  15. Agonist

    Agonist Ancient Guru

    Messages:
    3,088
    Likes Received:
    360
    GPU:
    XFX 5700XT Raw II
    Did you ditch Intel when they had the dead-chipset USB 3.0 problems on Sandy Bridge? I bet you didn't. You just kept on going. You act like AMD released the biggest piece of crap in history. If that's your reason, you are just a fanboy that wanted a BS reason to fanboy for Intel.
     

  16. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    7,039
    Likes Received:
    1,253
    GPU:
    NVIDIA GTX 1070
    I spent half an hour writing a story to defend myself, but you're not worth an explanation in the end.

    Just going to give you the Prick of the Year award and put you on ignore.
     