
Review: AMD Ryzen 5 2400G APU

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 12, 2018.

  1. jststojc

    jststojc Maha Guru

    Messages:
    1,321
    Likes Received:
    2
    GPU:
    gigabyte gtx285
    I'm sorry to bother you, but I didn't see any tests with overclocked RAM. Seeing as the FPS aren't much different between the 2200G and 2400G, I was thinking the 2400G might be held back by the RAM speed. Perhaps you could do a test with overclocked RAM (the FlareX should OC decently, no?). Naturally, what you wrote in the review is true: it's ridiculous that the RAM costs more than the APU + motherboard. But it would still be very interesting to see whether it's a RAM limitation (is frequency better than timings, does frequency bring improvement, did the memory controller get improved, etc.).
    Thanks
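    Since the question is whether the 2400G is bandwidth-limited, here is a quick back-of-the-envelope sketch of theoretical DDR4 bandwidth (my own arithmetic, not from the review; real sustained bandwidth is lower, and the iGPU shares it with the CPU):

    ```python
    # Theoretical peak DDR4 bandwidth: (transfers per second) x (8 bytes per
    # 64-bit channel) x (number of channels). The iGPU shares this with the CPU.
    def ddr4_bandwidth_gbs(mt_per_s, channels=2):
        return mt_per_s * 1e6 * 8 * channels / 1e9

    for speed in (2400, 2933, 3200, 3600):
        print(f"DDR4-{speed}: {ddr4_bandwidth_gbs(speed):.1f} GB/s (dual channel)")
    ```

    Going from DDR4-3200 to DDR4-3600 adds roughly 6 GB/s of theoretical peak, which is exactly the kind of headroom an iGPU with no dedicated VRAM could plausibly use.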
     
  2. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,655
    Likes Received:
    496
    GPU:
    2070 Super

    Here ya go:

    [IMG]

    I'd love this APU for my 2D gaming needs (I think IGP gaming is kewl :D)

    Still wouldn't touch it for competitive multiplayer (DOTA 2, Overwatch, TF2, CS:GO, Diablo III), no matter how light the GPU load.
    Nor would I recommend it to anyone who even thinks about modern games. Let's get real for a sec: this is a very good product, maybe even great!
    Now let's not go overboard: 1080p or not, 35 fps at lowest settings is not entry-level gaming, it's torture gaming.

    We've seen similar APUs from AMD before. TBH this one might be a notch better (closer to Intel CPUs performance-wise and closer to NVIDIA entry GPUs performance-wise); nevertheless, so-called light desktop gaming/entry gaming is not what makes or breaks this APU.
    Light desktop gaming did not exist as a viable market, or as something that needed to be addressed, five years ago. It's no different today.

    It's all about notebooks (money-wise), and that's where this puppy needs to shine.
     
  3. Whiplashwang

    Whiplashwang Ancient Guru

    Messages:
    2,285
    Likes Received:
    186
    GPU:
    MSI GTX 1080Ti
    Great review Hilbert!

    Did anyone see this yet? https://www.pcgamesn.com/amd-raven-ridge-overclocking

    I thought it was just the Ryzen sleep bug, but look at his benchmarks! Does anyone have any idea how this is possible? Unless the author is lying, which is possible, but I don't understand why he would when it would hurt PCGamesN's reputation.
     
  4. D3M1G0D

    D3M1G0D Ancient Guru

    Messages:
    1,931
    Likes Received:
    1,240
    GPU:
    2 x GeForce 1080 Ti
    Eh? At 720p the 2400G can do an average of 215 FPS in CS:GO, 98 FPS in Overwatch, 99 FPS in Rocket League and 75 FPS in DOTA 2. I don't know about you, but I would say that's pretty good for entry-level gaming (and 1080p is also very playable in those games). PUBG is the odd man out, at least at 1080p.
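    For context on how those 720p averages relate to 1080p, here is a crude worst-case estimate assuming a purely pixel-throughput-bound game (my own sketch, not from the review; real games scale less steeply, since CPU-side work doesn't grow with resolution):

    ```python
    # 1080p pushes 2.25x the pixels of 720p; in the fully GPU-bound worst case,
    # FPS drops by that same factor.
    pixels_1080p = 1920 * 1080
    pixels_720p = 1280 * 720
    ratio = pixels_1080p / pixels_720p  # 2.25

    fps_720p = 215  # the CS:GO 720p average quoted above
    print(f"Pixel ratio 1080p/720p: {ratio:.2f}")
    print(f"Pixel-bound floor from {fps_720p} fps at 720p: ~{fps_720p / ratio:.0f} fps")
    ```

    So even the pessimistic bound from the 720p CS:GO number lands well above 60 fps at 1080p, which matches the "very playable" claim.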
     

  5. Picolete

    Picolete Master Guru

    Messages:
    270
    Likes Received:
    61
    GPU:
    R9 290 Sapphire Tri-x
    About the temperatures:
    Could it be that this CPU has the same issue Ryzen had at release, where it reported 20°C above the real temperature?
     
  6. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    36,080
    Likes Received:
    5,113
    GPU:
    AMD | NVIDIA
    Some initial DOTA 2 results have been added to both articles.
     
    sverek likes this.
  7. Embra

    Embra Master Guru

    Messages:
    896
    Likes Received:
    200
    GPU:
    Vega 64 Nitro+LE
    Thank you HH. Great review! Looks pretty good.
     
  8. aKiss

    aKiss Member

    Messages:
    33
    Likes Received:
    0
    GPU:
    Gigabyte 1050ti LP
    I asked this question last autumn. I know it made no sense financially, but now, looking at GPU prices and knowing the limits of socket AM4, AMD cannot add more Vega cores on Ryzen, but they can on Threadripper. Why not add 23 cores, like Intel has on their ultrabook CPUs, but on separate silicon? AMD can do this on a TR4-socketed CPU, as it is so massive. They could do roughly what they did on Ryzen: sacrifice half the cores to add Vega cores, basically replacing CCX cores with Vega CUs. Of course, being Threadripper, they could actually do this with an entire die, since there are four of them. It would not be a Threadripper anymore, since it loses many cores by sacrificing a die, but it would be... a Frankenstein monster, just because...

    I know that the cheapest Threadripper only has two active dies, so why not have one die for the CPU and the other for Vega CUs? Based on the next image, a Threadripper die has 8 cores:

    [IMG]


    We know that Threadripper is basically two functional dies with 16 cores; the rest are dummy dies, only used on EPYC CPUs. A Ryzen CPU has the same layout, and only the top 1800X has all cores enabled.

    This is the Ryzen 5 2400G die. Notice how it only has one CCX (that's the name for the core complexes on the left and right sides of the image above, by the way):

    [IMG]

    My question is this: why doesn't AMD put Vega on the two remaining dummy dies on Threadripper, for a total of 11 x 2 = 22 Vega CUs?
    Would there be a problem with the Infinity Fabric communication between the two Vega CU dies and the two Ryzen dies?
    They could have 8 Ryzen cores with no hyper-threading (I think it would be overkill) on two dies and 22 Vega CUs on the other two dies.

    How much would it cost? $800, the same as Intel's full NUC? Maybe more? Maybe it makes no sense technically and financially? Maybe the TDP would be close to 200 W?
    Am I crazy for thinking about this? AMD surely could have done it if it were possible; maybe I am exaggerating what is technically, and especially financially, possible.
     
  9. Nintendork

    Nintendork Member

    Messages:
    24
    Likes Received:
    0
    GPU:
    Gigabyte HD4770
    Why even bother to OC the CPU when pretty much no one cares about that on an APU? Especially with 3.6-3.9 GHz clocks. It's still the same 14nm LPP "LOW POWER PERFORMANCE", so 4 GHz is a no-no unless you get a golden chip (like the latest Ryzen batches).

    If you want to OC a Ryzen CPU, wait for the 12nm LP "LEADING PERFORMANCE" (meant for 4 GHz+ frequencies without losing efficiency or needing nasty voltages) Ryzen+ in April, when all the 12nm-based 2000-series parts will get a 300-400 MHz boost on base clocks with the same TDP and power consumption.

    OC the iGPU instead; that's the thing everyone wants and cares about on an APU.

    AMD claimed 1600 MHz+ OC, so technically it could land between the GT 1030 and GTX 1050.


    From all the tests, it seems the 2400G is somewhat bandwidth-starved compared to the 2200G, even with 3200 CL14. I would do a later update with the fastest memory Raven Ridge can support (maybe DDR4-3600 CL15 or DDR4-4066 CL18).


    I just hope that with 7nm next year, and with so much extra space, assuming 4c/8t for the CPU part, we get 1024 SPs (16 CUs) based on Navi and 2 GB of HBM2 for a "3400G" or "3400GX".
     
    Last edited: Feb 12, 2018
  10. aKiss

    aKiss Member

    Messages:
    33
    Likes Received:
    0
    GPU:
    Gigabyte 1050ti LP
    When they go to 12nm on Zen 2 and 7nm on Navi, I hope they can stick my idea into the AM4 socket instead of Threadripper. It should fit in that space, though I am not sure; using the monster Threadripper TR4 for my idea is not only expensive, it also generates massive heat, methinks...
     

  11. Nintendork

    Nintendork Member

    Messages:
    24
    Likes Received:
    0
    GPU:
    Gigabyte HD4770

    I also ask myself that.

    The TR socket opens new possibilities for budget-oriented enthusiast gaming APUs.

    Easily a 4c/8t CPU with 2048 SPs / 24 Vega CUs + 4 GB HBM2 @ 125 W TDP (Intel's Hades Canyon is 100 W TDP with 20 Vega CUs).

    Basically an i7-7700 + RX 570; it could be a gigantic hit in the current market, where mid-range GPU prices are sky-high due to miners. They could create a modified X390G mobo with video outputs and just four memory DIMMs (still with quad channel; even dual channel would not be much of a downgrade), with regular TR chip support. Then you just replace the high-end APU and you've got a new gaming rig :D
     
  12. Nintendork

    Nintendork Member

    Messages:
    24
    Likes Received:
    0
    GPU:
    Gigabyte HD4770
    Ryzen+ (still Zen 1 with some tweaks), in 2 months, is 12nm.
    Zen 2 (Ryzen 3000 series) is 7nm, very late 2018 or early 2019.
    Zen 3 (Ryzen 4000 series) is 7nm+, 2020~.

    HBM2 takes some space, so for AM4 to support it I don't expect anything higher than 896 SPs + 4c/8t alongside the HBM2. That could be fine if Navi offers a nice IPC uplift vs Vega (like Maxwell/Pascal vs Kepler).

    The heat of TR4 comes when you fully load 16 cores and 32 threads with quad channel working like a madman; 4 or 6 cores with dual channel barely add heat at the 3.6 GHz clocks where Ryzen's efficiency shines. The bigger the socket, the easier to tame.

    An APU for a modified X390G wouldn't need quad channel built in.

    This new APU should cost less than $300, closer to $250 (4 cores + 2048 SPs + 4 GB HBM2).
     
    Last edited: Feb 12, 2018
  13. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,655
    Likes Received:
    496
    GPU:
    2070 Super
    I know of no one who plays at 720p. Nor do I know anyone who cares about games even the slightest to whom I would recommend the 2200G/2400G. Literally no one! Do you?

    DOTA 2 at 1080p medium details @ 59 fps means a year from now the guy calls me and says: you ***ed me up hardddd, bro! My brand new PC gets 17 fps in Stalker 2 and 19 fps in Crysis 4.

    We are talking GT 1030 levels of performance, a year after the GT 1030 was released! Have you ever seen any kind of gamer with a 1030? I haven't.
    True enough, people who play CS:GO or TF2 might get away with a 2400G, but I doubt anyone looks at it and says, OHBOI, UPGRADE TIME. They might just as well upgrade (BETTER YET, NOT!) to any second-hand toaster.

    We have had situations like this before, and I don't remember people running to stores and buying Llano, the A10-7650, etc. (does anyone even remember those APU names?) for their TF2/CS/LoL or any desktop gaming. It won't be different this time.

    When it comes to Raven Ridge, desktop is an afterthought. If they grab something, great; if not, who cares. Mobile is what matters.
     
  14. Nintendork

    Nintendork Member

    Messages:
    24
    Likes Received:
    0
    GPU:
    Gigabyte HD4770
    The GT 1030 is still a dGPU, and tons, and I mean tons, of people play on a GT 1030/RX 550 by adjusting details; even more now with the mining craze, where the 1050 Ti/RX 560 are reaching $200+.

    It's basically getting $90 worth of dGPU plus an i7-7700-class CPU for $80, with CPU/APU upgrades until 2020 on the same mobo you buy now.

    You can pretty much play any current game at 1080p low or 900p medium. Remember, there are lots of useless graphics options that barely add visual eye candy while making the fps tank.

    For example, in many 2017 games you need to stare at a screenshot for a few minutes (not even while gaming) to spot the differences between high and very high, yet the fps dips by like 40%, all to say "I got this GPU to play everything MAXXXXXXXXX" while barely noticing anything different.
     
    Last edited: Feb 12, 2018
  15. aKiss

    aKiss Member

    Messages:
    33
    Likes Received:
    0
    GPU:
    Gigabyte 1050ti LP
    Actually, come to think of it, if the Zen 2 move to the 7nm fab is true and Navi will be 7nm, they could actually fit a couple of 4 or 8 HBM2 memory modules even on the TR4 socket, make it a "premium" APU on that platform, and leave AM4 to still depend on DDR4. They could stick those puny things on the Vega chip; that piece of silicon is smaller than the TR4 CPUs. Vega has a die size of "just" 610 mm², while the total area of just the four Threadripper dies is... 3000 mm². That is enough space for HBM2, since one module is just 35 mm². Come to think of it, they might fit even on AM4 if they squeeze them in tight... I'm not an actual engineer, so I have no idea what I'm talking about...
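    Taking the post's own area figures at face value (610 mm² for Vega, ~3000 mm² for the four TR4 dies, ~35 mm² per HBM2 module — quoted from above, not independently verified), the arithmetic does leave plenty of room on paper:

    ```python
    # All figures are the ones quoted in the post above, not verified by me.
    vega_die_mm2 = 610      # quoted Vega die size
    tr4_dies_mm2 = 3000     # quoted total area of the four Threadripper dies
    hbm2_module_mm2 = 35    # quoted footprint of one HBM2 module

    leftover = tr4_dies_mm2 - vega_die_mm2
    print(f"Area left after a Vega die: {leftover} mm^2")
    print(f"Room for up to {leftover // hbm2_module_mm2} HBM2 modules, on paper")
    ```

    Of course package area is only one constraint; the interposer, routing, and power delivery are where the real engineering cost sits.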
     

  16. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,655
    Likes Received:
    496
    GPU:
    2070 Super
    Yeah, I get that with adjusting the level of detail :)

    The funny thing is... the GT 1030 barely exists in the Steam stats. Yeah, I know the Steam stats are not perfect, and GT 1030 owners are less likely to be Steam users than the rest of the higher-end bunch, but we are talking about the gaming population. And it's a data point.
    Check this data point: revenue-wise, NVIDIA's OEM + IP business is 10x smaller than its Gaming (GeForce) business. And one would expect that, volume-wise, the GT 1030 is heavily over-represented in the OEM business.
    So no, I doubt they sell tons of 1030s.

    IMHO it's one of those myths... and this one, about low-end GPUs outselling everything else, ran out of steam a few years ago.
    If it were true, NV would have been hurting all these years with Intel and AMD iGPUs freely chewing up the low end, and we don't see that happening. There could be some hurting in mobile this time around, but they guided well for the next quarter, so... we'll see.
     
  17. D3M1G0D

    D3M1G0D Ancient Guru

    Messages:
    1,931
    Likes Received:
    1,240
    GPU:
    2 x GeForce 1080 Ti
    I find this funny since not too long ago several forum members were insisting on having 720p gaming results (if I recall, it was when Coffee Lake was released). For gamers who care more about frame rates than graphics quality (which typically includes competitive gamers), such resolutions matter, and the results of the 2200G/2400G are more than adequate. And like I said, the 2400G can also do 1080p in many of those games at around 50-60 FPS (with PUBG being the exception, and this is probably due to bad optimizations).

    If someone wanted to play Overwatch or CS:GO and was on a very tight budget, I would absolutely recommend the 2400G (I would be doing them a severe disservice not to). The alternative would be to get a separate CPU and GPU, which would perform similarly but at a higher cost (and why pay more?). When games eventually outgrow the iGPU they can buy a dedicated GPU, but until then the 2200G/2400G is a great way to get into PC gaming at minimal cost.

    And for your information, I am actually contemplating getting the 2400G to replace my 4790K. As an old-time gamer I occasionally replay older games on newer engines (e.g., Doomsday for Doom, Doom II, Heretic, etc.) and I hate using a high-end dedicated GPU for it (I feel like I'm wasting it).
     
  18. Picolete

    Picolete Master Guru

    Messages:
    270
    Likes Received:
    61
    GPU:
    R9 290 Sapphire Tri-x
    If you compare the results to the A10, this is an insane upgrade, and many people use APUs.
    The R3 2000 series is also really good for office PCs.
     
  19. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,655
    Likes Received:
    496
    GPU:
    2070 Super
    @D3M1G0D
    You need to bring that up with those who advocated for 720p gaming ;)
    And frankly, I doubt more than two people argued for actually playing at 720p. More likely they argued that 720p benchmarks are indicative of CPU gaming performance.

    I was contemplating getting a 2400G and upgrading to Ryzen+ down the road. For a second or two :D
    Then I remembered that I still wouldn't be able to play ARMA 3 at 60+ fps (and probably the same for PUBG), and everything else already runs great, so...

    SAME HERE :p

    If Ryzen 7 had this iGPU, I would have said to hell with ARMA 3 and bought it.
     
  20. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,738
    Likes Received:
    2,198
    GPU:
    5700XT+AW@240Hz
    If I had this APU and a 1440p screen, gaming would mostly be at 720p, as that scales cleanly (exactly half of 1440p in each dimension) and would prevent blurring. Some 2D games at 1440p, apparently.
    Then there is another kind of gaming, called Android-x86... Both the 2200G and 2400G are lovely overkill for that.

    The 2400G is a very exciting chip to me. I may even buy it just because it can be a damn good micro test machine that is extremely quiet.
     
