AMD Ryzen 7 3700X & Ryzen 9 3900X review with benchmarks leaks out

Discussion in 'Frontpage news' started by Rich_Guy, Jul 5, 2019.

  1. BReal85

    BReal85 Master Guru

    Messages:
    356
    Likes Received:
    106
    GPU:
    Sapph RX 570 4G ITX
    :D
     
    Venix likes this.
  2. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    6,805
    Likes Received:
    121
    GPU:
    5700 XT UV 1950~
    Somehow expecting it to reach the performance levels of 5 GHz CPUs is interesting when max boost isn't that high on AMD. When a game leverages all the cores, AMD just wins. Also, older games will most likely have optimizations for Intel that AMD won't have.
     
    Aura89 and ZXRaziel like this.
  3. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    1,769
    Likes Received:
    439
    GPU:
    .
    FPS is not a CPU performance metric; frame time is.
     
    ZXRaziel likes this.
  4. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,590
    Likes Received:
    1,002
    GPU:
    EVGA 1080ti SC
    I have to disagree with you here. One only needs to look at the FX CPUs and their launch. At the time all we had was the GTX 680 at the top spot, so the FX performed about identically to the 2600K at 1080p. Nowadays you can pair a 2070 with an FX and a 2600K and see a huge difference at 1080p.

    The test has its relevance; however, I do feel game developers WILL focus more on the Zen architecture now that the new consoles will be using it.
     

  5. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    1,769
    Likes Received:
    439
    GPU:
    .
    Same data...

    But wrong way to show them ( sort by average frames per seconds, descending order):

    [image: same data sorted by average FPS, descending]

    Better way to show them (frame time, sort by maximum frame time, ascending order):

    https://i.**********/B6QKqP98/right-way.png

    They don't look so dramatic, right?

    And it still doesn't reflect real CPU values, since nobody would play at 720p with those CPUs. Moreover, we do not know anything about the deeper settings involving microcode updates and security fixes, nor whether the so damn "acclaimed" OS scheduler fix for Ryzen 3000 is enabled or not (Windows 10 1903 is a necessary condition but is not enough). Yes, all this is annoying, but this is today's hardware world: things like OS kernel patches, microcode updates and security fixes may shift the results by the small margins these graphs show.

    EDIT: just to be clear, low-resolution settings may involve different driver and game rendering-path behaviours (it's pretty sane and logical to associate low resolution with low-performance hardware and do different optimisations in the code), which adds further reasons not to test at those resolutions. Game benchmarks are meant to (should?) represent real and common game performance; they are not synthetic benchmarks (even accurate synthetic benchmarks may fail to represent real CPU performance if they are unable to take advantage of a particular hardware architecture, like a new SIMD extension set or a high number of cores). And this goes for both negative results and positive results (like AC: Origins).
     
    Last edited: Jul 7, 2019
  6. Alex13

    Alex13 Ancient Guru

    Messages:
    2,324
    Likes Received:
    147
    GPU:
    2070 Super
    The forums here have been on the toxic side of things for a while. Can't even pinpoint where it all turned really bad, but at this point you can probably get more intelligent interaction on wccftech.
     
  7. fOrTy_7

    fOrTy_7 Master Guru

    Messages:
    344
    Likes Received:
    36
    GPU:
    N/A
    Lol, frame time and FPS (frames per second) are closely correlated values.

    They mean basically the same thing.

    In both cases, a momentary min FPS or max frame time is a completely useless statistic.

    What you want to measure is the 1% lows, which characterise the average FPS drop across the demo or benchmark you're using.

    Your min FPS or max frame time can occur for any reason and should not be an indicator of performance. A median of the FPS drops or frame-time spikes would be best to have.
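
    For anyone curious what that looks like in practice, here's a minimal Python sketch (the frame-time values are invented for illustration, not taken from the leaked benchmarks) that derives average FPS, the 1% low, and the median frame time from a per-frame log:

    Code:
    # Minimal sketch: average FPS, 1% low and median frame time from a frame-time log.
    # The sample values below are invented for illustration only.

    frame_times_ms = [8.3, 8.4, 9.1, 8.2, 25.0, 8.5, 8.3, 30.2, 8.6, 8.4]

    def fps_stats(frame_times_ms):
        n = len(frame_times_ms)
        avg_fps = n / (sum(frame_times_ms) / 1000.0)

        # "1% low": average FPS over the slowest 1% of frames (at least one frame),
        # i.e. the longest frame times, instead of a single worst-case spike.
        slowest = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
        one_percent_low = 1000.0 / (sum(slowest) / len(slowest))

        # Median frame time: resistant to one-off spikes, unlike min FPS / max frame time.
        ordered = sorted(frame_times_ms)
        mid = n // 2
        median_ms = ordered[mid] if n % 2 else (ordered[mid - 1] + ordered[mid]) / 2.0

        return avg_fps, one_percent_low, median_ms

    avg, low1, med = fps_stats(frame_times_ms)
    print(f"avg {avg:.1f} fps | 1% low {low1:.1f} fps | median frame time {med:.2f} ms")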
     
  8. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    1,769
    Likes Received:
    439
    GPU:
    .
    They are multiplicative inverses: a 10 fps difference doesn't mean anything on its own. 120 fps vs 130 fps is NOT the same damn thing as 40 fps vs 50 fps.
    I know frame time is not the best measurement method (I simply used the same data), but FPS is the worst.
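
    (To spell out the reciprocal relationship with a quick Python snippet, not from the post itself: the same 10 fps gap is a very different frame-time gap at different frame rates.)

    Code:
    # Frame time is the reciprocal of FPS: frame_time_ms = 1000 / fps.
    # The same 10 fps gap shrinks dramatically in frame-time terms as FPS rises.

    def frame_time_ms(fps):
        return 1000.0 / fps

    for lo, hi in [(40, 50), (120, 130)]:
        delta = frame_time_ms(lo) - frame_time_ms(hi)
        print(f"{lo} vs {hi} fps -> {delta:.2f} ms per frame")

    # 40 vs 50 fps   -> 5.00 ms per frame
    # 120 vs 130 fps -> 0.64 ms per frame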
     
  9. Only Intruder

    Only Intruder Maha Guru

    Messages:
    1,234
    Likes Received:
    186
    GPU:
    Sapphire Fury Nitro
    Well, this thread certainly is full of, how can I say... passionate people, but I think it's safe to say people are definitely excited about tomorrow's launch.

    These leaked benches show one thing for certain: even while not being the best at the absolute highs, these CPUs still offer some amazing performance at exceptional value. Furthermore, they are going to be capable at so many more tasks; multi-tasking heavy workloads is definitely going to be possible with these new processors.

    The 3700X looks to be a potentially great buy so far :D
     
    ZXRaziel likes this.
  10. Robbo9999

    Robbo9999 Maha Guru

    Messages:
    1,466
    Likes Received:
    249
    GPU:
    GTX1070 @2050Mhz
    Ha, very good in lots of ways!

    My take on these leaked benchmarks: I'm happy they're at 720p, because that shows the difference for high-fps gamers on high-Hz monitors. I also think these new AMD CPUs have not performed particularly well; the 7700K is faster in a lot of titles, and there's only one title where AMD shines (Assassin's Creed Origins), and I'm not sure how representative that will be going into the future. We can say these new AMD CPUs are more future-proof than the 7700K, but I wanted them to at least equal or surpass the 7700K in current games. I don't think these new AMD CPUs are going to be a good choice for high-fps / high-refresh-rate gamers, although that will depend on what the overclocked scores look like; I'd need to see those to form a final judgement.
     
    -Tj- likes this.

  11. ZXRaziel

    ZXRaziel Member Guru

    Messages:
    166
    Likes Received:
    53
    GPU:
    Nvidia
    Either way, I can see a very decent performance increase compared to the previous generation, 10 to 20% depending on the software used. It's nothing to be ashamed of.
     
    BReal85 likes this.
  12. FranciscoCL

    FranciscoCL Master Guru

    Messages:
    208
    Likes Received:
    18
    GPU:
    GTX 1080 Ti@2GHz/WC
    Curious... the R7 (not R9) 3700X (65 W TDP) has a higher power consumption than an R7 2700X (105 W TDP).
     
  13. Jonotallica

    Jonotallica New Member

    Messages:
    2
    Likes Received:
    4
    GPU:
    780Ti 3GB
    Have a look at the Handbrake scores for the 3900X and 3700X, then look at the power consumption numbers compared to the 9900K.

    The gaming performance will be 50/50.. in some games Intel will have the edge, in other games they won't. It'll depend on whether or not the game utilizes multiple cores efficiently. This will improve over the next few years as well. I'm curious to see the numbers in Battlefield V, as that one utilizes high-core-count CPUs quite well. I imagine Intel will be ahead, but by a small margin compared to the 2700X.

    But then look at the power consumption and combine that with "CPU tasks" like encoding. The 3700X and 9900K are an apples-to-apples comparison.. with one clocked way higher (using 100 W more power).. and it's slower in Handbrake. Ouch. The 3900X completely craps all over it and STILL uses less power.

    The resolution thing is there to exaggerate the differences; I really don't see what's worth arguing about. At 1080p, there'll be a difference in some games. At 1440p, there will be little difference between them. But the main thing for me is that I encoded probably 1000 hours of Handbrake last year, probably 3 months straight of 24/7 usage.. and trust me.. 100 W matters when it comes to the power bill. 235 W running 24/7 will save quite a bit on the bill compared to 330 W, plus the Intel costs more in the first place. Intel at this point is really only an option if you use your PC like an Xbox.. pairing it with the highest-end graphics cards like a 2080 Ti at 1080p. Otherwise.. in pretty much every other way, AMD is now the option.
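
    (Rough sketch of that power-bill arithmetic in Python; the per-kWh price below is an assumed placeholder, not a figure from the post.)

    Code:
    # Back-of-the-envelope energy cost for ~3 months of 24/7 encoding.
    # PRICE_PER_KWH is an assumed placeholder; substitute your local rate.

    PRICE_PER_KWH = 0.25      # assumed rate, not from the post
    HOURS = 24 * 90           # roughly 3 months of 24/7 usage

    for watts in (235, 330):
        kwh = watts / 1000.0 * HOURS
        print(f"{watts} W -> {kwh:.0f} kWh -> {kwh * PRICE_PER_KWH:.2f} in energy cost")

    # The ~95 W gap works out to about 205 kWh over the run,
    # i.e. roughly 51 at the assumed 0.25 per kWh.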

    I've been with Intel for 10 years, but I switched to AM4 in the last month. And to me, these benchmark figures are in line with expectations. They aren't going to reach 5 GHz (they never were).. but they've got good IPC, decent pricing (for the value options) and great multicore performance. The 4000 series will be even better, and those will be the ones that hold their value the most on the used market over the next 5 years.
     
  14. Jonotallica

    Jonotallica New Member

    Messages:
    2
    Likes Received:
    4
    GPU:
    780Ti 3GB
    ALL TDP numbers are BS. They all use their own logic and legalese excuses.. and try to say that it's about the thermal rating of the cooler, or the non-boosted state, or whatever other excuse they come up with (Intel being the worst, because it's a 5 GHz CPU on 14nm, for Pete's sake)..

    But yeah.. if you want to run them as high-performance parts (which is the whole point of this enthusiast market).. in other words, if you want to put a decent cooler on it and try to get as stable and high clocks as possible.. then ALL of the TDP numbers are garbage and simply for marketing purposes. Which is why it's so important that there are honest, independent reviewers out there.
     
  15. Fox2232

    Fox2232 Ancient Guru

    Messages:
    10,081
    Likes Received:
    2,354
    GPU:
    5700XT+AW@240Hz
    For your board, CPU support lists Ryzen 3000 CPUs as supported since BIOS P3.30.
     
    thesebastian and signex like this.

  16. Kaaskop

    Kaaskop Member

    Messages:
    31
    Likes Received:
    2
    GPU:
    ASUS STRIX 1080A8G
    Quite a bit more than you'd expect, tbh.

    I recently swapped over from 1080p60 to 1440p144. What a massive difference; everything is just so much smoother.

    It's like going from an HDD to an SSD, or from 480p to 1080p: once you're used to it, you cannot go back. It's the same with high refresh rates.
     
  17. Fuzzout

    Fuzzout New Member

    Messages:
    1
    Likes Received:
    1
    GPU:
    GTX 1070
    Well, wasn't this comment thread a joy to read :D

    I'm glad TSMC has brought AMD back into the game (RIP GlobalFoundries).
    Multicore systems will scale better into the future (just look at how the FX-8350 has aged rather well due to more threads per game/application being used now); this is because modern consoles all have increased core counts (you can draw a clear correlation between new console releases and better multithreading in games).
    With that said, I'll go with the Ryzen 9 3900X only because it costs way less than the equivalent Intel chip. Plus, I already have an AM4 board (built my current PC recently with the intent to slap a 3900X into it when the time comes).

    If you notice these benchmarks and think "AMD sucks" - Well you're wrong.
    If you notice the benchmarks here and think "HAH intel sucks!" - You're wrong too, get the hell out of here.

    Be glad that the mainline CPU manufacturers are trading blows... This means that in the future they will compete on pricing and speed (faster and cheaper CPUs for us, on a faster iteration cycle).

    Have a nice day, regardless of whether you're blue or red.
     
    Ricardo likes this.
  18. Aura89

    Aura89 Ancient Guru

    Messages:
    7,834
    Likes Received:
    1,013
    GPU:
    -
    Really curious here, which company are you accusing me of being a shill for?

    I call people out for the BS they post in AMD CPU threads, and call out their BS in AMD GPU threads, and call out their BS in Nvidia GPU threads.

    Am I a shill for AMD and Nvidia? ..... I think that'd be one very difficult thing to do; neither company would appreciate that. Am I just the best shill of all time, getting away with it all?

    Or....am I deep, deep undercover and I'm really a shill for.... DUN DUN DUNNNNNNNNN................................ Intel.......

    Yeah no, every single person who knows me on this thread would never even remotely say that, lol, that's just crazy talk.

    But then you post this:

    Which is pretty much the exact thing I would say and respond to, as you're 100% right there: the post you were replying to is utter BS.

    So....are we both shills....?

    You and I have a problem because you believe AMD would troll and bait their customers and competitors, and because you don't know what "indicate" means.

    Me, I would really hope AMD would never even try to troll and bait their customers and competitors, as that'd be a horrible, horrible PR situation and push people toward Nvidia (and potentially Intel, depending on whether it ruins the view of AMD as a whole company). And I do know what indicate means; indicate =/= statement. In other words, you can't "indicate" a price if you're going to give the exact price you're releasing at. It's that simple.

    So it seems to me we both like AMD; at least, your posts seem to indicate that. Even if you like Intel and Nvidia too, you don't have the general Intel fanboy superiority complex, hence your above-quoted reply to someone who does.

    So we appear to be, if you want to call it "sides", on the same side, yet you're now suggesting I'm a shill. So for what company? What company makes sense? lol
     
    Ricardo likes this.
  19. alanm

    alanm Ancient Guru

    Messages:
    9,224
    Likes Received:
    1,507
    GPU:
    Asus 2080 Dual OC
    Couldn't give a rat's ass about ultra-low-res gaming. Intel's Swiss-cheese vulnerabilities are the main reason to move to these AMD chips. The extra cores and threads at non-extortionate prices are just the icing on the cake.
     
    ZXRaziel and Aura89 like this.
  20. Ricardo

    Ricardo Member Guru

    Messages:
    114
    Likes Received:
    71
    GPU:
    1050Ti 4GB
    While I understand the criticism of low-res benchmarks, people need to understand that they are necessary to highlight the differences between CPUs when not bottlenecked by the GPU. This isn't a benchmark of "best gaming experience", but a benchmark of "how far can the CPU go".

    Just look at AC: Origins - it's a game that relies heavily on the CPU, even though it's of the same generation as other games that are way lighter. Games tend to get heavier on the CPU as time passes, so knowing that your CPU has power to spare is useful, as it is indicative of better longevity.

    Obviously, those benches are only useful if your CPU is close to bottlenecking your GPU; in other words, you should only take them into consideration if you're running a very high-end GPU (e.g. Radeon VII or 2080+). Any other scenario is purely synthetic and irrelevant in the real world, since your GPU will choke waaaaay before your CPU even starts to flex its muscles.
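
    (A toy way to picture that in Python; the caps below are invented numbers, purely to illustrate why dropping the resolution exposes the CPU limit.)

    Code:
    # Toy model: delivered FPS is roughly capped by the slower of the CPU and GPU.
    # Lowering resolution mainly raises the GPU cap, which exposes the CPU cap.
    # All numbers are invented for illustration.

    def delivered_fps(cpu_cap, gpu_cap):
        return min(cpu_cap, gpu_cap)

    CPU_CAP = 160  # hypothetical: frames/s this CPU can prepare in a given game

    for res, gpu_cap in [("4K", 70), ("1440p", 120), ("1080p", 180), ("720p", 300)]:
        bound = "GPU-bound" if gpu_cap < CPU_CAP else "CPU-bound"
        print(f"{res:>6}: {delivered_fps(CPU_CAP, gpu_cap)} fps ({bound})")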
     
    Tarkan2467 likes this.
