AMD Ryzen 7 3700X & Ryzen 9 3900X review with benchmarks leaks out

Discussion in 'Frontpage news' started by Rich_Guy, Jul 5, 2019.

  1. Truder

    Truder Ancient Guru

    Messages:
    2,392
    Likes Received:
    1,426
    GPU:
    RX 6700XT Nitro+
    Well, this thread certainly is full of, how can I say... passionate people, but I think it's safe to say people are definitely excited about tomorrow's launch.

    These leaked benches show one thing for certain: even while not hitting the absolute highs, these CPUs still offer amazing performance at exceptional value. Furthermore, they are going to be capable at so many more tasks; heavy multi-tasking workloads are definitely going to be possible with these new processors.

    3700X looks to be a potential great buy so far :D
     
    ZXRaziel likes this.
  2. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,855
    Likes Received:
    442
    GPU:
    RTX 3080
    Ha, very good in lots of ways!

    My take on these leaked benchmarks: I'm happy they're at 720p, because that shows the difference for high-fps gamers on high-Hz monitors. I also think these new AMD CPUs have not performed particularly well: the 7700K is faster in a lot of titles, and there's only one title where AMD shines (Assassin's Creed Origins), and I'm not sure how representative that will be going forward. We can say these new AMD CPUs are more future-proof than the 7700K, but I wanted them to at least equal or surpass it in current games. I don't think these new AMD CPUs are gonna be a good choice for high-fps / high-refresh-rate gamers, although that will depend on what the overclocked scores look like; I'd need to see those to make a final judgement.
     
    -Tj- likes this.
  3. ZXRaziel

    ZXRaziel Master Guru

    Messages:
    425
    Likes Received:
    134
    GPU:
    Nvidia
    Either way, I can see a very decent performance increase compared to the previous generation, 10 to 20% depending on the software used. That's nothing to be ashamed of.
     
    BReal85 likes this.
  4. FranciscoCL

    FranciscoCL Master Guru

    Messages:
    267
    Likes Received:
    59
    GPU:
    RTX 3080 Ti
    Curious... the R7 (not R9) 3700X (65W TDP) has higher power consumption than the R7 2700X (105W TDP).
     

  5. Jonotallica

    Jonotallica Guest

    Messages:
    2
    Likes Received:
    4
    GPU:
    780Ti 3GB
    Have a look at the Handbrake scores for the 3900X and 3700X, and then look at the power consumption numbers compared to the 9900K.

    The gaming performance will be 50/50: in some games Intel will have the edge, in others it won't. It'll depend on whether or not the game utilizes multiple cores efficiently, and that will improve over the next few years as well. I'm curious to see the numbers in Battlefield V, as that one utilizes high-core-count CPUs quite well. I imagine Intel will be ahead, but by a small margin compared to the 2700X.

    But then look at the power consumption and combine that with CPU-heavy tasks like encoding. The 3700X and 9900K are an apples-to-apples comparison... with one clocked way higher (using 100W more power)... and it's slower in Handbrake. Ouch. The 3900X completely craps all over it and STILL uses less power.

    The low resolution is there to exaggerate the differences, so I really don't see what's worth arguing about. At 1080p, there'll be a difference in some games; at 1440p, there will be little difference between them. But the main thing for me is that I ran probably 1000 hours of Handbrake last year, roughly 3 months straight of 24/7 usage... and trust me, 100W matters when it comes to the power bill. 235W running 24/7 will save quite a bit on the bill compared to 330W, plus the Intel costs more in the first place. Intel at this point is really only an option if you use your PC like an Xbox, pairing the highest-end graphics cards like a 2080 Ti with 1080p. Otherwise, in pretty much every other way, AMD is now the option.
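
    For anyone who wants to sanity-check that, here's the back-of-envelope math in Python. The 235W/330W figures are the ones above; the ~3 months of 24/7 usage and the $0.15/kWh rate are just my assumptions, so plug in your own tariff:

        # Back-of-envelope: cost of ~3 months of 24/7 encoding (assumed duration/rate)
        HOURS = 3 * 30 * 24      # ~2160 hours of round-the-clock Handbrake (assumption)
        RATE = 0.15              # $/kWh -- placeholder, use your local tariff

        def cost(watts):
            """Electricity cost of drawing `watts` continuously for HOURS."""
            return watts / 1000 * HOURS * RATE

        for name, watts in [("3900X system", 235), ("9900K system", 330)]:
            print(f"{name}: ${cost(watts):.0f}")
        print(f"Difference: ${cost(330) - cost(235):.0f}")  # ~$31 at these assumed numbers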

    I was with Intel for 10 years, but I switched to AM4 last month. And to me, these benchmark figures are in line with expectations. They aren't going to reach 5GHz (they never were), but they've got good IPC, decent pricing (for the value options) and great multicore performance. The 4000 series will be even better, and those will be the ones that hold their value the most on the used market over the next 5 years.
     
  6. Jonotallica

    Jonotallica Guest

    Messages:
    2
    Likes Received:
    4
    GPU:
    780Ti 3GB
    ALL TDP numbers are BS. Every vendor uses its own logic and legalese excuses, claiming it's about the thermal rating of the cooler, or the power draw when not boosted, or whatever. (Intel is the worst offender, because it's a 5GHz CPU on 14nm, for Pete's sake.)

    But yeah, if you want to run them as high-performance parts (which is the whole point of this enthusiast market), in other words, if you want to put a decent cooler on one and chase the highest stable clocks possible, then ALL of the TDP numbers are garbage and exist purely for marketing. Which is why it's so important that there are honest, independent reviewers out there.
     
  7. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    For your board, the CPU support list shows Ryzen 3000 CPUs as supported since BIOS P3.30.
     
    thesebastian and signex like this.
  8. Kaaskop

    Kaaskop Member

    Messages:
    33
    Likes Received:
    3
    GPU:
    MSI RTX2080 Duke
    Quite a bit more than you'd expect, tbh.

    I recently swapped over from 1080p60 to 1440p144. What a massive difference; everything is just so much smoother.

    It's like going from an HDD to an SSD, or from 480p to 1080p: once you're used to it, you cannot go back. It's the same with high refresh rates.
     
  9. Fuzzout

    Fuzzout Guest

    Well, wasn't this comment thread a joy to read :D

    I'm glad TSMC has brought AMD back into the game (RIP GlobalFoundries).
    Multicore systems will scale better into the future (just look at how the FX-8350 has aged rather well, now that games and applications use more threads); this is because modern consoles all have higher core counts (you can draw a clear correlation between new console releases and better multithreading in games).
    With that said, I'll go with the Ryzen 9 3900X, mostly because it costs way less than the equivalent Intel chip. Plus, I already have an AM4 board (I built my current PC recently with the intent to slap a 3900X into it when the time comes).

    If you look at these benchmarks and think "AMD sucks", well, you're wrong.
    If you look at the benchmarks here and think "HAH, Intel sucks!", you're wrong too; get the hell out of here.

    Be glad that the main CPU manufacturers are trading blows... It means that in the future they will compete on pricing and speed (faster and cheaper CPUs for us, on a faster iteration basis).

    Have a nice day, regardless of whether you're blue or red.
     
    Ricardo likes this.
  10. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    Really curious here, which company are you accusing me of being a shill for?

    I call people out for their BS in AMD CPU threads, and I call out the BS posted in AMD GPU threads, and in Nvidia GPU threads too.

    Am I a shill for both AMD and Nvidia? ..... I think that'd be a very difficult thing to pull off; neither company would appreciate it. Or am I just the best shill of all time, getting away with it all?

    Or... am I deep, deep undercover, and I'm really a shill for... DUN DUN DUNNNNNNNNN... Intel.......

    Yeah, no. Every single person who knows me on this forum would never even remotely say that, lol. That's just crazy talk.

    But then you post this:

    Which is pretty much the exact thing I would say and respond to, as you're 100% right there: the post you were replying to is utter BS.

    So... are we both shills...?

    You and I have a problem because you believe AMD would troll and bait their customers and competitors, and because you don't know what "indicate" means.

    Me, I would really hope AMD would never even try to troll and bait their customers and competitors, as that'd be a horrible, horrible PR situation and push people toward Nvidia (and potentially Intel, depending on whether it ruined the view of AMD as a whole company). And I do know what indicate means: indicate =/= statement. In other words, you can't merely "indicate" a price if you're going to give the exact price you're releasing at. It's that simple.

    So it seems to me we both like AMD; at least, your posts seem to indicate that. Even if you like Intel and Nvidia too, you don't have the typical Intel-fanboy superiority complex, hence your quoted reply above to someone who does.

    So we appear to be, if you want to call it "sides", on the same side, yet you're now suggesting I'm a shill. For what company? What company even makes sense? lol
     
    Ricardo likes this.

  11. alanm

    alanm Ancient Guru

    Messages:
    12,233
    Likes Received:
    4,435
    GPU:
    RTX 4080
    Couldn't give a rat's ass about ultra-low-res gaming. Intel's swiss-cheese vulnerabilities are the main reason to move to these AMD chips. The extra cores and threads at non-extortionate prices are just the icing on the cake.
     
    ZXRaziel and Aura89 like this.
  12. Ricardo

    Ricardo Member Guru

    Messages:
    165
    Likes Received:
    113
    GPU:
    1050Ti 4GB
    While I understand the criticism of low-res benchmarks, people need to understand that they are necessary to highlight the differences between CPUs when not bottlenecked by the GPU. This isn't a benchmark of "best gaming experience", but a benchmark of "how far can the CPU go".

    Just look at AC: Origins: it's a game that relies heavily on the CPU, even though it's from the same generation as other, much lighter games. Games tend to get heavier on the CPU as time passes, so knowing your CPU has power to spare is useful, as it indicates better longevity.

    Obviously, those benches are only useful if your CPU is close to bottlenecking your GPU; in other words, you should only take them into consideration if you're running very high-end GPUs (e.g. Radeon VII or 2080+). Any other scenario is purely synthetic and irrelevant in the real world, since your GPU will choke waaaaay before your CPU even starts to flex its muscles.
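
    If it helps to see that logic in numbers, here's a toy model of why low-res tests isolate the CPU (all the FPS figures are invented purely for illustration):

        # Toy model: delivered FPS is capped by whichever side is slower per frame
        def delivered_fps(cpu_fps, gpu_fps):
            # cpu_fps: frames/s the CPU can prepare; gpu_fps: frames/s the GPU can render
            return min(cpu_fps, gpu_fps)

        cpu = 160  # hypothetical CPU ceiling -- roughly resolution-independent
        for res, gpu in [("720p", 400), ("1080p", 220), ("1440p", 130), ("4K", 60)]:
            side = "CPU" if cpu < gpu else "GPU"
            print(f"{res:>6}: {delivered_fps(cpu, gpu):3.0f} fps ({side}-bound)")
        # Only the CPU-bound rows (720p and 1080p here) tell you anything about the CPU.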
     
    Tarkan2467 likes this.
  13. Ambient

    Ambient Member

    Messages:
    19
    Likes Received:
    4
    GPU:
    8
    Testing at 720p resolution?????? LoL!
     
  14. TurboMan

    TurboMan Guest

    Messages:
    161
    Likes Received:
    7
    GPU:
    EVGA RTX 3080 FTW 3
    Yeah, if you want to test CPU performance. What's the point of testing at 4K, when CPU performance means nothing since it's all bottlenecked by the GPU at that resolution?
     
  15. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,941
    Likes Received:
    1,239
    GPU:
    .
    That's why a GTX 1080 Ti was a good choice.
    Then you should look at optimised synthetic benchmarks only, like AIDA64.
    Except it's still tested in non-real scenarios. If these tests were all at 1080p, no one would complain. Why not test at an even lower resolution then? Maybe VGA, 640*480? I know most modern games don't support such low resolutions... for good reason... However, at lower resolutions the graphics drivers, as well as the game's rendering path, may behave differently and not reflect real CPU performance either. Low-resolution paths may be better tuned for low-quality settings and low-end hardware configurations. No one ever thinks about that... Finally, presenting performance as average FPS alone is wrong, especially when the frame rates are so high, period.
    That's the point of CPU reviews... They simply missed the target.
    4K in-game CPU benchmarks are useless, no one doubts that. But so are all tests involving surreal system configurations, like 720p or lower. And that's not only for gaming: the same goes for productivity testing, like benchmarking AVC or HEVC video encoding and then testing it with a 640*480 video source (cough... cough... TPU?)...
     

  16. alanm

    alanm Ancient Guru

    Messages:
    12,233
    Likes Received:
    4,435
    GPU:
    RTX 4080
    And what's the point of testing at resolutions that no one will use? Expecting 3700X owners with 1080 Tis to plug their shiny gear into obsolete displays that aren't even sold anymore? :rolleyes: I think the editors at PCGamesHardware.de simply forgot to tell their reviewers to stop using this dumb res years ago.

    Yes, 4K CPU performance is also 'almost' pointless, but not completely. Some multi-threaded games are now showing benefit from CPU performance even at 4K. Sure, the vast majority of games won't benefit much, but it's still far more useful to know than 720p results.
     
    ZXRaziel, Alessio1989 and Aura89 like this.
  17. Tarkan2467

    Tarkan2467 Guest

    Messages:
    758
    Likes Received:
    4
    GPU:
    EVGA GTX 1080 FTW
    Personally, I would love to see gaming tests at 640x480 (or lower!) to minimize the influence of the GPU as much as possible. I also want to see synthetic benchmark tests, as well as gaming tests at the usual resolutions (1080p, 1440p, 4K).

    I like the low-resolution tests because in the games I currently play, I am CPU-capped. I have a 240Hz monitor and it is awesome when I can run games at FPS numbers close to that refresh rate. Lowering the resolution artificially is an imperfect simulation of this, but it is a data point I like to see in CPU reviews.

    It's not like we have to litigate the pros and cons of every single testing methodology and then choose ONLY one. Reviewers can run as many or as few scenarios as they want in their testing suites. Then people can examine the results they consider the most relevant to them.

    Excited for Sunday.
     
    Ricardo and ZXRaziel like this.
  18. BReal85

    BReal85 Master Guru

    Messages:
    487
    Likes Received:
    180
    GPU:
    Sapph RX 570 4G ITX
    We will get the real picture when we measure a mix of 15-20 games (ones that don't take advantage of more cores and ones that do) and then make an average. I'm sure there will be minimal difference at FHD with a 2080 Ti equipped. And as you said, the number of gamers playing at FHD with a 2080 Ti is... maybe 0.1% of the whole PC gaming community.
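
    For what it's worth, the standard way to average a mixed suite like that is a geometric mean of per-game ratios; a quick sketch in Python (the FPS pairs are made up):

        from math import prod  # Python 3.8+

        # Made-up FPS pairs (CPU A, CPU B) across a mixed suite of games
        results = {"Game 1": (140, 150), "Game 2": (95, 90), "Game 3": (210, 205)}

        ratios = [a / b for a, b in results.values()]
        geomean = prod(ratios) ** (1 / len(ratios))
        print(f"CPU A averages {geomean:.1%} of CPU B's performance")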
     
    Last edited: Jul 6, 2019
  19. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    987
    Likes Received:
    370
    GPU:
    KFA2 RTX 3090
    Disappointing for now... I hope it'll get better with the full reviews, because 3.x+ clock speeds are meaningless, especially since last-gen motherboards all do their own version of auto-overclocking.
    The reason I went 9900K was the clock speed, but I already ran into a lack of CPU PCIe lanes.
    We'll see how overclocking works out for AMD; I hope it's great, because the Z390 platform frankly sucks (I have some weird 360MByte/s bottleneck while transferring large files that I didn't have on X99, and when you run into problems with 10Gbit on Win10, good luck finding info ><).
     
    Last edited: Jul 6, 2019
  20. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    987
    Likes Received:
    370
    GPU:
    KFA2 RTX 3090
    People who have too much money and no sense go all-in on 4K; the real gamers play at 1080p or 1440p, and the rest do whatever they think is good but really isn't.
    I can read street names on this; I see everything almost as sharp as the still picture: https://www.testufo.com/photo#photo=toronto-map.png&pps=1920&pursuit=0&height=0
    Try that on a 4K 60Hz display for a laugh.
     
