AMD Ryzen 7 3700X & Ryzen 9 3900X review with benchmarks leaks out

Discussion in 'Frontpage news' started by Rich_Guy, Jul 5, 2019.

  1. Jonotallica

    Jonotallica New Member

    Messages:
    2
    Likes Received:
    4
    GPU:
    780Ti 3GB
    Have a look at the Handbrake scores for the 3900X and 3700X and then look at the power consumption numbers compared to the 9900K.

    The gaming performance will be 50/50: in some games Intel will have the edge, in others it won't. It'll depend on whether or not the game uses multiple cores efficiently, and that will improve over the next few years as well. I'm curious to see the numbers in Battlefield V, as that one uses high core counts quite well. I imagine Intel will still be ahead, but by a smaller margin than it was against the 2700X.

    But then look at the power consumption and combine that with "CPU tasks" like encoding. The 3700X vs the 9900K is an apples-to-apples comparison... with one clocked way higher (and drawing 100W more power)... and it's the slower one in Handbrake. Ouch. The 3900X completely craps all over it and STILL uses less power.

    The resolution thing is there to exaggerate the differences; I really don't see what's worth arguing about. At 1080p there'll be a difference in some games. At 1440p there will be little difference between them. But the main thing for me is that I ran probably 1000 hours of Handbrake last year, roughly 3 months straight of 24/7 usage... and trust me, 100W matters when it comes to the power bill. 235W running 24/7 saves quite a bit compared to 330W, and the Intel setup costs more in the first place. Intel at this point is really only an option if you use your PC like an Xbox, pairing it with the highest-end graphics cards like a 2080 Ti at 1080p. Otherwise, in pretty much every other way, AMD is now the option.
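
    As a rough sketch of that power-bill point (the electricity rate below is just an assumption, plug in your own tariff; the 235W and 330W figures are the system numbers quoted above):

        # Back-of-envelope cost of ~3 months of 24/7 encoding at two system power draws.
        # The 0.15/kWh rate is an assumption; substitute your local tariff.

        HOURS = 24 * 90          # roughly 3 months running around the clock
        RATE_PER_KWH = 0.15      # assumed electricity price per kWh

        def cost(watts: float) -> float:
            """Energy cost of a constant load of `watts` over HOURS at RATE_PER_KWH."""
            return watts / 1000 * HOURS * RATE_PER_KWH

        print(f"330 W system: {cost(330):.2f}")                      # ~106.92
        print(f"235 W system: {cost(235):.2f}")                      # ~76.14
        print(f"Saved over ~3 months: {cost(330) - cost(235):.2f}")  # ~30.78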

    I'd been with Intel for 10 years, but switched to AM4 in the last month. And to me, these benchmark figures are in line with expectations. They aren't going to reach 5GHz (they never were)... but they've got good IPC, decent pricing (for the value options) and great multicore performance. The 4000 series will be even better, and those will be the ones that hold their value the most on the used market over the next 5 years.
     
  2. Jonotallica

    Jonotallica New Member

    Messages:
    2
    Likes Received:
    4
    GPU:
    780Ti 3GB
    ALL TDP numbers are BS. Each vendor uses its own logic and legalese excuses, claiming it's about the thermal rating of the cooler, or the non-boosted clocks, or whatever other excuse fits (Intel being the worst, because it's a 5GHz CPU on 14nm, for Pete's sake)...

    But yeah, if you want to run them as high-performance parts (which is the whole point of this enthusiast market), in other words if you put a decent cooler on and try to get clocks as high and stable as possible, then ALL of the TDP numbers are garbage and exist purely for marketing. Which is why it's so important that there are honest, independent reviewers out there.
     
  3. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,762
    Likes Received:
    2,204
    GPU:
    5700XT+AW@240Hz
    For your board, CPU support lists Ryzen 3000 CPUs as supported since BIOS P3.30.
     
    thesebastian and signex like this.
  4. Kaaskop

    Kaaskop Member

    Messages:
    31
    Likes Received:
    2
    GPU:
    ASUS STRIX 1080A8G
    Quite a bit more than you'd expect, tbh.

    I recently swapped from 1080p60 to 1440p144. What a massive difference; everything is just so much smoother.

    It's like going from an HDD to an SSD, or from 480p to 1080p: once you're used to it, you can't go back. It's the same with high refresh rates.
     

  5. Fuzzout

    Fuzzout New Member

    Messages:
    1
    Likes Received:
    1
    GPU:
    GTX 1070
    Well, wasn't this comment thread a joy to read :D

    I'm glad TSMC has brought AMD back into the game (RIP GlobalFoundries).
    Multicore systems will scale better into the future (just look at how the FX-8350 has aged rather well now that games and applications use more threads); this is because modern consoles all have higher core counts (you can draw a clear correlation between new console releases and better multithreading in games).
    With that said, I'll go with the Ryzen 9 3900X, simply because it costs way less than the equivalent Intel chip. Plus, I already have an AM4 board (built my current PC recently with the intent to slap a 3900X into it when the time comes).

    If you look at these benchmarks and think "AMD sucks" - well, you're wrong.
    If you look at these benchmarks and think "HAH, Intel sucks!" - you're wrong too, get the hell out of here.

    Be glad that the mainline CPU manufacturers are trading blows... It means they will compete on pricing and speed in the future (faster and cheaper CPUs for us, on a faster iteration cycle).

    Have a nice day, regardless of whether you're blue or red.
     
    Ricardo likes this.
  6. Aura89

    Aura89 Ancient Guru

    Messages:
    7,713
    Likes Received:
    952
    GPU:
    -
    Really curious here: which company are you accusing me of being a shill for?

    I call out BS posted in AMD CPU threads, I call out BS posted in AMD GPU threads, and I call out BS posted in Nvidia GPU threads.

    Am I a shill for both AMD and Nvidia? ..... I think that would be a very difficult thing to pull off; neither company would appreciate it. Am I just the best shill of all time, getting away with it all?

    Or... am I deep, deep undercover, and I'm really a shill for.... DUN DUN DUNNNNNNNNN................................ Intel.......

    Yeah, no. Every single person who knows me on this forum would never even remotely say that, lol. That's just crazy talk.

    But then you post this:

    Which is pretty much exactly what I would say and respond to, because you're 100% right there: the post you were replying to is utter BS.

    So... are we both shills?

    You and I have a problem because you believe AMD would troll and bait their customers and competitors, and because you don't know what "indicate" means.

    Me, I would really hope AMD would never even try to troll and bait their customers and competitors, as that would be a horrible, horrible PR situation and push people toward Nvidia (and potentially Intel, depending on whether it ruins the view of AMD as a whole company). And I do know what "indicate" means: an indication is not a statement. In other words, you can't "indicate" a price if you're giving the exact price you're releasing at. It's that simple.

    So it seems to me we both like AMD; at least, your posts seem to indicate that. Even if you like Intel and Nvidia too, you don't have the usual Intel-fanboy superiority complex, hence your quoted reply above to someone who does.

    So if you want to call it "sides", we appear to be on the same side, yet you're now suggesting I'm a shill. For which company? What company makes sense? lol
     
    Ricardo likes this.
  7. alanm

    alanm Ancient Guru

    Messages:
    8,977
    Likes Received:
    1,326
    GPU:
    Asus 2080 Dual OC
    Couldn't care less about ultra-low-res gaming. Intel's Swiss-cheese vulnerabilities are the main reason to move to these AMD chips. The extra cores and threads at non-extortionate prices are just the icing on the cake.
     
    ZXRaziel and Aura89 like this.
  8. Ricardo

    Ricardo Active Member

    Messages:
    90
    Likes Received:
    60
    GPU:
    1050Ti 4GB
    While I understand the criticism of low-res benchmarks, people need to understand that they are necessary to highlight the differences between CPUs when they're not bottlenecked by the GPU. This isn't a benchmark of "best gaming experience", but a benchmark of "how far can the CPU go".

    Just look at AC: Origins - it's a game that relies heavily on the CPU, even though it's from the same generation as games that are far lighter. Games tend to get heavier on the CPU as time passes, so knowing that your CPU has power to spare is useful, as it indicates better longevity.

    Obviously, those benches are only useful if your CPU is close to bottlenecking your GPU, or in other words: you should only take them into consideration if you're running a very high-end GPU (e.g. Radeon VII or 2080+). Any other scenario is purely synthetic and irrelevant in the real world, since your GPU will choke waaaaay before your CPU even starts to flex its muscles.
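
    To put that bottleneck point another way, here's a minimal sketch (all the FPS numbers are invented for illustration, not taken from any review): the frame rate you actually see is capped by whichever of the CPU or GPU is slower, which is why high resolutions hide CPU differences and low resolutions expose them.

        # Toy model: observed FPS is limited by the slower of the CPU and the GPU.
        # All numbers are hypothetical, purely to illustrate the bottleneck effect.

        def observed_fps(cpu_fps: float, gpu_fps: float) -> float:
            """The displayed frame rate is capped by whichever component is slower."""
            return min(cpu_fps, gpu_fps)

        cpu_a, cpu_b = 180.0, 150.0   # hypothetical max FPS each CPU could feed the GPU

        for res, gpu_fps in [("720p", 300.0), ("1080p", 200.0), ("1440p", 120.0)]:
            print(f"{res}: CPU A {observed_fps(cpu_a, gpu_fps):.0f} fps, "
                  f"CPU B {observed_fps(cpu_b, gpu_fps):.0f} fps")
        # 720p:  180 vs 150 -> the 20% CPU gap is fully visible
        # 1080p: 180 vs 150 -> still visible
        # 1440p: 120 vs 120 -> GPU-bound, both CPUs look identical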
     
    Tarkan2467 likes this.
  9. Ambient

    Ambient New Member

    Messages:
    7
    Likes Received:
    0
    GPU:
    8
    Testing at 720p resolution?????? LoL!
     
  10. TurboMan

    TurboMan Member Guru

    Messages:
    159
    Likes Received:
    6
    GPU:
    1080 Ti Strix OC
    Yeah, if you want to test CPU performance. What's the point of testing at 4K, when CPU performance means nothing because it's all bottlenecked by the GPU at that resolution?
     

  11. Alessio1989

    Alessio1989 Maha Guru

    Messages:
    1,407
    Likes Received:
    236
    GPU:
    .
    That's why a GTX 1080 Ti was a good choice.
    Then you should look at optimised synthetic benchmarks only, like AIDA64.
    Except it's still tested in non-real scenarios. If these tests were all at 1080p, no one would complain. Why not test at an even lower resolution then? Maybe VGA, 640x480? I know most modern games don't support resolutions that low, for good reason... However, at lower resolutions the graphics drivers, as well as the game's rendering path, may behave differently and not reflect real CPU performance either; low-resolution code paths may be tuned for low-quality settings and low-end hardware. Nobody ever thinks about that. Finally, reporting performance as average FPS is wrong, especially when the frame rates are so high, period.
    That's the point of CPU reviews... They simply missed the target.
    In-game 4K CPU benchmarks are useless, no one doubts that. But so are all tests involving surreal system configurations, like 720p or lower. And that's not only for gaming: the same goes for productivity testing, like benchmarking AVC or HEVC video encoding with a 640x480 source (cough... cough... TPU?)...
     
  12. alanm

    alanm Ancient Guru

    Messages:
    8,977
    Likes Received:
    1,326
    GPU:
    Asus 2080 Dual OC
    And what's the point of testing at resolutions no one will use? Expecting 3700X owners with 1080 Tis to plug their shiny gear into obsolete displays that aren't even sold anymore? :rolleyes: I think the editors at PCGamesHardware.de simply forgot to tell their reviewers to stop using this dumb resolution years ago.

    Yes, 4K CPU performance is also 'almost' pointless, but not completely. Some multi-threaded games are now showing benefit from CPU performance even at 4K. Sure, the vast majority of games won't benefit much, but it's still far more useful to know than 720p results.
     
    ZXRaziel, Alessio1989 and Aura89 like this.
  13. Tarkan2467

    Tarkan2467 Master Guru

    Messages:
    758
    Likes Received:
    4
    GPU:
    EVGA GTX 1080 FTW
    Personally, I would love to see gaming tests at 640x480 (or lower!) to minimize the influence of the GPU as much as possible. I also want to see synthetic benchmark tests, as well as gaming tests at the usual resolutions (1080p, 1440p, 4K).

    I like the low-resolution tests because in the games I currently play, I am CPU-capped. I have a 240Hz monitor and it is awesome when I can run games at FPS numbers close to that refresh rate. Lowering the resolution artificially is an imperfect simulation of this, but it is a data point I like to see in CPU reviews.

    It's not like we have to litigate the pros and cons of every single testing methodology and then choose ONLY one. Reviewers can run as many or as few scenarios as they want in their testing suites. Then people can examine the results they consider the most relevant to them.

    Excited for Sunday.
     
    Ricardo and ZXRaziel like this.
  14. BReal85

    BReal85 Master Guru

    Messages:
    327
    Likes Received:
    102
    GPU:
    ASUS 270X DC2 TOP
    We will only get real results when we measure a mix of 15-20 games (ones that don't take advantage of more cores and ones that do) and then make an average, as sketched below. I'm sure there will be minimal difference in FHD with a 2080 Ti equipped. And as you said, the number of gamers playing at FHD with a 2080 Ti is... maybe 0.1% of the whole PC gaming community.
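
    A minimal sketch of that kind of averaging (the FPS numbers are placeholders, not real results; the geometric mean here is my assumption of a sensible summary, since it keeps a single outlier title from dominating):

        # Averaging FPS across a mixed suite of games (placeholder numbers only).
        from math import prod

        def geo_mean(values):
            """Geometric mean; keeps one outlier game from skewing the summary."""
            vals = list(values)
            return prod(vals) ** (1 / len(vals))

        suite = {                        # hypothetical FHD results for one CPU
            "lightly_threaded_game": 160.0,
            "heavily_threaded_game": 95.0,
            "middle_of_the_road":    120.0,
        }

        print(f"Arithmetic mean: {sum(suite.values()) / len(suite):.1f} fps")
        print(f"Geometric mean:  {geo_mean(suite.values()):.1f} fps")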
     
    Last edited: Jul 6, 2019
  15. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    215
    Likes Received:
    38
    GPU:
    MSI Gaming X 1080ti
    Disappointing for now... I hope it'll look better with the full review, because 3.x+ GHz clock speeds are meaningless, especially since last-gen motherboards all do their own version of auto-overclocking.
    The reason I went with the 9900K was the clock speed, but I already ran into a lack of CPU lanes.
    We'll see how overclocking works out for AMD; I hope it's great, because the Z390 platform frankly sucks (I have some weird 360 MB/s bottleneck while transferring large files that I didn't have on X99, and when you run into problems with 10Gbit on Win10, good luck finding info ><).
     
    Last edited: Jul 6, 2019

  16. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    215
    Likes Received:
    38
    GPU:
    MSI Gaming X 1080ti
    People who have too much money and no sense go all-in on 4K; the real gamers play at 1080p or 1440p, and the rest do whatever they think is good but really isn't.
    I can read street names on this; I see everything almost as sharp as the still picture: https://www.testufo.com/photo#photo=toronto-map.png&pps=1920&pursuit=0&height=0
    Try that on a 4K 60Hz display for a laugh.
     
  17. xg-ei8ht

    xg-ei8ht Ancient Guru

    Messages:
    1,781
    Likes Received:
    7
    GPU:
    1gb 6870 Sapphire
    @pegasus1 Well Said.

    You were not here for 4.5 billion years.
    You will live for maybe 80 years.
    Then you will never exist again.
    Yet you are willing to spend time arguing over something so irrelevant.
     
  18. Monchis

    Monchis Maha Guru

    Messages:
    1,304
    Likes Received:
    36
    GPU:
    GTX 950
    Jesus Christ, low-res gaming tests are the best way to predict which processor will perform better when new generations of graphics cards show up. If you're only interested in how they perform at 1440p on actual "real hardware", please knock yourselves out with the 1440p results and ignore the additional information those tests provide; it won't bite you, thanks.
     
    Ricardo likes this.
  19. DW75

    DW75 Maha Guru

    Messages:
    1,161
    Likes Received:
    565
    GPU:
    ROG GTX1080 Ti OC
    Just wait until the review threads go up tomorrow. Half of the posts will be from new members (oh, pardon me, paid shills and Intel employees with a post count of 1) stating how AMD sucks and Intel is the only choice if you game. Indeed, these people also seem convinced that the only thing a gamer does on a computer is game. What about Windows updates, program installs, uninstalls, file transfers, file extractions, web surfing, movie watching, music and media work, file uploads? Oh no, you also game? Well, your only choice is Intel then!!!!!! See how ridiculous this sounds?
     
    Ricardo, carnivore and Mesab67 like this.
  20. Chess

    Chess Master Guru

    Messages:
    276
    Likes Received:
    7
    GPU:
    ASUS GTX1080Ti Stri
    Real gamers? If you mean competitive online gamers, then yes, I think you have a point.
    Yet for slow-paced single-player games, I think 4K is a perfect choice if you can financially handle it. Immersion is key there ;).
    I tried the best of both worlds and went for 1440p at 144Hz G-Sync, but alas, this is still a TN panel. I miss the details, HDR and deeper colours in single-player games.
    And with 280+ games in my Steam library and gaming since the '90s, I'd say I'm a real gamer ;).
     
    alanm likes this.
