Review: AMD Ryzen 7 1800X processor

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 2, 2017.

  1. BLEH!

    BLEH! Ancient Guru

    Messages:
    6,092
    Likes Received:
    155
    GPU:
    Sapphire Fury
    These are tempting, but the marginally poorer gaming performance and the lack of dynamic overclocking headroom/the low clock ceiling are a worry for the high-clock single-threaded work I sometimes need. I'll see how the platform matures and whether Intel's prices drop before taking the plunge, I think.
     
  2. pato

    pato Member Guru

    Messages:
    176
    Likes Received:
    6
    GPU:
    MSI 3060TI
    I haven't read all pages, sorry if it's already been asked.
    Would it be possible to run a game benchmark while recording the CPU clocks of all the cores?
    I wonder how often the turbo clock is reached while gaming with the GPU in use, compared to rendering (Cinebench) without the GPU. Maybe that's the culprit?
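    If anyone wants to try this, here's a minimal sketch of per-core clock logging on Linux by polling /proc/cpuinfo while the benchmark runs. The helper names, sampling interval, and approach are my own choices, not something from the review:

```python
import re
import time

def parse_core_clocks(cpuinfo_text):
    """Pull the per-core 'cpu MHz' values out of /proc/cpuinfo text."""
    return [float(m) for m in re.findall(r"cpu MHz\s*:\s*([\d.]+)", cpuinfo_text)]

def log_clocks(duration_s=60.0, interval_s=0.5, path="/proc/cpuinfo"):
    """Sample per-core clocks at a fixed interval; returns one list per sample."""
    samples = []
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        with open(path) as f:
            samples.append(parse_core_clocks(f.read()))
        time.sleep(interval_s)
    return samples
```

    Run it in a second terminal during the game benchmark and compare how many samples hit the turbo clock against a Cinebench run.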
     
  3. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,121
    Likes Received:
    1,696
    GPU:
    GTX 1080 Ti
    While I'm not holding my breath that the current Ryzen will get any better (if it does, it would be through BIOS updates and some updates from AMD to Windows as well, but maybe only by 1%), future revisions of the same core will have a better base to build from. Future Ryzens will be better.
     
  4. elijahk33

    elijahk33 Member

    Messages:
    16
    Likes Received:
    0
    GPU:
    Sapphire R9 380X
    Found a review in which they use the RX 480, on the Vortez website (can't post links because I'm a newbie :bang: ).
    Gaming performance is really interesting...
     

  5. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,198
    Likes Received:
    1,938
    GPU:
    Zotac GTX980Ti OC
  6. pato

    pato Member Guru

    Messages:
    176
    Likes Received:
    6
    GPU:
    MSI 3060TI
    Ooohhh they do have memory latency comparison graphs there.
    The i7-7700K has 19.4 ns while the 1800X has 82 ns, and is thus even slower than an FX-8350, the second slowest in the test.

    I do hope that some upcoming Nvidia optimizations will provide a few % more performance.

    Has anybody found transcoding tests using the GPU in an AMD system while the CPU is also under load? I wonder if the memory latency rises once the PCIe bus is loaded.
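    For context on where latency figures like these come from: they are typically measured with a pointer chase, i.e. dependent loads through a random cycle so each access has to wait for the previous one. A toy sketch of the idea (real tools do this in C with working sets far larger than the caches; in Python the interpreter overhead dominates, so treat this as illustration only):

```python
import random
import time

def build_cycle(n, seed=0):
    """Random cyclic permutation: nxt[i] gives the next index in one big cycle."""
    order = list(range(n))
    random.Random(seed).shuffle(order)
    nxt = [0] * n
    for i in range(n):
        nxt[order[i]] = order[(i + 1) % n]
    return nxt, order[0]

def ns_per_hop(nxt, start, hops):
    """Chase the chain; each load depends on the previous, so this times latency, not bandwidth."""
    i = start
    t0 = time.perf_counter_ns()
    for _ in range(hops):
        i = nxt[i]
    t1 = time.perf_counter_ns()
    return (t1 - t0) / hops
```

    With a native implementation and a buffer much bigger than the last-level cache, the per-hop time converges on DRAM latency, which is what those comparison graphs report.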
     
  7. thatguy91

    thatguy91 Ancient Guru

    Messages:
    6,643
    Likes Received:
    99
    GPU:
    XFX RX 480 RS 4 GB
    That makes sense when you think about it, and not for any anti-competitive reasons either. It makes sense because the AMD CPU department would have been talking to the AMD GPU department, at least to the extent of having the drivers somewhat optimised for the CPU and its 16 threads.
     
  8. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,057
    Likes Received:
    273
    GPU:
    6800 XT
  9. MadGizmo

    MadGizmo Maha Guru

    Messages:
    1,396
    Likes Received:
    0
    GPU:
    MSI R9 290X 8GB 2560*1440
  10. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,815
    Likes Received:
    707
    GPU:
    Inno3D RTX 3090
    If there's really a combination of DPC latency issues with the (expected for a brand new architecture) lack of compiler, scheduler and driver optimizations, then you might be surprised. Especially by the scheduler/compiler combo.
     

  11. Forebode

    Forebode Member

    Messages:
    17
    Likes Received:
    0
    GPU:
    MSI GTX 980 4G 1530/8k
    Can we please post benches of the competition? If you're going to buy the 1800X to overclock it to replace your i7-xxxxK, best to show i7-xxxxK models overclocked as well.

    Currently the 1800X looks to compete with the 7700K, and it costs more (unless you need the cores).
     
  12. sverek

    sverek Ancient Guru

    Messages:
    6,073
    Likes Received:
    2,953
    GPU:
    NOVIDIA -0.5GB
    One big thing people get wrong about the 7700K vs. the 8-core Ryzen is the performance cap.
    The 7700K is already pushing its limits in GTA V or BF1 once you lift the GPU cap and disable vsync.
    CPU utilization on the 7700K sits above 90% and hits 100%, which causes stutter.
    Ryzen, on the other hand, still has headroom left, thanks to twice the number of cores.

    So while the 7700K might be a good option for gaming NOW, it might not be once Ryzen tunes up and even more CPU-hungry games show up.

    reference: https://youtu.be/BXVIPo_qbc4
     
  13. sverek

    sverek Ancient Guru

    Messages:
    6,073
    Likes Received:
    2,953
    GPU:
    NOVIDIA -0.5GB
    The 1700 (which is cheaper) owns the 7700K@5GHz in modern games that use 16 threads. So unless you buy a 7700K to play old games at 249 fps, which is faster than Ryzen's 215 fps, it doesn't make sense. Please apply ice to burn.

    https://youtu.be/V5RP1CPpFVE
     
  14. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,815
    Likes Received:
    707
    GPU:
    Inno3D RTX 3090
    I don't understand people. It's obvious that people who never cared before about getting higher core counts at lower per-core frequencies should still not care.

    On the other hand, we now have the choice to get what is basically 90% of a 6900K at more or less the price of the 7700K. That's the dilemma; I don't understand the rest, really.
     
  15. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,404
    Likes Received:
    921
    GPU:
    ASUS 3080 Strix H20
    Actually, someone who is on a team that develops/designs CPUs made a good video about the technicalities of such.

    You should watch it.
    Not sure which one it was, I'll have to look.

    Anyways, they were basically saying that Intel is not artificially holding back performance; in fact the ~5% from newer revisions is pretty normal, and getting more than that requires a lot of work.
    You can say they have a lot of money, but either way you put it, it doesn't make financial sense to try a radically new design every year just to squeeze out more performance.
    You have to look at it from a business standpoint.
    They've already failed on a future desktop release (icelake? idk which it was) and that costs a lot of money.

    So I'm going to point out that your assumption that Intel is holding back performance is untrue.

    True, they could add more cores (as power users, of course we want more), but it would be a performance regression in many cases.
    So much software is very limited in how well it scales across threads.
    Most software we use (especially general-purpose software) only makes use of a few threads at most.
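    The scaling limit described above is essentially Amdahl's law: if only a fraction p of the work can run in parallel, the best speedup on n cores is 1 / ((1 - p) + p/n). A quick sketch (the fractions are hypothetical, not measured):

```python
def amdahl_speedup(p, n):
    """Ideal speedup on n cores when fraction p of the work is parallelizable."""
    return 1.0 / ((1.0 - p) + p / n)

# A workload with half its work parallelizable gains little past a few cores:
# amdahl_speedup(0.5, 4) == 1.6, amdahl_speedup(0.5, 16) is only about 1.88
```

    That gap between 4 and 16 cores is why more cores alone don't help most desktop software.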

    The problem with adding more cores is more heat and more power.
    There's also the fact that as more cores are added, clock speeds drop sharply.
    Now why would average gamer Joe need a 16-core 1.8 GHz CPU?
    In that scenario he would see a very large performance regression compared to a quad core.

    Cost also needs to be factored in.
    It would require much better heatsinks, a better PSU, etc., and for end consumers, most of whom just want cheap, this doesn't make much sense.

    In the mobile market, there's a reason most chips only have dual cores: the same reasons as above.
    Laptops outsell desktop variants globally.

    Anyways, another reason we aren't getting 8 cores as the minimum is the lack of good software.
    I know many people say hardware needs to catch up (true for GPUs, really), but these days so much software will only use 2-4 threads at most, even though we've had 8-thread CPUs in the consumer market for nearly a decade now.

    Now who is at fault for that? Let me guess, Intel.. lol

    I think your personal dislike of Intel is keeping you from thinking logically.
    My 2c
     

  16. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,941
    Likes Received:
    2,307
    GPU:
    HIS R9 290
    I agree. Let's not forget that Broadwell-E doesn't fare that well in games either, and in general these aren't gaming CPUs. When you look at everything but gaming, these CPUs are fantastic. Not flawless, but even the 1800X is a bargain for what it can do.

    But it only takes one flaw (one which seems to have room for improvement via updates) for people to deem a product bad.
     
  17. eclap

    eclap Banned

    Messages:
    31,495
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
  18. nicugoalkeper

    nicugoalkeper Master Guru

    Messages:
    898
    Likes Received:
    24
    GPU:
    ASUS GTX 1060 DUAL OC 6GB
    Like I said, AMD didn't make a WOW CPU compared to Intel.
    But these are nice ones: very good in multi-threaded work, slower but not by much in single-threaded.
    Still, like I said before, the next versions (2018-2019...) may be a game changer if Intel keeps making slow increments (I don't think it will).
    Too bad it doesn't do so well in games. Many say that we need more than 8 cores; from what I can see, we will only benefit from more than 8 cores once consoles (PS or Xbox) have more than 8 cores, and until then few games will support more cores. Many games are ports, and many more will come as ports (there's no reason to optimize beyond what you did for the console, no real money gain).
    For me the biggest gain is for the end user: competition is back, and this will only be good for us.
    In the near future I see more improvements to Ryzen and the same from Intel (though raw power will see slow increments in speed), and we'll pay less for what we get.
    This CPU confirms to me that with current technology we can't get much better performance for a few bucks (while the producer still makes a good profit).
     
  19. kostaspyrkas

    kostaspyrkas New Member

    Messages:
    3
    Likes Received:
    0
    GPU:
    16gb
    Think future proof

    Never judge by existing games when upgrading a CPU. It's something that will serve you for at least 5 years, so think future proof... I made this mistake in 2009, buying a dual core (Intel E8500) over a quad core (Intel Q5500), because everyone was saying at the time that you don't need 4 cores for gaming, only 2 with more Hz. Two years later my CPU was a huge bottleneck for the whole system...
    So my advice: judge by raw performance, which is better reflected in synthetic benchmarks. In the near future your investment will show its value...
    (The same thing happened with the R9 Fury: when first released it lagged behind even a GTX 970, although its raw performance was at GTX 980 Ti level. Two years later it even surpassed the GTX 1070 in many benchmarks...)
     
  20. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,941
    Likes Received:
    2,307
    GPU:
    HIS R9 290
    I understand what you're saying, but consider the events you just described:
    You bought a dual core in 2009. Multi-threaded CPUs have been readily available to the public since 2002. It took SEVEN years for dual cores to become obsolete. The Core 2 Quad (pretty much the first widely available x86 quad core) was released in 2006. Over TEN years later, the vast majority of games still play fine on an i5. I do agree that quad cores are starting to fall behind, but 16 threads? No, nobody is going to need that for gaming for at least a decade.

    Modern consoles use 8-cores and have been around since 2013, and likely won't be replaced within the next couple years. Even if next-gen consoles use 16 threads (which I doubt they will), we're not going to see too many games take advantage of a Ryzen 7's resources.

    I am not saying the R7 is bad, not at all. I would without question recommend it for workstation users, and it's "adequate" for gamers (especially at 1440p or higher). However, I would never recommend an R7 for someone who only games. By the time games cripple 8-threaded CPUs, there will be much better options.


    I still stand by my point that the R5 series will likely be the best AMD CPUs for gamers.
     
    Last edited: Mar 3, 2017
