Review: AMD Ryzen Threadripper 1950X

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 10, 2017.

  1. Taint3dBulge

    Taint3dBulge Maha Guru

    Messages:
    1,153
    Likes Received:
    12
    GPU:
    MSI Suprim X 4090
    Why do you not use Battlefield 1 as a game benchmark? It works CPU cores to death. Most people I talk to with a high-end GPU like a 1080 Ti say their CPUs, mine included, are always at 90-100% usage. Would be fun to see how this rips through BF1.
     
    Last edited: Aug 11, 2017
  2. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,392
    Likes Received:
    18,563
    GPU:
    AMD | NVIDIA
    I assume the question is, why do I not ... :) BF1 is fairly difficult to use as a reliable test. If I create a 30-second bench run, then in one test the FPS is lower and in the other higher, by margins of up to 5%.

    This is mainly due to random explosions like grenades and artillery hitting close by. Once that happens, the FPS drops significantly, which affects the measurement fairly extensively. So the main problem is that the explosions are random, and they can drop the FPS big-time.

    I am still looking at some other, friendlier scenes to use, but then again, seeking out a smoother scene just for the FPS results defeats the purpose of objective benchmarking.

    And then lastly, EA has a tendency to release new game patches that screw up all results with faster/slower FPS. Graphics cards you can fairly quickly swap out and test; complete test systems with CPUs, however, eat away days of time. All of this makes BF1 very time-consuming and complicated to test.
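    The run-to-run spread described above can be put in numbers with a quick sketch. The FPS values below are invented for illustration, not real measurements:

```python
# Sketch: quantifying run-to-run benchmark variance.
# The FPS numbers are made-up illustrative runs, not real data.
from statistics import mean

def run_variance(fps_runs):
    """Return (mean FPS, min-to-max spread as a percentage of the mean)."""
    avg = mean(fps_runs)
    spread = (max(fps_runs) - min(fps_runs)) / avg * 100
    return avg, spread

# Five hypothetical 30-second runs of the same scene:
runs = [112.0, 108.5, 113.2, 107.1, 111.4]
avg, spread = run_variance(runs)
print(f"mean {avg:.1f} FPS, run-to-run spread {spread:.1f}%")
```

    A spread of several percent between identical runs, as in this hypothetical set, is enough to swamp the small differences a CPU review is trying to measure.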
     
  3. Taint3dBulge

    Taint3dBulge Maha Guru

    Messages:
    1,153
    Likes Received:
    12
    GPU:
    MSI Suprim X 4090

    Touché, my friend. I guess I forgot about that game not having a friendly benchmark. I just wonder whether BF1 would use all the cores, and at what percentage they would run.
     
  4. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,392
    Likes Received:
    18,563
    GPU:
    AMD | NVIDIA
    BF1 does thread well, but by far does not utilize 100% of the cores (from memory).

    I can do a quick video recording of Afterburner showing that if you like; I guess that would be interesting for many to see. I still have the 12-core/24-thread part installed.
     

  5. Taint3dBulge

    Taint3dBulge Maha Guru

    Messages:
    1,153
    Likes Received:
    12
    GPU:
    MSI Suprim X 4090
    If you have the time, that would be interesting to see. Maybe that and another game or two that utilize more cores, or use a crapload of CPU power. Just for giggles, lol. :)
     
  6. Taint3dBulge

    Taint3dBulge Maha Guru

    Messages:
    1,153
    Likes Received:
    12
    GPU:
    MSI Suprim X 4090

    Well, if I recall correctly, when the first 3- and 4-core CPUs came out, hardly any games utilized more than 2 cores. So when we saw the first games use 4 cores or more, it was pretty cool. But we saw that within a year or two of the first few true quad-cores. We have had 6-10 core CPUs for a while now, and with the mainstream Coffee Lake i7 coming out as a 6-core part, I think many more games will use more cores, with lower per-core usage, to make games more fluid. Because high core usage really starts causing stutters. That's why I'm interested in how the extra cores actually do in a game like Battlefield compared to, say, a 4-core i7, even a high-clocked i7 7700K at, say, 5 GHz.
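    The cores-versus-clocks trade-off raised here can be sketched with Amdahl's law. The 60% parallel fraction below is an assumed, illustrative number, not a measurement of any game:

```python
# Sketch: Amdahl's law, illustrating why extra cores help less and less
# once a workload's serial fraction dominates. The parallel fraction
# (0.6) is an assumption for illustration, not a measured game value.
def amdahl_speedup(parallel_fraction, cores):
    """Speedup over one core for a workload with the given parallel share."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

p = 0.6  # assume 60% of the frame time parallelizes
for cores in (4, 8, 16):
    print(f"{cores:2d} cores -> {amdahl_speedup(p, cores):.2f}x speedup")
```

    Under this assumption, going from 4 to 16 cores gains far less than the 4x core count suggests, which is why a high-clocked quad-core can still keep pace in many games.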
     
  7. k3vst3r

    k3vst3r Ancient Guru

    Messages:
    3,702
    Likes Received:
    177
    GPU:
    KP3090
  8. Valken

    Valken Ancient Guru

    Messages:
    2,922
    Likes Received:
    903
    GPU:
    Forsa 1060 3GB Temp GPU
    I cannot wait to see the Vega review on both Intel and Ryzen/TR cpu platforms!
     
  9. cowie

    cowie Ancient Guru

    Messages:
    13,276
    Likes Received:
    357
    GPU:
    GTX
    No 3DMark Vantage CPU or 3DMark 11 CPU scores?
    I can't find any at all.
     
  10. xrodney

    xrodney Master Guru

    Messages:
    368
    Likes Received:
    68
    GPU:
    Saphire 7900 XTX
    Regarding the 7700K, some might consider that CPU not worth it because its socket/chipset is approaching EOL, and because it is already close to 100% utilization in some current games.
     

  11. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,392
    Likes Received:
    18,563
    GPU:
    AMD | NVIDIA
    They do not support threading at this level. At some point you gotta move on, brother ...

    :nerd:
     
  12. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,975
    Likes Received:
    4,342
    GPU:
    Asrock 7700XT
    Yeah... and that was 6 years ago. Back then, people were right - that wasn't a wise choice for a gaming CPU. By today's standards, it's just adequate (when you consider the IPC is worse than that of modern CPUs).

    Sure - nobody is arguing you'll have a bad gaming experience on TR, but getting TR with gaming as the #1 priority is a poor decision unless you have money to burn. I assure you, you could go back to dual channel and not notice the difference. Depending on what framerate and resolution you expect to play at, I bet you wouldn't notice the difference between quad and single channel either. The extra memory channels are really only useful in highly parallel tasks, which games are not.

    If you want a TR then great - AMD could sure use your business and marketshare. But, I think you'd be better off with a Ryzen 7.
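    The dual- versus quad-channel point above can be put in rough numbers. A sketch assuming DDR4-3200 memory (the speed grade is my assumption; the thread doesn't state one):

```python
# Sketch: theoretical peak memory bandwidth, dual- vs quad-channel.
# Assumes DDR4-3200 (3200 MT/s) with an 8-byte (64-bit) channel width;
# the memory speed is an illustrative assumption, not a platform spec.
def peak_bandwidth_gb_s(channels, transfers_per_s=3200e6, bytes_per_transfer=8):
    """Theoretical peak bandwidth in GB/s for the given channel count."""
    return channels * transfers_per_s * bytes_per_transfer / 1e9

dual = peak_bandwidth_gb_s(2)  # e.g. an AM4 Ryzen 7
quad = peak_bandwidth_gb_s(4)  # e.g. Threadripper
print(f"dual-channel: {dual:.1f} GB/s, quad-channel: {quad:.1f} GB/s")
```

    The quad-channel figure is double on paper, but as argued above, a game rarely streams enough data in parallel to come near either ceiling.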
     
  13. Emille

    Emille Guest

    Messages:
    785
    Likes Received:
    27
    GPU:
    1080 Ti Aorus Extreme

    There won't be an 8-core Threadripper. The Threadripper parts are called Ryzen Threadripper ... and are basically just higher core-count versions of the Ryzen CPUs, on a different chipset.

    There is no way AMD would cannibalise the sales of their brand-new Ryzen CPUs with an almost identical 8-core Threadripper part.
     
  14. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,975
    Likes Received:
    4,342
    GPU:
    Asrock 7700XT
    Yes there will be; that's what the 1900X is. It doesn't cannibalize the AM4 products, for several reasons:
    1. If it did, then you could say the same about i9s and Xeons.
    2. The CPU+X399 motherboard is significantly more expensive than an 1800X+X370 board. When you consider the price difference of the 1700+B350 board, that separates them even further.
    3. The 1900X will likely operate at a lower voltage and have better thermals, but it will also likely have worse idle wattage.
    4. The 1900X has quad-channel memory support; the 1800X only has dual.
    5. The 1900X has a much greater L3 cache.
    6. The 1900X has 60 on-chip PCIe lanes; the 1800X only has 16.
    7. The 1800X can be used in ITX builds; I'm not sure if we're even going to see micro ATX for socket TR4.
     
  15. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    Umm, AMD already announced an 8-core part, the 1900X.

    http://www.guru3d.com/articles-pages/tech-preview-ryzen-threadripper-1900x-1920x-and-1050x,1.html
     

  16. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
  17. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
  18. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,975
    Likes Received:
    4,342
    GPU:
    Asrock 7700XT
    That is kind of funny, though not too surprising. Nvidia seemed to get along relatively well with AMD's CPU division, albeit a little sour once AMD bought ATI (being the death of the nForce chipsets). AMD's CPUs help Nvidia's sales, so ultimately they're more happy about their success than worried.

    On a side note, considering they're direct competitors with AMD's graphics, Nvidia surprisingly doesn't seem to start many feuds, legal or otherwise. Nvidia is a very arrogant and selfish company, but they're not hostile against AMD. However... I can't say the opposite is true - AMD seems very salty and bitter about Nvidia.

    Meanwhile, I'm aware Nvidia does not get along that well with Intel, and vice versa. Nvidia created the ARM-based Tegra series because Intel wouldn't allow them to use the x86 license. Many modern Intel boards support Crossfire, but not SLI. Both companies have tried suing each other. Nvidia has tried to take Intel's server market share, and Intel tried to take Nvidia's mobile market share.
     
  19. nizzen

    nizzen Ancient Guru

    Messages:
    2,414
    Likes Received:
    1,149
    GPU:
    3x3090/3060ti/2080t
    1950x and 2x 1080ti, here we go :D

    Love from Norway :)
     
  20. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,392
    Likes Received:
    18,563
    GPU:
    AMD | NVIDIA
    Here you go:

    Battlefield 1 running on a GTX 1080 @ 2560x1440 with the Ryzen Threadripper 1920X (12c/24t) to demonstrate thread distribution, allocation, and utilization. BTW, the fairly low FPS is due to the video encoder recording at 100 Mbps.
     