
Review: AMD Ryzen Threadripper 1950X

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 10, 2017.

  1. Taint3dBulge

    Taint3dBulge Maha Guru

    Messages:
    1,126
    Likes Received:
    3
    GPU:
    1080Ti 2063/6180
    Why do you not use Battlefield 1 as a game benchmark? It works CPU cores to death. Most people I talk to who have a high-end GPU like a 1080 Ti say their CPUs, mine included, are always at 90-100% usage. Would be fun to see how this rips through BF1.
     
    Last edited: Aug 11, 2017
  2. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    35,889
    Likes Received:
    4,999
    GPU:
    AMD | NVIDIA
    I assume the question is, why I do not ... :) BF1 is fairly difficult to use as a reliable test. If I create a 30-second bench run, then in one test the FPS is lower and in the other higher, by margins of up to 5%.

    This is mainly due to random explosions like grenades and artillery hitting close by. Once that happens, the FPS drops significantly, which affects the measurement fairly extensively. So the main problem is that the explosions are random, and they can drop the FPS big-time.

    I am still looking at other, more friendly scenes to use, but then again, picking a smoother scene just to get cleaner FPS results defeats the purpose of objective benchmarking.

    And then lastly, EA has a tendency to release new game patches that screw up all results with faster/slower FPS. Graphics cards you can swap out and test fairly quickly; complete test systems with CPUs, however, eat away days of time. All of this makes BF1 very time-consuming and complicated to test.
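    To make the run-to-run problem concrete, here is a small sketch of how a ~5% spread swamps typical CPU-to-CPU deltas (the FPS numbers are invented for illustration, not measured results):

```python
import statistics

def run_spread(fps_runs):
    """Mean FPS and worst-case spread (%) across repeated bench runs."""
    mean = statistics.mean(fps_runs)
    spread_pct = (max(fps_runs) - min(fps_runs)) / mean * 100
    return mean, spread_pct

# Five hypothetical 30-second BF1 runs on identical settings:
runs = [118.0, 112.5, 116.2, 111.8, 117.4]
mean_fps, spread_pct = run_spread(runs)
# A ~5% run-to-run spread is larger than the gap between many CPUs,
# so a single run cannot separate them reliably.
```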
     
  3. Taint3dBulge

    Taint3dBulge Maha Guru

    Messages:
    1,126
    Likes Received:
    3
    GPU:
    1080Ti 2063/6180

    Touché my friend. I guess I forgot about that game not having a friendly benchmark mode. I just wonder whether BF1 would use all the cores, and at what percentage they would run.
     
  4. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    35,889
    Likes Received:
    4,999
    GPU:
    AMD | NVIDIA
    BF1 does thread well, but it does not utilize 100% of the cores by far (from memory).

    I can do a quick video recording of Afterburner showing that if you like; I guess that would be interesting for many to see. I still have the 12-core/24-thread part installed.
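    In the meantime, the same per-core picture can be pulled in software; a minimal sketch, assuming the third-party psutil library (an assumption on my part — the post above talks about recording it with Afterburner):

```python
def sample_core_usage(interval=1.0):
    """One snapshot of per-core utilization (%), like Afterburner's
    per-core graphs. Requires psutil (`pip install psutil`)."""
    import psutil
    return psutil.cpu_percent(interval=interval, percpu=True)

def summarize(per_core):
    """Count cores sitting near 100% and the average load across all of them."""
    busy = sum(1 for u in per_core if u >= 90.0)
    return {"cores": len(per_core), "near_maxed": busy,
            "avg": sum(per_core) / len(per_core)}

# e.g. on a 12C/24T part, BF1 might load a few threads hard
# and leave the rest mostly idle:
# summarize(sample_core_usage())
```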
     

  5. Taint3dBulge

    Taint3dBulge Maha Guru

    Messages:
    1,126
    Likes Received:
    3
    GPU:
    1080Ti 2063/6180
    If you have the time, that would be interesting to see. Maybe that and another game or two that utilize more cores, or use a crap-load of CPU power. Just for giggles, lol. :)
     
  6. angelgraves13

    angelgraves13 Maha Guru

    Messages:
    1,294
    Likes Received:
    276
    GPU:
    RTX 2080 Ti FE
    It will be a while before games can use this many cores. Now that Threadripper is out, we should begin seeing some optimization for high-core-count (HCC) parts in a year or two.

    If all the cores can be used properly, then clock speeds shouldn't matter as much as they do currently.
     
  7. Taint3dBulge

    Taint3dBulge Maha Guru

    Messages:
    1,126
    Likes Received:
    3
    GPU:
    1080Ti 2063/6180




    Well, if I recall correctly, when the first 3- and 4-core CPUs came out, hardly any games utilized more than 2 cores. So when the first games used 4 cores or more, it was pretty cool, and that happened within a year or two of the first true quad-cores. We have had 6-10 core CPUs for a while now, and with the mainstream Coffee Lake i7 coming out as a 6-core part, I think many more games will use more cores, keeping per-core usage lower to make games more fluid, because high core usage really starts causing stutters. That's why I'm interested in how the extra cores actually do in a game like Battlefield compared to, say, a 4-core i7, even a high-clocked i7-7700K at 5 GHz.
     
  8. k3vst3r

    k3vst3r Ancient Guru

    Messages:
    3,351
    Likes Received:
    16
    GPU:
    Zotac 1070 Amp
  9. Valken

    Valken Maha Guru

    Messages:
    1,459
    Likes Received:
    74
    GPU:
    Forsa 1060 3GB Temp GPU
    I cannot wait to see the Vega review on both Intel and Ryzen/Threadripper CPU platforms!
     
  10. cowie

    cowie Ancient Guru

    Messages:
    13,189
    Likes Received:
    281
    GPU:
    GTX
    No 3DMark Vantage CPU or 3DMark 11 CPU scores?
    I can't find any at all.
     

  11. xrodney

    xrodney Master Guru

    Messages:
    325
    Likes Received:
    46
    GPU:
    Aorus 1080ti xtreme
    As for the 7700K, someone might consider that CPU not worth it, because its socket/chipset is approaching EOL and because it is already close to 100% utilization in some current games.
     
  12. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    35,889
    Likes Received:
    4,999
    GPU:
    AMD | NVIDIA
    They do not support threading at this level. At some point you gotta move on, brother ...

    :nerd:
     
  13. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    4,338
    Likes Received:
    1,308
    GPU:
    HIS R9 290
    Yeah... and that was 6 years ago. Back then, people were right - that wasn't a wise choice for a gaming CPU. By today's standards, it's just adequate (when you consider the IPC is worse than modern CPUs).

    Sure, nobody is arguing you'll have a bad gaming experience on TR, but getting TR with gaming as the #1 priority is a poor decision unless you have money to burn. I assure you, you could go back to dual channel and not notice the difference. Depending on what framerate and resolution you expect to play at, I bet you wouldn't notice the difference between quad and single channel either. The extra memory channels are really only useful in highly parallel tasks, which games are not.
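    The channel argument is easy to put rough numbers on. A back-of-the-envelope sketch of theoretical peak DRAM bandwidth (DDR4-2666 is an assumed example speed, not a figure from this thread):

```python
def peak_bandwidth_gbs(channels, mega_transfers_per_s, bus_bytes=8):
    """Theoretical peak DRAM bandwidth in GB/s:
    channels x transfer rate (MT/s) x channel width (64-bit = 8 bytes)."""
    return channels * mega_transfers_per_s * bus_bytes / 1000.0

dual = peak_bandwidth_gbs(2, 2666)   # e.g. Ryzen 7 dual-channel DDR4-2666
quad = peak_bandwidth_gbs(4, 2666)   # Threadripper quad-channel
# quad is exactly 2x dual on paper; whether a game ever touches
# that extra headroom is the question.
```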

    If you want a TR then great; AMD could sure use your business and market share. But I think you'd be better off with a Ryzen 7.
     
  14. Emille

    Emille Master Guru

    Messages:
    785
    Likes Received:
    27
    GPU:
    1080 Ti Aorus Extreme

    There won't be an 8-core Threadripper. The Threadripper parts are called Ryzen Threadripper ... and are basically just a higher core-count version of the Ryzen CPUs, on a different chipset.

    There is no way AMD would cannibalise the sales of their already-new Ryzen CPUs with an almost identical 8-core Threadripper part.
     
  15. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    4,338
    Likes Received:
    1,308
    GPU:
    HIS R9 290
    Yes there will be; that's what the 1900X is. It doesn't cannibalize the AM4 products, for several reasons:
    1. If it did, then you could say the same about i9s and Xeons.
    2. The CPU+X399 motherboard is significantly more expensive than an 1800X+X370 board. When you consider the price difference of the 1700+B350 board, that separates them even further.
    3. The 1900X will likely operate at a lower voltage and have better thermals, but it will also likely have worse idle power draw.
    4. The 1900X has quad-channel memory support; the 1800X only has dual.
    5. The 1900X has a much larger L3 cache.
    6. The 1900X has 60 on-chip PCIe lanes; the 1800X only has 16.
    7. The 1800X can be used in ITX builds; I'm not sure if we're even going to see micro ATX for socket TR4.
     

  16. D3M1G0D

    D3M1G0D Ancient Guru

    Messages:
    1,857
    Likes Received:
    1,190
    GPU:
    2 x GeForce 1080 Ti
    Umm, AMD already announced an 8-core part, the 1900X.

    http://www.guru3d.com/articles-pages/tech-preview-ryzen-threadripper-1900x-1920x-and-1050x,1.html
     
  17. D3M1G0D

    D3M1G0D Ancient Guru

    Messages:
    1,857
    Likes Received:
    1,190
    GPU:
    2 x GeForce 1080 Ti
  18. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,652
    Likes Received:
    494
    GPU:
    2070 Super
  19. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    4,338
    Likes Received:
    1,308
    GPU:
    HIS R9 290
    That is kind of funny, though not too surprising. Nvidia seemed to get along relatively well with AMD's CPU division, though things soured a little once AMD bought ATI (which meant the death of the nForce chipsets). AMD's CPUs help Nvidia's sales, so ultimately Nvidia is happier about AMD's success than worried by it.

    On a side note, considering they compete directly with AMD's graphics division, Nvidia surprisingly doesn't start many feuds, legal or otherwise. Nvidia is a very arrogant and selfish company, but it isn't hostile toward AMD. I can't say the opposite is true, however; AMD seems very salty and bitter about Nvidia.

    Meanwhile, I'm aware Nvidia does not get along that well with Intel, and vice versa. Nvidia created the ARM-based Tegra series because Intel wouldn't let them use the x86 license. Many modern Intel boards support CrossFire but not SLI. Both companies have tried suing each other. Nvidia tries to take Intel's server market share, and Intel tried to take Nvidia's mobile market share.
     
  20. nizzen

    nizzen Master Guru

    Messages:
    695
    Likes Received:
    116
    GPU:
    3x2080ti/5700x/1060
    1950x and 2x 1080ti, here we go :D

    Love from Norway :)
     
