AMD Gives Pointers On How to Improve Ryzen 1080p Game Performance

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 15, 2017.

  1. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
    Why do you even bother?

    He acts like AMD's salesman, making excuses, deflecting, etc.
    He'll argue over anything negative about Ryzen (even when it's backed up by facts).

    Just ignore him.
     
  2. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    Please, don't. Just don't go on about it anymore...
     
  3. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    Or it could be: enter the BIOS, press F9, then 1 or 2, hit Enter, then F12 and Enter again to switch. I turn my computer off when I'm not using it.
     
  4. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    No, in no way would I recommend buying a quad-core for gaming today.
    The Kaby Lake 7700K may be the better gaming CPU today.

    But think about the games which will put heavy strain on the next generation of powerful GPUs.
    What will the workload be? Will a game need just one game thread + 3 DX12 threads to generate enough draw calls to properly utilize the new GPUs?

    IIRC the basic DX12 design is 6 threads to feed the GPU. Then what is left for the game itself? Do I want 6 DX12 threads, each of which can fully load an entire core, plus 2~4 game threads, all competing over 4 real cores with SMT?

    Even if you plan on 4K resolution, AAA games will still use at least 2 threads for the game itself. BF1 loading all 4 of my cores to 100% is not the exception that proves the rule of poorly threaded games. It is the first brutal wake-up call.

    Edit: I believe there is some "Freeloader" version of BF1. Try playing multiplayer on it at the lowest graphics settings, so you can see how high the utilization on your quad + SMT gets (see the logging sketch below). And the lowest settings do have fewer draw calls, since they use much simpler geometry and effects, and many meshes are missing.
     
    Last edited: Mar 16, 2017
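
    A minimal sketch of one way to do that utilization check, assuming Python 3 with the third-party psutil package installed; the sampling interval and duration are illustrative, not from the post above:

    Code:
    # Log per-logical-core CPU utilization while the game runs, so you can see
    # how a quad-core with SMT is actually being loaded.
    # Assumes "psutil" is installed (pip install psutil).
    import time
    import psutil

    SAMPLE_SECONDS = 1.0   # sampling interval
    DURATION = 60          # total logging time, in seconds

    end = time.time() + DURATION
    while time.time() < end:
        # percpu=True returns one utilization figure per logical core (per SMT thread)
        per_core = psutil.cpu_percent(interval=SAMPLE_SECONDS, percpu=True)
        print("  ".join("c{}: {:5.1f}%".format(i, p) for i, p in enumerate(per_core)))

    If most of the logical cores sit near 100% during BF1 multiplayer, that is the kind of load the post above is describing.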

  5. PunishedSnake

    PunishedSnake Guest

    Messages:
    4
    Likes Received:
    0
    GPU:
    2x Gigabyte GTX780 3GB
    Hey Hilbert, just a heads-up on an error in AMD's article, and for anyone considering buying the recommended Corsair DIMMs; I've already posted this on AMD's blog:

    The Corsair CMK16GX4M2B3200C16 version 5.39 [16-18-18-36 @ 1.35V] is NOT Samsung B-die, it's SK Hynix, and it will NOT clock above 2666 MHz on an Asus Crosshair VI Hero; not only on my board but also on a friend's who bought the exact same setup. The Corsair CMK16GX4M2B3200C16 ver4.31 [16-18-18-36 @ 1.35V], however, IS Samsung B-die and will behave as expected. Caveat emptor.
     
  6. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    It may actually clock; it will just be much harder to manually find the right BCLK/multiplier combination.
    The memory kits which actually work still run an increased BCLK with a lower multiplier, as a high multiplier causes loose timings, and that is not compatible with many high-clock DDR4 kits.
    Plus there is the prime issue that Ryzen seems to enforce a 1T command rate no matter what. If that could be switched to 2T, it would likely drastically increase memory compatibility.
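
    To put rough numbers on that BCLK/multiplier trade-off, a minimal sketch of the arithmetic, assuming the usual DDR4 convention that the effective data rate is twice the real memory clock; the multiplier values are illustrative, not taken from any specific BIOS:

    Code:
    # Effective DDR4 rate (MT/s) = 2 x (BCLK in MHz x memory-clock multiplier).
    # Illustrative numbers only; actual BIOS strap options differ per board.
    def effective_mts(bclk_mhz, mem_multiplier):
        """Return the effective DDR4 data rate in MT/s."""
        return 2 * bclk_mhz * mem_multiplier

    # Two ways to land near DDR4-3200:
    print(effective_mts(100.0, 16.0))   # stock BCLK, higher memory strap -> 3200.0
    print(effective_mts(106.7, 15.0))   # raised BCLK, lower memory strap  -> 3201.0

    The second combination reaches roughly the same data rate while keeping the memory multiplier itself lower, which is the approach described above as the one that tends to work.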
     
  7. PunishedSnake

    PunishedSnake Guest

    Messages:
    4
    Likes Received:
    0
    GPU:
    2x Gigabyte GTX780 3GB
    Nah, it's all in Corsair's version numbering scheme: v4 = Samsung, v5 = SK Hynix. It's even listed on ASUS's memory QVL specifically as ver4.31. Also, Elmor's 0902 BIOS enforces 2T, so that isn't the issue here.
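
    As a tiny illustration of that rule of thumb (the "verX.YY" label is the version printed on the DIMM sticker; this is just the mapping from the posts above, not an official Corsair list):

    Code:
    # Corsair version-prefix rule of thumb from the posts above:
    # ver4.xx -> Samsung B-die, ver5.xx -> SK Hynix, anything else -> unknown.
    def likely_ic_vendor(version_label):
        major = version_label.lower().lstrip("ver").split(".")[0]
        return {"4": "Samsung B-die", "5": "SK Hynix"}.get(major, "unknown")

    print(likely_ic_vendor("ver4.31"))  # Samsung B-die
    print(likely_ic_vendor("ver5.39"))  # SK Hynix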
     
  8. H83

    H83 Ancient Guru

    Messages:
    5,510
    Likes Received:
    3,036
    GPU:
    XFX Black 6950XT
    So I'm currently buying parts to build a new gaming rig almost from scratch, and of course I have to buy a CPU. And like you, I think paying more than 300€ for a quad-core (7700K) is silly! So I'm buying a quad-core for 250€ (7600K); this way I'm going to feel better about myself.
    As for having 8 cores, that's nice, but only if you can take full advantage of them; otherwise it's just a waste, like cell phones with 8 cores...
     
  9. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    On the other hand, the i7 seems to benefit really nicely from the 8-thread support. I mean, I actually have 8 threads now. If I were you, I would wait a bit for the inevitable Ryzen 1700 price drop, put up another $50 and get that instead of the 7600K.
     
  10. eclap

    eclap Banned

    Messages:
    31,468
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
    With CPUs, they'll last you a few years; the £50 extra is well worth the premium for 4 extra threads :')
     

  11. H83

    H83 Ancient Guru

    Messages:
    5,510
    Likes Received:
    3,036
    GPU:
    XFX Black 6950XT
    The 1700 is a great CPU, but not for me. I'm building a PC for gaming, and for that the 1700 is a waste because I won't take advantage of all 8 cores, not to mention 16 threads... The 7700K would be "perfect" for my case, but I don't want to spend that much money on a quad-core CPU... Not to mention I spent more than I wanted on a new GPU (bought a 1080 instead of a 1070), so I need to control myself...
    About Ryzen, I think it's "broken" regarding gaming. The latency problem between the clusters seems to be physical, and I don't believe any kind of software fix will solve it completely. My guess is that the 2nd generation is going to solve or greatly reduce the latency problem, making Ryzen a proper gaming CPU.
    This is just my opinion; I could be completely wrong about this matter.
     
  12. Thalyn

    Thalyn Guest

    Curiously, given the performance in situations where the GPU is irrelevant, has anyone stopped to consider that perhaps it's an NVidia driver issue causing the lower 1080p (and below) performance?

    Every situation where it falls behind is relying on the GPU, and every test has been conducted with NVidia hardware. I'm not saying it shouldn't be - Pascal is the fastest option, so it makes sense to use it. Just that it hasn't been ruled out as a source for the disparity. A Fury might not have the raw power of a 1080Ti but it would still make for a good data point for comparison.
     
  13. sverek

    sverek Guest

    Messages:
    6,069
    Likes Received:
    2,975
    GPU:
    NOVIDIA -0.5GB
    I myself would love to see benchmarks on an RX 480 at the lowest settings.
     
