Review: Ashes of the Singularity: DX12 Benchmark II with Explicit Multi-GPU mode

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 24, 2016.

  1. Despoiler

    Despoiler Master Guru

    Messages:
    244
    Likes Received:
    0
    GPU:
    Sapphire TRI-X R9 Fury
  2. BroDragon

    BroDragon Member

    Messages:
    11
    Likes Received:
    0
    GPU:
    gtx970sli
    I was really hoping to see the benefits of sharing the workload with the iGPU. Not everyone has multiple GPUs (I do), but most people have a CPU with onboard graphics. If people with graphics cards could finally start using that resource, it would be a very good thing for a tremendous number of users. Look at how much space on Intel's CPUs is taken up by the iGPU. Please follow this article up as soon as possible with one on this area. Maybe one percent of users have different-brand video cards lying around, and maybe five percent have multiple similar GPUs, but almost everyone has a video card plus an unused iGPU on their CPU. This is the obvious first direction to take.
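For context, DX12's explicit multi-adapter mode leaves the split entirely to the engine, so pairing a dGPU with an iGPU means dividing each frame's work in proportion to what each adapter can do. A toy sketch of that load-balancing idea (function names and numbers are hypothetical, not real D3D12 code):

```python
# Illustrative sketch (not real D3D12 API calls): how an engine using
# DX12 explicit multi-adapter might split one frame's work between a
# discrete GPU and an iGPU, proportional to measured throughput.

def split_frame_work(total_tiles, throughputs):
    """Divide `total_tiles` of render work across adapters in
    proportion to their relative throughput (e.g. fps measured
    per adapter). Remainder tiles go to the fastest adapter."""
    total = sum(throughputs)
    shares = [total_tiles * t // total for t in throughputs]
    # Integer division can leave a remainder; give it to the fastest adapter.
    fastest = max(range(len(throughputs)), key=lambda i: throughputs[i])
    shares[fastest] += total_tiles - sum(shares)
    return shares

# Example: dGPU roughly 8x faster than the iGPU; 90 screen tiles per frame.
print(split_frame_work(90, [80, 10]))  # → [80, 10]
```

Even a modest slice handed to the iGPU is work the dGPU no longer has to do, which is the appeal BroDragon describes.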
     
  3. GeniusPr0

    GeniusPr0 Maha Guru

    Messages:
    1,240
    Likes Received:
    13
    GPU:
    RTX 2080
    That's kind of misleading (and not robust either). A game would need two render paths, as has already been said, and I'm not sure how awkward NVidia's CUDA path would be.


    Except it doesn't work with DX12, and I don't know why. The async compute that NVidia advertised is based on writing a driver that queues work for the GPU and can, if needed, push the load onto CPU cores, which in turn ruins performance, since the CPU isn't really meant for large loads. NVidia gets screwed because AMD sits at the extreme end. It won't show up in VR because VR doesn't generate enough load to exploit the weakness.
     
    Last edited: Feb 26, 2016
  4. Dygaza

    Dygaza Master Guru

    Messages:
    535
    Likes Received:
    0
    GPU:
    Fury X 4GB

  5. GeniusPr0

    GeniusPr0 Maha Guru

    Messages:
    1,240
    Likes Received:
    13
    GPU:
    RTX 2080
    Hilbert isn't wrong. He didn't say there was microstuttering, only that VSync was on and the limit was really 60 FPS. This can easily be tested on an overclocked monitor with DVI.

    The above is quoted from Hilbert's NVidia test on the same page.
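A 60 Hz VSync cap is easy to spot in raw frame-time data, because presented frames quantise to multiples of the 16.67 ms refresh interval. A small heuristic sketch (function name and sample numbers are hypothetical, not taken from the article's data):

```python
def looks_vsync_capped(frame_times_ms, refresh_hz=60, tolerance_ms=0.5):
    """Heuristic: True if most frame times sit on multiples of the
    refresh interval (16.67 ms at 60 Hz), as VSync forces them to."""
    interval = 1000.0 / refresh_hz

    def on_grid(t):
        # Distance to the nearest multiple of the refresh interval.
        return min(t % interval, interval - t % interval) <= tolerance_ms

    hits = sum(on_grid(t) for t in frame_times_ms)
    return hits / len(frame_times_ms) > 0.9

capped   = [16.7, 16.6, 16.7, 33.3, 16.7]  # quantised to the 60 Hz grid
uncapped = [12.1, 9.8, 14.3, 11.0, 22.5]   # free-running frame times
print(looks_vsync_capped(capped), looks_vsync_capped(uncapped))  # → True False
```

If nearly every frame time snaps to that grid, the run was display-limited and says little about the GPU itself, which is the point being made about the capped numbers.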
     
  6. Ieldra

    Ieldra Banned

    Messages:
    3,490
    Likes Received:
    0
    GPU:
    GTX 980Ti G1 1500/8000
    Generally speaking though, DX12 is a trap; it's not going to be a smooth transition.

    Look at Arkham Knight and how badly it ran on DX11, despite all the hardware-specific libraries, GameWorks, PhysX, etc. Now imagine how much worse it would have been if they had had low-level access to the hardware.

    I think DX12 will generally be underwhelming at first.
     
  7. semitope

    semitope Member

    Messages:
    36
    Likes Received:
    0
    GPU:
    iGPU
    It's sad that, due to NVidia's dominance (and brainwashing?), people don't understand the difference between something like over-tessellation and features meant to actually improve our experience, both graphically and in performance.

    Asynchronous compute is not like what NVidia does. It's simply an advancement in graphics technology that allows compute work to run while graphics work is running.

    It's not going anywhere, and some of you will of course change your tune whenever NVidia decides to get with the times and finally catches up. Luckily, I doubt they can abuse async the way they did tessellation once they caught up with it; getting developers to use it in a way that damages AMD seems unlikely.
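The claim above, that async compute lets compute run while graphics is running, can be illustrated with a toy timeline model (an analogy only, not real GPU scheduling; all numbers are invented): the graphics queue has idle bubbles, and async compute fills them instead of waiting for the frame to finish.

```python
# Toy timeline model: compare total GPU time for a frame with and
# without async compute. Units are milliseconds; values are invented.

def serial_time(graphics_busy, graphics_idle, compute):
    # Without async compute: the compute job only starts
    # after the whole graphics workload (busy + idle) completes.
    return graphics_busy + graphics_idle + compute

def async_time(graphics_busy, graphics_idle, compute):
    # With async compute: the compute job overlaps with the
    # graphics queue's idle gaps; only the overflow adds time.
    overlapped = min(graphics_idle, compute)
    return graphics_busy + graphics_idle + (compute - overlapped)

# 10 ms of shader work, 4 ms of idle bubbles, 3 ms of compute:
print(serial_time(10, 4, 3), async_time(10, 4, 3))  # → 17 14
```

The win is bounded by how much idle time the graphics workload leaves, which is why gains vary so much between architectures and resolutions.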
     
  8. Ieldra

    Ieldra Banned

    Messages:
    3,490
    Likes Received:
    0
    GPU:
    GTX 980Ti G1 1500/8000
    When you're proven right, I will send you a certificate of profound apology and congratulations on your astute predictions, I promise.
     
  9. dgrigo

    dgrigo Member

    Messages:
    17
    Likes Received:
    0
    GPU:
    TitanX
  10. GeniusPr0

    GeniusPr0 Maha Guru

    Messages:
    1,240
    Likes Received:
    13
    GPU:
    RTX 2080

  11. C0D3M4N

    C0D3M4N Member

    Messages:
    29
    Likes Received:
    3
    GPU:
    Asus 1080Ti
    I ran the benchmark @ 1080p and 1440p.

    5820K 4.6
    16GB 3200 DDR4
    980ti 1507/8000

    2560x1440 - CRAZY setting


    1920x1080 - CRAZY setting
     
    Last edited: Feb 26, 2016
  12. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,161
    Likes Received:
    83
    GPU:
    RX 580 8GB
    And forget the benefits NVIDIA currently has in DX11 games? I don't think that's a great idea right now. Not until there are many more DX12/Vulkan titles, and even then things might look very different from now.
     
  13. GeniusPr0

    GeniusPr0 Maha Guru

    Messages:
    1,240
    Likes Received:
    13
    GPU:
    RTX 2080
    Async compute can't be abused. The Fury X gains more from its ACE units at 4K, meaning it still had some left in the tank. Let that sink in.
     
  14. RzrTrek

    RzrTrek Ancient Guru

    Messages:
    2,423
    Likes Received:
    662
    GPU:
    RX 580, MESA 20.0.4
    Looks like AMD have been on the lazy side again.
     
  15. GeniusPr0

    GeniusPr0 Maha Guru

    Messages:
    1,240
    Likes Received:
    13
    GPU:
    RTX 2080
    Since people are confused about the FCAT because of that ET article :rolleyes:.


    Snip from ET's paragraph:

    Womp womp. :wanker:

    Hilbert then goes on to say that the data is meaningless because of VSync.

    *On the last page there is a typo: "FACT" should be "FCAT". No biggie, and obviously spell-check won't catch it.
     
    Last edited: Feb 26, 2016

  16. semitope

    semitope Member

    Messages:
    36
    Likes Received:
    0
    GPU:
    iGPU
    There should be no confusion about FCAT. From what I read, NVidia themselves admit it's not up to date for DX12. The explanation of what was going on was clear.
     
  17. GeniusPr0

    GeniusPr0 Maha Guru

    Messages:
    1,240
    Likes Received:
    13
    GPU:
    RTX 2080
    For those who thought I was flamebaiting about NVidia's PR.

     
  18. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    37,352
    Likes Received:
    6,390
    GPU:
    AMD | NVIDIA
    I've posted an addendum on ET's findings in the article's conclusion page.
     
  19. Dygaza

    Dygaza Master Guru

    Messages:
    535
    Likes Received:
    0
    GPU:
    Fury X 4GB
    Interesting energy-efficiency test by tomshardware.de.

    Energy efficiency (watts per fps) on the 980/980 Ti and Fury X is at the same level; the 390X is far behind.

    Interesting that CPU usage on NVidia comes down a bit more than on AMD, even though AMD is the one using hardware scheduling. Then again, AMD is producing more frames, so the CPU is used more.
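Efficiency comparisons like the one above reduce to simple arithmetic: watts divided by fps gives joules per frame. A quick sketch with invented numbers (not the tomshardware.de figures), just to show the metric:

```python
def watts_per_frame(avg_power_w, avg_fps):
    """Energy per frame in joules: average watts divided by average fps."""
    return avg_power_w / avg_fps

# Hypothetical cards: equal joules per frame means equal efficiency,
# despite different absolute power draw and frame rates.
card_a = watts_per_frame(250, 50)  # 5.0 J per frame
card_b = watts_per_frame(300, 60)  # 5.0 J per frame
print(card_a, card_b)  # → 5.0 5.0
```

This is why a card that draws more power can still tie on efficiency, and why producing more frames (as noted for AMD here) can mask higher CPU or GPU load in a raw power reading.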
     
  20. OnnA

    OnnA Ancient Guru

    Messages:
    10,518
    Likes Received:
    2,335
    GPU:
    Vega 64 XTX LiQuiD
    Here is my bench for AoS, but an older version, tested September/October 2015.
    (I don't have it installed right now to run Beta 2, but it will be better, I presume ;-) )

    CPU 4 GHz / XFX 1050/1575
    Res. 1920x1440 @ 86 Hz (CRT ~23" Sony Black Trinitron tube in a DiamondPro 2070)

     
    Last edited: Feb 26, 2016
