Nvidia and AMD Cross Multi GPU Tested In DirectX 12

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 26, 2015.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    36,471
    Likes Received:
    5,503
    GPU:
    AMD | NVIDIA
    Though I'm not quite sure if I'd be using Ashes of the Singularity for testing this myself, it is an interesting read. AnandTech posted some benchmarks testing a feature in DirectX 12 c...

    Nvidia and AMD Cross Multi GPU Tested In DirectX 12
     
  2. Blackfyre

    Blackfyre Maha Guru

    Messages:
    1,019
    Likes Received:
    35
    GPU:
    MSI 1070 Gaming X
    I was looking forward to this... I thought the HD4600 iGPU built into my 4790K would finally be useful alongside my HD7970 under Windows 10. I was certainly excited when Microsoft first announced this feature last year. But I don't believe the HD4600 is DX12-ready. There are still a few interesting questions to be asked:

    How will this work with a GameWorks title?

    Also, will the choice between a FreeSync monitor and a G-Sync monitor not matter anymore?

    Can one card be used to do anti-aliasing (for example at 24X) while the other card renders the game?

    There are a lot of interesting prospects, but I doubt any of them will be implemented in the near future.
     
  3. DiceAir

    DiceAir Maha Guru

    Messages:
    1,350
    Likes Received:
    14
    GPU:
    Galax 980 ti HOF
    I'm more excited about running my iGPU together with my dedicated GPU. Imagine your iGPU handling some of the more basic tasks in a game while your dedicated GPU does all the hard stuff. Kind of like what AMD said about their APUs being able to CrossFire with any DX12-capable GPU.
     
  4. holler

    holler Master Guru

    Messages:
    220
    Likes Received:
    39
    GPU:
    2x AMD Radeon VII
    AMD is starting to play its cards. Notice how the benchmarks are best with the AMD Fury X card as primary.

    I would imagine the primary card would be running FreeSync or G-Sync, depending on whether your primary is AMD or Nvidia respectively. Kind of sweet that I'll be able to utilize old cards in multi-GPU setups going forward to handle post-processing stuff.
     

  5. Tugrul_512bit

    Tugrul_512bit Member Guru

    Messages:
    114
    Likes Received:
    0
    GPU:
    msi_r7870hawk_asus_r7_240
    Load balancing, or we choose each card from a drop-down list to bind it to specific tasks:

    R7 240 --> renders 10% of tiles and computes physics (except smoke) and artificial intelligence. (30W)

    HD7870 --> renders 90% of tiles and computes smoke, 8x anti-aliasing, some ray-traced surfaces, and crowd pathfinding. (190W)

    FX-8150 --> I don't want to use this for anything, it runs too hot... (250W)
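    The static split idea above can be sketched as a toy calculation. This is not a real D3D12 API, and the throughput numbers are made up for illustration; it only shows how a fixed tile share per GPU might be derived from relative performance.

    ```python
    # Toy sketch: divide screen tiles between GPUs in proportion to their
    # (assumed) relative throughput. Hypothetical numbers, not a real API.

    def split_tiles(total_tiles, throughputs):
        """Assign each GPU a share of tiles proportional to its throughput."""
        total = sum(throughputs.values())
        shares = {gpu: round(total_tiles * t / total)
                  for gpu, t in throughputs.items()}
        # Fix rounding drift so every tile is assigned exactly once.
        drift = total_tiles - sum(shares.values())
        busiest = max(shares, key=shares.get)
        shares[busiest] += drift
        return shares

    # Hypothetical relative throughputs (arbitrary units).
    print(split_tiles(100, {"R7-240": 1, "HD7870": 9}))
    # -> {'R7-240': 10, 'HD7870': 90}
    ```

    The obvious downside, raised later in the thread, is that any static split like this wastes capacity the moment the workload shifts.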
     
    Last edited: Oct 26, 2015
  6. ---TK---

    ---TK--- Ancient Guru

    Messages:
    22,112
    Likes Received:
    2
    GPU:
    2x 980Ti Gaming 1430/7296
    980 Ti a bit faster than a Fury X at 1440p, after all those nonsense threads in the AMD section about Maxwell and DX12. All it took was a driver update. I remember one guy who said not to expect Nvidia to gain DX12 performance with drivers. Good stuff.
     
  7. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    7,737
    Likes Received:
    214
    GPU:
    Zotac GTX1080Ti AMP
    AMD have no cards to play. Neither does Nvidia.

    The results show what you said, but the difference is BARELY noticeable, roughly 0.1-2fps. You would never notice that.
     
  8. Singleton99

    Singleton99 Maha Guru

    Messages:
    1,013
    Likes Received:
    61
    GPU:
    Aorus-Extreme-1080TI
    I'm fairly sure that Nvidia will do something to their drivers to stop this if it's at all possible. If they can't, well, very interesting times ahead for sure; this opens up a whole load of possibilities.
     
  9. blkspade

    blkspade Master Guru

    Messages:
    577
    Likes Received:
    9
    GPU:
    Leadtek Nvidia Geforce 6800 GT 256MB
    I know it was always proposed as something that would be possible, but I think there are still zero titles/engines using DirectCompute-based (or any GPU-based) AI.
     
  10. Tugrul_512bit

    Tugrul_512bit Member Guru

    Messages:
    114
    Likes Received:
    0
    GPU:
    msi_r7870hawk_asus_r7_240
    If it can't do AI, then it should at least be able to do some crowd behaviour so thousands of units find their way quicker. I saw someone doing this with a Titan.
     

  11. janos666

    janos666 Master Guru

    Messages:
    689
    Likes Received:
    51
    GPU:
    MSI GTX1070 SH EK X 8Gb
    No DX12 for anything below the HD 7000 series. :banana:

    Most of the GameWorks stuff runs on AMD cards. Some effects run fine (like HBAO+) and some are notably slower (like the special shadowing or hair rendering). This will work the same way: code specially tailored for Maxwell will run slower on GCN.
    And AFR will limit performance to the slowest card of the pack (so in this case, where GCN is probably slower, you virtually end up with N times the GCN card, or vice versa with different code).
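    The AFR limit described here comes down to simple arithmetic: with alternate-frame rendering, each card must deliver its frame on time, so steady-state throughput is gated by the slowest card. A minimal sketch, assuming ideal frame pacing and made-up frame rates (not benchmark results):

    ```python
    # With N-way alternate-frame rendering, the setup behaves like
    # N copies of the slowest card under ideal frame pacing.

    def afr_fps(card_fps):
        """Effective FPS of an N-way AFR setup gated by the slowest GPU."""
        return len(card_fps) * min(card_fps)

    # Hypothetical pair: a faster card at 60 fps, a slower one at 45 fps.
    print(afr_fps([60, 45]))  # -> 90, i.e. 2x the slower card, not 60 + 45
    ```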

    This depends on the card which handles the display (basically: whichever one you physically plug into).

    I don't think that's possible, regardless of the API and GPU(s) used.
    Besides, even if you could dedicate some specific jobs to some specific GPU(s), why would you do that?
    Why would you intentionally create a situation where even utilization of all the cards (including 100% on all of them) at the same time, all the time, is practically impossible to begin with? Static allocation like that would only cripple your total resource pool.

    It is, according to this benchmark, already implemented (everything that is reasonable and was planned). I don't see what you mean (unless you have some unreasonable wishes, like the dedicated anti-aliasing above).
     
    Last edited: Oct 26, 2015
  12. zer0_c0ol

    zer0_c0ol Ancient Guru

    Messages:
    2,976
    Likes Received:
    0
    GPU:
    FuryX cf
    It can't. The director of Oxide said that MDA (multi-display adapter) cannot be locked out with drivers. It is engine-specific; the GPU driver has no say in it.
     
  13. haz_mat

    haz_mat Master Guru

    Messages:
    244
    Likes Received:
    1
    GPU:
    1070 FE
    I still wouldn't put it past Nvidia to sneak in some code that defaults to a slow codepath when an AMD video device is detected in the system. We used to be able to run PhysX on an Nvidia card with an AMD card as the primary display, but they managed to screw that up.
     
  14. labidas

    labidas Master Guru

    Messages:
    230
    Likes Received:
    36
    GPU:
    HD7870
    The HD4600 is Intel GT2, which supports DX12.
     
  15. MBTP

    MBTP Member Guru

    Messages:
    124
    Likes Received:
    7
    GPU:
    Sapphire RX590
    Are you sure?
    My control panel says DirectX 11.
     

  16. ---TK---

    ---TK--- Ancient Guru

    Messages:
    22,112
    Likes Received:
    2
    GPU:
    2x 980Ti Gaming 1430/7296
    Windows 10 = DirectX 12.
     
  17. sykozis

    sykozis Ancient Guru

    Messages:
    21,100
    Likes Received:
    692
    GPU:
    MSI RX5700
    I'd venture to say that something was missing from Nvidia's drivers in the initial Ashes of the Singularity comparison, which is why we see such a big improvement with a newer driver. Obviously there will be "optimizations" with each GPU architecture release that yield some improvements. Once Nvidia gets DX12 figured out, though, we shouldn't see the huge performance gains Nvidia has managed to pull off with DX11. At least not based on my understanding of how DX12 works.
     
  18. ---TK---

    ---TK--- Ancient Guru

    Messages:
    22,112
    Likes Received:
    2
    GPU:
    2x 980Ti Gaming 1430/7296
    I doubt any 980 Ti owners are going to buy a Fury X for one game, and the same goes for a Fury X owner buying a 980 Ti. This is basically a showcase, nothing more. 99.9% if not 100% of people will run SLI or CrossFire and have support for a boatload of games.
     
  19. sykozis

    sykozis Ancient Guru

    Messages:
    21,100
    Likes Received:
    692
    GPU:
    MSI RX5700
    Yeah... I have my HD7950 sitting on the shelf, and I'd still pick up another GTX970 before I'd even consider throwing the 7950 back in to run alongside my 970...
     
  20. waltc3

    waltc3 Maha Guru

    Messages:
    1,011
    Likes Received:
    300
    GPU:
    AMD 50th Ann 5700XT
    Definitely would not hold my breath waiting on that... highly unlikely. ;) The HD4600 is a performance dog next to your 7970, and its only likely effect would be to slow everything way down, if it would even be possible. I doubt anyone could get it working in the first place even if they wanted to; the disparity between the GPUs is just too great.

    Ideally, of course, you'd want two (or more) identical cards. But keep in mind that the really neat features in D3D12 have to be (a) supported in hardware and (b) supported by the game itself, especially features like the one that would give you a giant single GPU with ~5,000 stream processors (2x2,500 SPs) and ~16GB (2x8GB) of RAM. :nerd:

    I'm thinking these features will be included in several mainstream game engines (like Valve's or Epic's) so that a developer using one of those engines would have all the grunt work for the nice D3D12 feature support already done for him.

    I mean, really, a GPU from Nvidia/Intel paired with a GPU from AMD wouldn't be a very appealing concept for a number of reasons, right? At least with two like GPUs you have a fighting chance to do CrossFire or SLI in D3D11 and earlier games. With a mixed pair, no can do. It's a great novelty, but I suspect other configurations will be far more appealing to the majority of people. Makes for sensational articles, though, even if few people will ever seriously do it.
     