AMD explains the DX12 benefits for CrossFire...

Discussion in 'Videocards - AMD Radeon Drivers Section' started by sammarbella, May 30, 2015.

  1. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    Last edited: May 30, 2015
  2. haz_mat

    haz_mat Master Guru

    Messages:
    243
    Likes Received:
    1
    GPU:
    1070 FE
    I think the explicit multi-adapter mode will be a challenge, but in the long run it's for the best. Some game developers, as well as middleware devs, have wanted this kind of GPU control for years, and the large, library-like GPU drivers just get in the way by hiding and abstracting the GPU's features. The devs that don't want this level of control are probably already using licensed engines anyway.

    Yeah, game devs are going to have a new gun to shoot themselves in the foot with, but that has always been the case when powerful new languages and features become available. It will still be better than the blame war we have now between bad drivers and badly optimized games.
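    To make "explicit control instead of driver abstraction" concrete, here is a minimal sketch (plain C++, no real D3D12 calls; all names and struct fields are invented for illustration) of what an engine does under explicit multi-adapter: it sees every GPU as a separate device and decides itself how to use them, instead of the driver hiding them behind a linked-adapter profile.

```cpp
// Conceptual sketch only -- not the D3D12 API. Under explicit
// multi-adapter, each GPU is enumerated individually and the engine,
// not the driver, decides which one gets which work.
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <string>
#include <vector>

struct AdapterDesc {           // hypothetical stand-in for an adapter
    std::string name;
    uint64_t vram_bytes;
    bool software;             // e.g. a software rasterizer
};

// Keep every hardware adapter, strongest (most VRAM) first, so the
// engine can assign the heaviest passes to the strongest GPU.
std::vector<AdapterDesc> enumerate_for_engine(std::vector<AdapterDesc> all) {
    all.erase(std::remove_if(all.begin(), all.end(),
                             [](const AdapterDesc& a) { return a.software; }),
              all.end());
    std::sort(all.begin(), all.end(),
              [](const AdapterDesc& a, const AdapterDesc& b) {
                  return a.vram_bytes > b.vram_bytes;
              });
    return all;
}
```

The point of the sketch is only the division of responsibility: filtering and ordering adapters is now an engine decision, not something a driver profile does behind the game's back.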
     
  3. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,556
    Likes Received:
    222
    GPU:
    RX 580 8GB
    It's really down to engine developers. If the engine has the support and the game doesn't, you know what not to do! :)
     
  4. undeadpolice

    undeadpolice Master Guru

    Messages:
    203
    Likes Received:
    0
    GPU:
    EVGA GTX980 K|NGP|N (SLI)
    AMD should really focus on fixing their drivers first.
     

  5. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,556
    Likes Received:
    222
    GPU:
    RX 580 8GB
    We've seen a big jump in draw performance (less overhead) with the Windows 10 drivers, so they are working on it.
     
  6. Fuzzylog1c

    Fuzzylog1c Member

    Messages:
    20
    Likes Received:
    0
    GPU:
    4x R9 290X
    Really? I'm down about 20% over Windows 7 right now.
     
  7. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,556
    Likes Received:
    222
    GPU:
    RX 580 8GB
  8. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    Which game devs are supposed to want more control over multi-GPU?

    I only see a majority of imperfect and/or delayed multi-GPU support, and a minority of perfect multi-GPU on launch day.

    Maybe multi-GPU profile support is better or faster in the green camp, but I don't see any special interest from game devs in having this support ready on a game's launch day.

    Do you mean they will shoot multi-GPU users by not providing correct or full implementations, because it will be too expensive (cost/benefit) for them to care?


    The black bolded text covers the new DX12 options for game developers in relation to multi-GPU.

    What I fear most is the red bolded text:

    No support (or incomplete support) from game devs, and the impossibility of GPU makers adding it themselves.

    This new scenario will be optimal for AMD and Nvidia: no more driver problems (or expenses) related to multi-GPU support for them.

    In the case of games without correctly implemented support, they will point the finger at game developers (with reason) and call it a day.

    :bang:
     
    Last edited: May 30, 2015
  9. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,503
    Likes Received:
    509
    GPU:
    Inno3D RTX 3090
    This is irrelevant, as they are speaking about DX12, and their DX12 driver looks to be better than NVIDIA's for now.

    What this does is put all the power in the hands of the game developers, and it will make it clearer in the future who is to blame for late SLI/CrossFire profiles.

    All big engines will support it, there is no question about that, but from engine support to game support there is still some way to go. If the Firaxis Civ profile is any indication (as are the Mantle BF4 profiles), then the future looks very promising, gentlemen. The Firaxis profile is the best-case scenario, as SFR brings back almost all the benefits of 3dfx-era SLI (each card drawing a part of the scene), with virtually no drawbacks.

    I would be happy with this clarification if I had multi-GPU. AMD is just presenting how the new API(s) will handle multi-GPU; it will be the same for NVIDIA too.
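    The SFR idea ("each card drawing a part of the scene") can be sketched in a few lines. This is a hedged illustration, not engine code: real SFR implementations balance the band heights dynamically by measured per-GPU cost, while this sketch just splits the frame evenly.

```cpp
// Hypothetical SFR split: one horizontal band of scanlines per GPU.
// A real engine would resize bands each frame based on GPU timings;
// here we split evenly and spread the remainder over the first bands.
#include <cassert>
#include <vector>

struct Band { int y0, y1; };   // band [y0, y1) rendered by one GPU

std::vector<Band> split_frame_sfr(int height, int gpu_count) {
    std::vector<Band> bands;
    int base = height / gpu_count;
    int extra = height % gpu_count;
    int y = 0;
    for (int g = 0; g < gpu_count; ++g) {
        int h = base + (g < extra ? 1 : 0);  // first `extra` bands get +1 line
        bands.push_back({y, y + h});
        y += h;
    }
    return bands;
}
```

With two GPUs and a 1080-line frame, each GPU renders 540 scanlines, which is why SFR avoids AFR's frame-pacing issues: both cards contribute to every frame instead of alternating.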
     
  10. klaupe

    klaupe Member Guru

    Messages:
    157
    Likes Received:
    0
    GPU:
    Zotac GTX980Ti AmpExtreme
    This is an early excuse for future trends, not a clarification.

    BTW, SFR was implemented because Beyond Earth was sponsored by AMD. Another developer wouldn't think for a second about investing in a feature like this without getting paid.

    Stop dreaming.
     

  11. GreenAlien

    GreenAlien Active Member

    Messages:
    61
    Likes Received:
    5
    GPU:
    Sapphire Nitro+ RX 480 OC
    Don't you have to implement features like that correctly just once? After that it's mostly a case of copy-and-paste for every new game.
     
    Last edited: May 30, 2015
  12. Maddness

    Maddness Maha Guru

    Messages:
    1,459
    Likes Received:
    674
    GPU:
    3080 Aorus Xtreme
    I would think it comes down to the individual game or the game engine.
     
  13. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,503
    Likes Received:
    509
    GPU:
    Inno3D RTX 3090
    Nope. That's how multi-GPU works in the closer-to-the-metal APIs; that's no excuse. The synchronization and resource management between the application and the hardware is now done by the developers, not the driver. That's that.

    I have no idea, to be frank. I guess a basic framework can be set up in an engine, and then you could tweak it for each game.
     
  14. velocityx

    velocityx Master Guru

    Messages:
    310
    Likes Received:
    0
    GPU:
    EVGA 1080 Ti FE - EK FCB
    So it's pretty much either all triple-A games get multi-GPU support, because all the major engine studios (Crytek, Epic, etc.) implement the SFR feature in their engines, or it's pretty much dead.

    That's how I see it: game studios won't bother implementing it, so engine makers have to take care of it and make enabling support a matter of a click of a button in the engine.
     
  15. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,503
    Likes Received:
    509
    GPU:
    Inno3D RTX 3090
    That's why you have DX 11.3.
     

  16. haz_mat

    haz_mat Master Guru

    Messages:
    243
    Likes Received:
    1
    GPU:
    1070 FE
    I'd have to do some digging to quote someone directly, but I've seen a few interviews with engine programmers throwing around ideas for better multi-GPU scaling. I'm pretty sure one of them was with Epic and another with id, or maybe it was one of the guys who worked on Frostbite. I don't remember for sure, but it's certainly an open question and there are people who want to explore the idea.

    Unless I'm misunderstanding something, they're not stripping out the ability to use traditional methods. It looks to me like they're just adding these new multi-GPU features as non-mandatory options. I think it was the same with Mantle, where we saw a pseudo-SFR in the new Civ and traditional AFR in BF4.
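    For contrast with SFR, the traditional AFR scheduling idea mentioned above is simple enough to sketch. This is the scheduling concept only (invented function names, no driver or API code): frame N renders entirely on one GPU, rotating through the GPUs, and a GPU must have finished its previous frame before its per-frame resources can be reused.

```cpp
// Minimal AFR sketch: alternate whole frames across GPUs.
#include <cassert>

// Frame N renders on GPU (N mod gpu_count).
int afr_gpu_for_frame(long frame, int gpu_count) {
    return static_cast<int>(frame % gpu_count);
}

// Before recording `frame`, the engine must wait for the frame that
// last used the same GPU, i.e. frame - gpu_count (negative means no
// earlier frame exists yet, so there is nothing to wait for).
long afr_wait_frame(long frame, int gpu_count) {
    return frame - gpu_count;
}
```

This also shows where AFR's classic problems come from: latency grows with GPU count (you wait on a frame submitted `gpu_count` frames ago), and uneven per-frame costs show up as micro-stutter, which SFR-style splitting avoids.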


    We don't know if DX12 is going to gut the ability to force traditional AFR or 1x1 in games that don't officially support it. That is a fair question. A simpler API might make that easier to do in the driver, but we will have to wait and see.

    I honestly don't expect most devs to take advantage of the new multi-GPU features right off the bat. But there will be nothing stopping someone from trying. A lot of people seem to think it's impossible to achieve an optimal memory-pooled SFR, and maybe it is, but we will never know unless someone can at least try.

    By some accounts, this kind of multi-GPU rendering needs to be done at the engine level anyway. There's talk about splitting up scenes by object; there could be some merit to that idea. The engine knows a good deal more about the composition of the scene than the driver does, and if developers can figure out how to leverage that information, it will help in juggling the load around and minimizing cache misses. A lot of the scene is more or less static, another thing the engine knows and could take advantage of when deciding how to split it.

    I also think some clever statistical analysis of the scene could give the engine hints about the cost of rendering each object. And if each object had a pre-computed profile estimating the cost of rendering each fragment, it might help to split up an object in situations where a single actor takes up a large portion of the screen. So we could end up with each GPU holding different texture fragments at various LOD levels for the same object.
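    The per-object cost idea can be sketched as a greedy load balancer: each object carries an estimated rendering cost and is assigned to whichever GPU currently has the least accumulated work. This is only an illustration of the scheduling idea; the costs are made up, and a real engine would refine its estimates from per-frame GPU timing queries.

```cpp
// Sketch: assign objects to GPUs by estimated cost, greedily picking
// the least-loaded GPU for each object. Costs are hypothetical.
#include <cassert>
#include <cstddef>
#include <vector>

std::vector<int> assign_objects(const std::vector<double>& cost, int gpus) {
    std::vector<double> load(gpus, 0.0);          // accumulated cost per GPU
    std::vector<int> owner(cost.size());          // GPU index per object
    for (size_t i = 0; i < cost.size(); ++i) {
        int best = 0;
        for (int g = 1; g < gpus; ++g)            // find least-loaded GPU
            if (load[g] < load[best]) best = g;
        owner[i] = best;
        load[best] += cost[i];
    }
    return owner;
}
```

Greedy balancing works best when objects are fed in descending cost order; the engine is exactly the layer that has the information (visibility, LOD, material cost) to produce those estimates, which a driver never sees.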

    I don't think it's as bleak as you make it out to be. Multi-GPU under DirectX has been a hack all the way up through DX11. Tuning a driver to each game doesn't make much sense from a software point of view; the goal of the new API is to minimize the need for this, and not just for multi-GPU. If devs follow a few guidelines for how to feed their pipeline to the GPU, it should perform more predictably than before, making it easier to find what needs to be optimized for both single- and multi-GPU setups.
     
  17. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    Unreal 4 (Epic)
    id Tech 5 (id)
    Frostbite (EA)

    Those are all game engines; the options they will provide to game developers are only that: options.

    I still don't see any special interest from game developers.



    I agree; as I understand the article, they present NEW OPTIONS for multi-GPU support, not mandatory or exclusive ones.


    Yes, this part is unclear, maybe on purpose (?).

    We only need to wait a few weeks or months for the first batch of DX12 games.

    I don't see many game developers throwing money at this to use it as a showcase WITHOUT the support ($$$) of GPU makers.

    On paper it is all presented well and every piece seems to fit nicely, but I don't believe SFR, or even the main multi-GPU support in DX12, will be an on/off option for game devs to activate.

    I couldn't agree more: the ever-increasing driver size and mess, due to additions made specifically for individual games, is not practical for GPU makers or customers.

    If a game will need a LOT of hours (and $$$) to implement good multi-GPU support, game developers will consider the cost/benefit, and we all know what that means...

    Yep, GPU makers normally had to solve the multi-GPU part by themselves, or work with game developers by sending them their engineers.

    This is an absolutely inefficient solution from the GPU makers' point of view (cost) and the customers' (delayed support).

    What is the root origin of this multi-GPU mess in most cases?

    The red bold text.

    Multi-GPU support is mainly an interest of the GPU makers' camp, not the game developers' camp.

    Little benefit for game sales, big benefit for GPU sales.

    The problem will not be the guidelines themselves but the cost/benefit of following them for game developers. :(

    It's logical (to me) that game developers will not change this attitude towards multi-GPU support, and will wait for GPU makers to continue solving it by themselves or in direct collaboration with them.

    Now the problem in the new scenario is that I don't think Nvidia is sending its workforce to game developers to implement multi-GPU support for both their GPUs and AMD's. :3eyes:
     
    Last edited: May 30, 2015
  18. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,503
    Likes Received:
    509
    GPU:
    Inno3D RTX 3090
    There is nothing really "new" here. With DX12, developers will have more or less the same freedoms they would have on consoles. There will be no driver trying to force multi-GPU into the rendering pipeline of a game that doesn't support it, and no driver borking a game whose proper rendering pipeline is doing something new and fast.
    Both companies can now focus on the effectiveness of the actual driver and on adding to its features, while capable developers can do almost whatever they like with almost any combination of hardware.

    Take a look at this article.
     
  19. PF Prophet

    PF Prophet Master Guru

    Messages:
    232
    Likes Received:
    0
    GPU:
    Zotac 1070 mini/290x
    One thing to keep in mind is that this will also allow devs to take advantage of iGPUs from Intel and AMD on top of whatever standalone GPUs are in the system.

    The advantage is that lower-end, cheaper systems will be better able to play modern games.

    A lot of development houses will take that into account, because more and more people have laptops, for example, and from what I understand both AMD and Intel really want to move to a mainstream socket that always has a GPU on the package with the CPU (if not on the same die). This makes sense in a lot of ways, especially if they can get devs to leverage the iGPU's power on top of whatever standalone solution is added.
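    The usual proposal for iGPU + dGPU setups is to give the integrated GPU a cheap stage, such as post-processing, while the discrete GPU renders the next frame. A simple pipeline model shows why that helps; this sketch (invented names, made-up costs, no API code) only models the timing.

```cpp
// Timing model for offloading post-processing to an iGPU.
#include <cassert>

// Single GPU: the frame costs the render pass plus the post pass.
double serial_frame_ms(double render_ms, double post_ms) {
    return render_ms + post_ms;
}

// dGPU renders frame N while the iGPU post-processes frame N-1, so in
// steady state the frame time is the slower of the two stages.
double pipelined_frame_ms(double render_ms, double post_ms) {
    return render_ms > post_ms ? render_ms : post_ms;
}
```

So if rendering takes 12 ms and post-processing 4 ms, offloading the post pass cuts the steady-state frame time from 16 ms to 12 ms, at the cost of one extra frame of latency and a copy between the two GPUs.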

    A great example is my buddy's A10 laptop: it has an R9 290M (4 GB) as well as the video on the CPU (yeah, he's got to use modified drivers to get that to work properly to save power/heat).

    The CPU's not bad, really; better than the i3 laptop his wife has (with a 660M or whatever in it). Both are weak on the CPU side and both have iGPUs on board, but in this case AMD's iGPU is a lot more powerful. Both could be used to take some load off the CPU and primary GPU; the big problem is, at least according to devs I work with, DX/D3D doesn't really work that way. 11.x and lower really do force game devs to get creative in trying to gain performance despite the limitations and overhead.

    I get where the worry comes from, but as I understand it, drivers will still be capable of basic multi-GPU support. I would also expect every major game engine to come out with code that most devs can use or base their own builds on. Let's be honest: there are some great engines out there, and if the companies behind them want to keep or gain users, they had better offer good-to-great GPU and multi-GPU support.

    I like this move for the same reason I like Mantle: once it's finalized and drivers mature a bit, there should be a lot less of this "devs blame AMD/Nvidia, and Nvidia/AMD blame devs," since AMD and Nvidia won't really have a dog in the race. They do offer support and assistance to developers, and I see no reason they wouldn't do this with the major (and even many minor) game engine developers.

    I have been viewing the last several years as sort of a lull. This driver **** has been typical of both major video card makers since the start of the industry; they have both had HORRIBLE driver periods and GREAT driver periods.

    I've been told AMD actually hired more people to debug and work on their drivers; this could be a great move.

    IMHO, what they should do is hire a team either to debug the current drivers while the current team writes a new driver from scratch, OR have the new team make the new driver from scratch while the old team works on debugging and game profiles until the new drivers are ready.

    Either way, at least it seems like AMD's management is catching a clue that people aren't happy, so I'm guessing we will start seeing better drivers with Win10 on the way, at least from AMD. Nvidia, well, they tend to trade places with AMD back and forth, and that's fine; it keeps the industry lively. I sure wish we had more than two CPU companies and two GPU companies, but that's what you get when IP (intellectual property) is more important than advancement.
     