NVIDIA Ends SLI Support and is Transitioning to Native Game Integrations

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 18, 2020.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,324
    Likes Received:
    18,405
    GPU:
    AMD | NVIDIA
  2. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    The end of an era. I mean, SLI has been pretty much dead for a while now, but damn did it look cool back in the day having two 8800 GT cards in my build.
     
  3. Netherwind

    Netherwind Ancient Guru

    Messages:
    8,813
    Likes Received:
    2,396
    GPU:
    GB 4090 Gaming OC
    The question is how easy or difficult it is for game devs to include it in their games? Most likely not worth it, but who knows.
     
  4. asturur

    asturur Maha Guru

    Messages:
    1,371
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    Once it was automatic: two cards, nearly 200% performance.
    At some point engines changed and it no longer worked well :(
     
    DeskStar likes this.

  5. asturur

    asturur Maha Guru

    Messages:
    1,371
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    The fact that the 3080 does not have an SLI connector is very bad imho.
    I understand pushing the 3090 more, but knowing that two 3080s would make the 3090 useless, and stopping that... seems bad.
     
  6. zhalonia

    zhalonia Active Member

    Messages:
    99
    Likes Received:
    11
    GPU:
    r9 290 4gb
    I feel bad for a friend who just bought a second 1080 Ti XD
     
    Deleted member 213629 likes this.
  7. Undying

    Undying Ancient Guru

    Messages:
    25,206
    Likes Received:
    12,611
    GPU:
    XFX RX6800XT 16GB
    I remember the time when NVIDIA and ATI were saying multi-GPU is the future.
     
  8. This (7800GT) man... I still remember the ad and the man with his little jetpack backpack
     
    DeskStar and Undying like this.
  9. geogan

    geogan Maha Guru

    Messages:
    1,266
    Likes Received:
    468
    GPU:
    4080 Gaming OC
    It - the use of multiple GPUs to speed up framerate - is not dead. The requirement of the SLI bridge connector is dead. Game devs need to implement multi-GPU support directly into their code while writing their game engine. Probably not easy. But IMO should be compulsory for VR related games engines and code, so we could use one GPU-per-eye to get decent framerates in certain tough applications. Probably not as critically needed for desktop.

    Anyway, as always it suits Nvidia for it not to be used - they sell more expensive high-end cards that way.
     
  10. DeskStar

    DeskStar Guest

    Messages:
    1,307
    Likes Received:
    229
    GPU:
    EVGA 3080Ti/3090FTW
    No it wouldn't. 10 GB of VRAM already makes the 3080 useless....

    It's going to be obvious when playing any games that demand a lot of VRAM.

    Get these reviewers to use The Division 2 as a benchmark. Gone will be that 10 GB of VRAM.

    Run Neon Noir from CryEngine. It chews up more VRAM than you've got.
     

  11. DeskStar

    DeskStar Guest

    Messages:
    1,307
    Likes Received:
    229
    GPU:
    EVGA 3080Ti/3090FTW
    I would too.....!
     
    Deleted member 213629 likes this.
  12. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    The Division 2 allocates 80% of the GPU's total VRAM (the first game did that at 70%, I think), so it's not much of a measurement for actual consumption without also checking the real usage.

    Pretty sure Vulkan or D3D12 in multi-GPU mode could also combine the totals, but that would require the developer to utilize the support - which I think is part of D3D12 and has been used in what, Ashes of the Singularity back early on, and then what?
     
    Deleted member 213629 likes this.
  13. Ricardo

    Ricardo Member Guru

    Messages:
    165
    Likes Received:
    113
    GPU:
    1050Ti 4GB
    Not sure if I'm misunderstanding, but it seems like they're actually doing the proper thing for SLI to continue? I mean, instead of manually tweaking each profile for a game, they're saying "the major APIs have native support, so let's just use that".

    It's equivalent to them saying "we're dropping RTX in favor of native RT implementations that exist inside DX12 and Vulkan".

    How is that bad news?
     
  14. Pepehl

    Pepehl Member

    Messages:
    33
    Likes Received:
    13
    GPU:
    RTX 4080
    That's bad news... I was going to buy two 20 GB 3080s and NVLink them to get 40 GB of VRAM for rendering...
     
  15. I mean, in essence, it really isn't bad news per se. NVIDIA (or AMD, for that matter) has been devoting portions of its budget to doing heavy lifting that dev studios technically could have been doing themselves, or that their publishers could have set aside separate teams to do in tandem: the technical polishing of games, complementing studios that had more "critical" things to do than integrating niche hardware features. Not every video game studio has the capacity for that kind of optimization.

    Many don't, and they drop features that aren't considered popular - which is why AMD and NVIDIA supported them up until now: they were selling points for their products, even if for the studios they weren't (consoles have been). This is when you usually see a complementary studio hired to do "touch-up work". In the past this would have been less of an issue, but "crunch time" is probably a term all of you have heard at some point. It has always been there, but never at the rate it is today; there's ever-increasing pressure to turn out games and to respond to big-name publishers with hard launch dates.

    Content is also produced at a fidelity that sometimes rivals what you see in Hollywood. Creating it can be a painstaking process, and optimizing it can be just as difficult. For a long time, due to increasing needs and demands, studios have refused to develop for SLI or Crossfire; it took additional time and extra work on top of an already rigorous process. It's not uncommon for NVIDIA or AMD to offer hardware or employees to assist with some of this - it's happened, and in turn a request for marketing integration was the ask in return. There were also requirements to support these technologies (SLI/Crossfire) at the game-engine level.

    Now with DX12 there's native support for "hardware-agnostic" multi-GPU scaling; it only needs to be "turned on" by whatever is using it. As opposed to its older variants, VRAM isn't mirrored, it's combined (if I recall right), and a few other things are different too. Some cool changes. Anyway... the point is that with Vulkan (born from Mantle) and DX12 having been developed, and AMD and NVIDIA having worked heavily to bring them to fruition, the goal was to place responsibility on developers/publishers and streamline development on bare-metal APIs.
     
    Last edited by a moderator: Sep 18, 2020

  16. Pepehl

    Pepehl Member

    Messages:
    33
    Likes Received:
    13
    GPU:
    RTX 4080
    VRAM is an issue - NVLink creates a shared pool; without it, each GPU will need to load everything into its own VRAM.
     
  17. Astyanax

    Astyanax Ancient Guru

    Messages:
    16,998
    Likes Received:
    7,340
    GPU:
    GTX 1080ti
    The 3080 didn't have SLI in the first place.
     
    DeskStar likes this.
  18. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,955
    Likes Received:
    4,336
    GPU:
    HIS R9 290
    Came here to say the same thing. I think it'd be nice if Nvidia could offer users the chance to force-enable mGPU support (and maybe they will) but if Nvidia is working together with developers, that really ought to yield a better outcome than we've seen before.

    If the game itself is talking directly to the GPUs, it can more efficiently manage resources. There's no need to copy all game data into both GPUs if they're not processing the same thing. With traditional mGPU setups, the drivers did all the work, so the only reliable way to split the workload was for each GPU to share pretty much all the same resources as the others.
     
  19. Anarion

    Anarion Ancient Guru

    Messages:
    13,599
    Likes Received:
    386
    GPU:
    GeForce RTX 3060 Ti
    Only 3090 has NVLink support though.
     
  20. H83

    H83 Ancient Guru

    Messages:
    5,443
    Likes Received:
    2,982
    GPU:
    XFX Black 6950XT
    It's bad news because studios have said since the beginning of DX12 that they wouldn't waste time and money catering to the small crowd who uses SLI/Crossfire, and with Nvidia stating that they will no longer support it either, SLI is dead and buried. Of course, this is just a confirmation of what we already knew since the introduction of DX12.

    For me it makes no difference because I only use a single GPU, but for those who like mGPU solutions it's bad news.
     
    Maddness likes this.
