Only GeForce RTX 2080 and 2080 Ti to support NVLink - No Multi GPU for 2070?

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 21, 2018.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    44,720
    Likes Received:
    11,384
    GPU:
    AMD | NVIDIA
  2. HardwareCaps

    HardwareCaps Master Guru

    Messages:
    452
    Likes Received:
    154
    GPU:
    x
    Sounds fine to me. SLI/CFX have long been dead. The only situation where it's acceptable is the top-end cards, where if you need more performance you basically have nothing to upgrade to.
    Finally, it's dead.
     
  3. Srsbsns

    Srsbsns Member Guru

    Messages:
    188
    Likes Received:
    54
    GPU:
    RX Vega 64 Liquid
    No SLI on the 1060, and now they move up a line to the 2070. Makes me wonder if the 1080 Ti is really just a Titan and they are playing musical chairs with the card names. Usually the Ti never comes out at launch.
     
  4. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    13,511
    Likes Received:
    6,342
    GPU:
    2080Ti @h2o
    Honestly, I'd be more interested in how that new NVlink really behaves than what card is compatible.
     

  5. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,621
    Likes Received:
    2,238
    GPU:
    Zotac GTX980Ti OC
    Lol, they know 2x 2070 would kick the 2080 Ti's ass and be cheaper doing it... 12 TFLOPS tensor perf vs 10 for the 2080 Ti says it all.
     
  6. tunejunky

    tunejunky Ancient Guru

    Messages:
    2,666
    Likes Received:
    1,324
    GPU:
    RX6900XT, 2070

    mind reader :D
     
    airbud7 and -Tj- like this.
  7. Anarion

    Anarion Ancient Guru

    Messages:
    13,605
    Likes Received:
    384
    GPU:
    GeForce RTX 3060 Ti
    Once you factor in all the negative side effects, compatibility issues and less-than-perfect scaling, it would probably match the 2080 Ti at best. I don't see why anyone would be stupid enough to buy 2x 2070s when the 2080 Ti is available.
     
  8. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,621
    Likes Received:
    2,238
    GPU:
    Zotac GTX980Ti OC
    I would be stupid enough, enough to save 200-300€ and still have better perf. Why wouldn't tensor core perf scale properly? That part is independent from normal rendering.
     
  9. AsiJu

    AsiJu Ancient Guru

    Messages:
    8,134
    Likes Received:
    2,882
    GPU:
    MSI RX6800XT G.XT.
    Yep, sounds about right.

    Considering:
    - the ludicrous release pricing for 2080 Ti (the 2080 isn't exactly affordable either)
    - likely marginal gains over Pascal in performance (to be confirmed, but I wouldn't expect miracles given the number of CUDA cores)

    this Turing release starts to look like NV throwing fancy names around with RTX, charging people for that and kind of seeing what sticks.

    Wouldn't be surprised if they release GTX 2080 series later with slightly reduced price, without RT or Tensor (?) cores but practically the same performance in non-RT games.

    From what was seen in Tomb Raider RTX demo, enabling RT cripples performance even with 2080 Ti.
    Welcome to the new generation of cinematic gaming, where you can do Full HD at 30 fps!

    Somehow this Turing release starts to look less appealing by the day...

    (granted, any kind of ray-tracing at even 30 fps is impressive as such, but how many would really sacrifice half their frame rate for RT?)
     
    Dragondale13 and -Tj- like this.
  10. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,192
    Likes Received:
    1,459
    GPU:
    2070 Super

    Tensor cores are not the problem. They are used in purely computational tasks and will scale beautifully. SLI rendering, OTOH...
    The 2080 Ti is the way to go. But not for 1300 euros.

    I would give 1000 euros for one. It's a good deal, Nvidia; you should take it
    :)
     
    cowie likes this.

  11. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,370
    GPU:
    6900XT+AW@240Hz
    While that's correct, NVLink does not turn two physical cards into one logical one. Therefore such a user is still at the mercy of multi-GPU support in the game.
    And if they are used just for those new compute features, then CUDA/OpenCL/DirectCompute couldn't care less about some bridge between the cards, as each will open a new context per GPU.
     
  12. caldas

    caldas Member

    Messages:
    46
    Likes Received:
    17
    GPU:
    Galax 2080Ti
    "could also offer some sort of rudimentary support linked over the PCIe bus"
    So... AMD with the Radeon R9 series already has a rudimentary link :p
     
  13. cowie

    cowie Ancient Guru

    Messages:
    13,276
    Likes Received:
    353
    GPU:
    GTX
    Is NVLink a monthly subscription? :p
     
    -Tj- and airbud7 like this.
  14. The Goose

    The Goose Ancient Guru

    Messages:
    2,976
    Likes Received:
    320
    GPU:
    MSI Rtx3080 SuprimX
    Perhaps people that don't have Ti sort of money burning a hole in their pockets.
     
  15. Elder III

    Elder III Ancient Guru

    Messages:
    3,731
    Likes Received:
    333
    GPU:
    6900 XT Nitro+ 16GB
    If it's true that enabling "Ray Tracing" in Tomb Raider smashes an RTX 2080 Ti down to ~35 FPS at a mere 1920x1080 resolution, then well --- screw that. I haven't gamed at 1080p in 6-7 years, and I don't consider any kind of action or FPS game playable at 35 fps either. :rage:
     

  16. southamptonfc

    southamptonfc Ancient Guru

    Messages:
    2,303
    Likes Received:
    354
    GPU:
    GB 3080ti Vision OC
    Nvidia have been engineering the market and pricing of their cards for a few years now to force people into buying more expensive cards. People used to get an x70 card and then get another in a couple of years' time, either new or second-hand. I think Nvidia wanted to stamp that out.

    If Nvidia wanted to, they could work with game developers to get SLI working, but IMHO they have decided it's not in their interest to do so.
     
    Dragam1337 likes this.
  17. IchimA

    IchimA Maha Guru

    Messages:
    1,271
    Likes Received:
    220
    GPU:
    G1 1070
    @southamptonfc : Yeah, I remember the time when a **70 card was 300-350 euros! Now it's just too much for some of us.
     
  18. The Goose

    The Goose Ancient Guru

    Messages:
    2,976
    Likes Received:
    320
    GPU:
    MSI Rtx3080 SuprimX
    I might have gone for the 2070 at some point, but if there is no link... no money. My EVGA 1080 FTW is going to last to the grave; PC gaming has always been a hobby for me, not a luxury.
     
  19. MegaFalloutFan

    MegaFalloutFan Master Guru

    Messages:
    990
    Likes Received:
    186
    GPU:
    RTX3090
    None of you watched Nvidia's professional presentation a week before? I only watched the ending, and the only thing I remember from it is that this new NVLink allows the video cards to share their VRAM and effectively double it.
    That's why it's not on the 2070: it's a new generation of SLI that will scale much better and have unified VRAM.
     
  20. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,370
    GPU:
    6900XT+AW@240Hz
    At slug speed... sharing 1GB of VRAM will kill performance horribly even through NVLink. Take the NVLink speed, calculate the time needed to transfer that 1GB of data to the GPU on the other PCB, and you have the amount of time added to each frametime.
    fps = 1/frametime

    Say, for convenience, just 1GB of memory is shared.
    A 60GB/s link delivers that 1GB 60 times per second, i.e. in 0.016667s. If the GPU renders 200fps from its own VRAM alone, one frame has a rendering time of 0.005s.
    Make it wait for that 1GB of data and your frametime is 0.005s + 0.016667s = 0.021667s. That equals 46.15fps.
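    That arithmetic can be sketched in a few lines of Python. Note this is only the post's back-of-envelope model: the 60GB/s link and 1GB shared figures are its illustrative numbers, not measured NVLink behavior, and it assumes the full chunk is re-transferred every frame.

    ```python
    # Back-of-envelope model from the post above: each frame stalls while
    # the shared VRAM chunk crosses the inter-GPU link.

    def effective_fps(base_fps: float, shared_gb: float, link_gb_s: float) -> float:
        """Frame rate after adding the link-transfer time to each frametime."""
        render_time = 1.0 / base_fps           # seconds per frame from local VRAM
        transfer_time = shared_gb / link_gb_s  # seconds to move the shared data
        return 1.0 / (render_time + transfer_time)

    # The post's numbers: 200 fps baseline, 1 GB shared, 60 GB/s link.
    print(round(effective_fps(200.0, 1.0, 60.0), 2))  # 46.15
    ```

    Doubling the link speed in this model only recovers part of the loss, which is the point: the transfer time dominates the 0.005s render time.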
     