
PCIe 4.0 feature pops up in X470 Motherboard BIOS

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 20, 2019.

  1. mat9v9tam

    mat9v9tam Member

    Messages:
    35
    Likes Received:
    4
    GPU:
    MSI GTX970M @C1330/M2705
    I wonder if someone will make a splitter/translator (to convert PCIe 4.0 signalling to PCIe 3.0) that would let us connect two x16 PCIe 3.0 GPUs to one x16 PCIe 4.0 slot on X370/X470 boards. The BIOS already allows splitting the signal across slots 1 and 2 as x8/x8, and the "translator" card would sit in the first slot, keeping the traces short...
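    For context, a rough back-of-the-envelope sketch of the bandwidth arithmetic behind that idea (not from the thread; the per-lane figures are the usual published PCIe rates after encoding overhead), showing that one PCIe 4.0 x16 uplink carries about as much as two PCIe 3.0 x16 links:

        #include <cstdio>

        // Approximate per-direction throughput per lane, after encoding overhead:
        //   PCIe 2.0:  5 GT/s,  8b/10b    -> ~0.50  GB/s per lane
        //   PCIe 3.0:  8 GT/s,  128b/130b -> ~0.985 GB/s per lane
        //   PCIe 4.0: 16 GT/s,  128b/130b -> ~1.97  GB/s per lane
        int main() {
            const char*  gen[]       = {"2.0", "3.0", "4.0"};
            const double lane_gbps[] = {0.50, 0.985, 1.97};   // GB/s per lane
            for (int g = 0; g < 3; ++g)
                std::printf("PCIe %s  x16: ~%4.1f GB/s   x8: ~%4.1f GB/s\n",
                            gen[g], 16 * lane_gbps[g], 8 * lane_gbps[g]);
            // A 4.0 x16 uplink (~31.5 GB/s) matches two 3.0 x16 links (~15.8 GB/s
            // each), which is what a 4.0-to-3.0 switch card would rely on.
            return 0;
        }

    Whether anyone builds such a switch card for consumer boards is another question; the arithmetic only shows that the 4.0 uplink itself would not be the bottleneck.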
     
  2. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    10,325
    Likes Received:
    2,568
    GPU:
    1080Ti @h2o
    Well, the same question applies to you, honestly: who cares if you benefit when 90% of people won't? Just saying, even the gurus here aren't the main buyer segment, we're "enthusiasts". :)
    Yes, that would make sense. I only hope support for mGPU configs gets better, since it's practically non-existent for gamers at this point (with DX12, that is).
     
  3. Alessio1989

    Alessio1989 Maha Guru

    Messages:
    1,212
    Likes Received:
    155
    GPU:
    .
    Good news, although most current GPUs are not limited by an x16 2.x / x8 3.x link...
     
  4. Astyanax

    Astyanax Ancient Guru

    Messages:
    1,790
    Likes Received:
    438
    GPU:
    GTX 1080ti
    lol, yes they are.
     

  5. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    3,994
    Likes Received:
    1,017
    GPU:
    HIS R9 290
    No, they aren't. Only a handful of GPUs can max out 2.0 @ x16 or 3.0 @ x8.
     
  6. Astyanax

    Astyanax Ancient Guru

    Messages:
    1,790
    Likes Received:
    438
    GPU:
    GTX 1080ti
    All mid-to-high-end GPUs since Kepler can max out PCIe 2.0 and 3.0 x16 links under complex workloads.
     
  7. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    14,401
    Likes Received:
    928
    GPU:
    Nvidia Geforce GTX 960M
    fantaskarsef and schmidtbag like this.
  8. Undying

    Undying Ancient Guru

    Messages:
    11,027
    Likes Received:
    957
    GPU:
    Aorus RX580 XTR 8GB
    PCIe 4 would benefit multi-GPU systems. Too bad they're dead.
     
  9. Alessio1989

    Alessio1989 Maha Guru

    Messages:
    1,212
    Likes Received:
    155
    GPU:
    .
    Talking about games, most current GPUs gain no more than 1-2% in most AAA titles, and most of the time the improvement is 0% or just system noise.
    Multi-GPU is mostly limited by latency, not bandwidth. It isn't dead either; AMD and NVIDIA simply (and wisely) decided to limit linked-node adapter support to 2 adapters (i.e. CrossFire and SLI are limited to 2 cards), but low-overhead/low-abstraction APIs like Direct3D 12 and Vulkan don't need a linked-node adapter to take advantage of multiple GPUs at all.
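    As a minimal sketch of the "no linked-node adapter needed" point: under plain DXGI/D3D12 an application can enumerate every physical adapter and create an independent device on each one, with no SLI/CrossFire configuration involved (assumes Windows with the D3D12 runtime, linking dxgi.lib and d3d12.lib; error handling trimmed):

        #include <windows.h>
        #include <dxgi.h>
        #include <d3d12.h>
        #include <wrl/client.h>
        #include <cstdio>

        using Microsoft::WRL::ComPtr;

        int main() {
            ComPtr<IDXGIFactory1> factory;
            if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

            ComPtr<IDXGIAdapter1> adapter;
            for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
                DXGI_ADAPTER_DESC1 desc{};
                adapter->GetDesc1(&desc);
                if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip WARP

                // Each physical GPU gets its own independent D3D12 device; the
                // application decides how to split work between them.
                ComPtr<ID3D12Device> device;
                if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                                IID_PPV_ARGS(&device))))
                    std::printf("Adapter %u usable for D3D12: %ls\n", i, desc.Description);
            }
            return 0;
        }

    The hard part Alessio1989 raises, actually sharing memory resources between those devices, is a separate problem from simply driving them.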
     
  10. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    3,994
    Likes Received:
    1,017
    GPU:
    HIS R9 290
    Isn't [modern] Crossfire more heavily dependent on bandwidth? That uses the PCIe bus for cross-communication between the GPUs, whereas SLI uses a discrete hardware bridge.
     

  11. Alessio1989

    Alessio1989 Maha Guru

    Messages:
    1,212
    Likes Received:
    155
    GPU:
    .
    It depends on the individual workload. Of course more bandwidth is always welcome, but the hardest part, which is not solved at all on today's hardware, is accessing a memory resource that is resident on another physical adapter. This is why most multi-GPU rendering techniques are restricted to duplicating the rendering resources (buffers, vertices and indices, textures) on both GPUs and either using AFR or using one GPU for the main rendering and the second for heavy post-processing. Again, higher bandwidth can help and is always welcome, but it's not the solution to improving multi-GPU performance and ease of programming.
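    To make the duplication point concrete, a purely illustrative sketch (hypothetical stand-in types, no real graphics API) of why AFR sidesteps cross-adapter memory access: each GPU keeps its own copy of the scene resources and simply takes every other frame:

        #include <cstdio>
        #include <vector>

        // Stand-in for a physical adapter holding its own copy of the assets.
        struct Gpu {
            int id;
            std::vector<int> localSceneCopy;   // duplicated buffers/textures
        };

        int main() {
            std::vector<int> scene = {1, 2, 3};       // stand-in for scene assets
            Gpu gpus[2] = {{0, scene}, {1, scene}};   // duplicated on both GPUs

            for (int frame = 0; frame < 6; ++frame) {
                // Alternate-frame rendering: frame N goes to GPU N % 2, and each
                // GPU reads only its local copy, never the other adapter's memory.
                Gpu& g = gpus[frame % 2];
                std::printf("frame %d rendered on GPU %d from its local copy\n",
                            frame, g.id);
            }
            return 0;
        }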
     
  12. JamesSneed

    JamesSneed Master Guru

    Messages:
    509
    Likes Received:
    163
    GPU:
    GTX 1070
    There are certain workloads, like cuDNN and Caffe, that show large differences between PCIe 3.0 x8 and x16. However, I don't know of any significant difference in gaming, which is probably what 99.99% of anyone reading this forum cares about. PCIe 4.0 will likely be a boon for reducing latency thanks to its faster signalling, which will have some fringe use cases on top of the obvious bandwidth increase. Even with an RTX 2080 Ti you only lose about 2% gaming performance moving down from PCIe 3.0 x16 to x8. You can see significant bottlenecks at PCIe 3.0 x8 when you SLI two RTX 2080 Tis without the NVLink bridge, i.e. over PCIe, but that is more theoretical than practical since nobody would do that in the real world.
     
