PCIe socket spacing - too small?

Discussion in 'General Hardware' started by D3M1G0D, Oct 8, 2017.

  1. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    So my old Athlon II system broke down again the other day and I decided that some modifications were in order. As part of those modifications, I removed the GPU - a GeForce GTX 1060 - and moved it to one of my other systems. I decided to use my X399 system (MSI X399 Gaming Pro Carbon) since it has plenty of PCIe lanes and slots (four in total, evenly spaced). Here's where the problems began.

    The thing is, this system already had two other GPUs installed, in the first and third slots. I wasn't sure where to place the new card, since I was worried that it would block airflow to the other cards (or to itself, depending on the configuration). I tried fitting it between the two existing cards, but when I began computing on it the results were atrocious - the fan was spinning at nearly full speed trying to suck air in, and it was pumping hot air directly into the card above it (the 1060 is a short card with a single fan, while the card above is dual-fan). I reconfigured things so that the 1060 sat on top, but it still struggled to breathe and I had to scrap that as well. I then tried a PCIe riser cable that I happened to have, which solved the fan and noise problems, but the GPU was no longer attached to the case or motherboard and I could no longer put the side panel on. In the end, I moved the 1060 to my old Haswell system (I have to use the riser cable there as well, but it's in another room and I don't have to look at it :p).

    So after this experience, I got to thinking - what is the point of having all those PCIe slots? It seems to me that you can only install two GPUs at most, since any additional cards will struggle to breathe. Would a blower-style design work better here? (I can't imagine it would, since it would still need room to pull air in.) I suppose liquid cooling might work, but that's awfully expensive for four cards. The irony is that this board and platform were attractive to me because they could support a lot of GPUs, but it turns out I can only use half of the slots (it almost makes me think I should have gotten the Aorus board instead, which has five slots and would allow at least three GPUs).

    I realize that most people would not have this issue since most are using a single GPU (or two at most). Just found it a bit frustrating not being able to use more than half the slots. I dunno, does anyone else think the design of these boards is a bit silly?
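    For what it's worth, the spacing complaint can be put into numbers. Below is a minimal back-of-the-envelope sketch, assuming the standard 20.32 mm (0.8") expansion-slot pitch and dual-slot cards; the slot positions used are illustrative, not this specific board's layout:

```python
# Rough clearance estimate between stacked GPUs (assumptions: standard
# 20.32 mm slot pitch; slot numbers count physical expansion slots from
# the top of a tower case; a dual-slot card in slot N also covers N+1).
SLOT_PITCH_MM = 20.32

def clearance_mm(upper_slot, lower_slot, card_width_slots=2):
    """Air gap between the upper card's fan side and the card below,
    in mm (0 means the fans sit directly against the lower card)."""
    gap_slots = (lower_slot - upper_slot) - card_width_slots
    return gap_slots * SLOT_PITCH_MM

# Dual-slot cards in adjacent x16 positions (slots 1 and 3): no gap at
# all, so the upper card's fans press against the lower card's backplate.
print(clearance_mm(1, 3))  # -> 0.0
# Skip one x16 position (slots 1 and 5): two empty slots of breathing room.
print(clearance_mm(1, 5))  # -> 40.64
```

Which lines up with the experience above: with x16 slots every other position and dual-slot coolers, every "extra" card either touches a neighbour or costs you a slot.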
     
    airbud7 likes this.
  2. user1

    user1 Ancient Guru

    Messages:
    2,782
    Likes Received:
    1,305
    GPU:
    Mi25/IGP
    If you use blower-style cards, they tend to work a lot better in confined spaces, since the heat is ejected from the case and blower fans are less affected by restricted airflow. All reference cards are blowers, and a large portion of GPUs in general have blower coolers, so the slot config is not that unreasonable.
     
  3. Guru01

    Guru01 Master Guru

    Messages:
    354
    Likes Received:
    25
    GPU:
    nVIDIA RTX 2080
    Both AMD and Nvidia have cut official multi-GPU support from four cards down to two, so that issue wouldn't matter as much on newer boards. That said, I guess there are ways to tweak that and run four cards instead of two, so you can still run into the PCIe spacing issue even on newer boards. I would think you'd be better off either with water cooling or with mini-ITX graphics cards, which might fit perfectly.
     
  4. airbud7

    airbud7 Guest

    Messages:
    7,833
    Likes Received:
    4,797
    GPU:
    pny gtx 1060 xlr8
    I like your spirit..... idle minds never get anywhere....you will find the fix Bro!
     
    Last edited: Oct 8, 2017

  5. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    Hmm, I dunno. The problem is that the card is unable to draw sufficient air in because the fan area is being blocked off by the card below it. I would assume that a blower card would also suffer if the fan area was blocked off?

    I think most consumer boards provide some space between the top two x16 slots for SLI/mGPU (usually a three-slot spacing for the top card). If they didn't, this problem would be a lot more prevalent. It just makes me wonder if I'm doing something wrong? (I've never had a board like this before.)
     
  6. user1

    user1 Ancient Guru

    Messages:
    2,782
    Likes Received:
    1,305
    GPU:
    Mi25/IGP
    It will suffer, but it will do better than an open-air cooler. Open-air coolers tend to recycle air, which gets much worse when restricted; the blower won't, though it will be a lot louder.

    I should mention, though, that if you have enough airflow from case fans, it would probably negate the advantage blowers have. Dunno how many fans you would need, though.
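    On the "how many fans" question, a common rule of thumb for forced-air cooling is CFM ≈ 1.76 × heat load (W) / allowed air-temperature rise (°C). A rough sketch (the wattages are nominal TDPs for the cards mentioned in this thread, and the 10 °C rise is an arbitrary target, not a recommendation):

```python
def airflow_cfm(watts, delta_t_c):
    """Rule-of-thumb case airflow (in CFM) needed to keep internal air
    no more than delta_t_c degrees C above ambient: CFM ~= 1.76 * W / dT."""
    return 1.76 * watts / delta_t_c

# Example load: one GTX 1060 (~120 W TDP) plus two RX 580s (~185 W each),
# all dumping heat into the case air at full compute load.
total_w = 120 + 2 * 185  # 490 W
print(round(airflow_cfm(total_w, 10)))  # -> 86 (CFM, for a 10 C rise)
```

So on the order of two or three decent 120 mm intake fans' worth of real (restricted) airflow, before accounting for filters and obstructions.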
     
  7. jura11

    jura11 Guest

    Messages:
    2,640
    Likes Received:
    707
    GPU:
    RTX 3090 NvLink
    Hi there

    As above, I would suggest going the blower-style route if you are planning to run multiple GPUs. Blower cards have front-to-back airflow and work a lot better when running in SLI or when sandwiched; aftermarket open-air GPUs don't work like blowers - they dump heat onto everything around the GPU and the slots, which can result in higher temperatures when the cards are sandwiched.

    If you are running multiple GPUs and you have the budget, then I would go the waterblock route and put them under water, which should lower temperatures considerably.

    That was my main reason for going with waterblocks when I moved to multiple GPUs. I ran a Zotac GTX 1080 AMP sandwiched with a Titan X SC fitted with a Raijintek Morpheus II cooler, and temperatures on the Zotac GTX 1080 AMP were pretty high - 76-78°C - with both cards rendering; in gaming, using only the Zotac GTX 1080 AMP, temperatures were in the low 70s with fans running at 75%.


    Hope this helps

    Thanks, Jura
     
  8. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    Yeah, noise would be an issue for me, especially since I run these GPUs at full load at all times. In fact, what bothered me most about my situation was the noise the fan was making spinning at 90%. The extra heat was also causing the other GPUs to spin their fans faster, so more noise overall. If I get the chance I might try a blower GPU, but I'm not expecting miracles.
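    As a rough illustration of why that jump in fan speed is so audible: by the textbook fan affinity laws, acoustic power rises steeply with rpm (an exponent of roughly 5, i.e. about 50·log10 in dB, is commonly quoted, though sources vary between 50 and 55). A minimal sketch under that assumption:

```python
import math

def noise_delta_db(rpm_ratio):
    """Fan-law estimate: acoustic power scales roughly with rpm^5,
    so the sound level rises by about 50*log10(new/old) dB."""
    return 50 * math.log10(rpm_ratio)

# Pushing a fan from ~50% to ~90% duty cycle:
print(round(noise_delta_db(0.9 / 0.5), 1))  # -> 12.8 (dB increase)
```

A 10+ dB increase is perceived as roughly twice as loud, which matches how dramatic a fan at 90% sounds compared to one idling along at half speed.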

    I figured that water-cooling would help here, but I don't want to invest in water blocks for these GPUs - a GTX 1060 and two RX 580s. I only water-cool high-end GPUs to make the investment worth it (my 1080 has a water block). I think water is probably the best overall solution though, and something to keep in mind for my next major overhaul.
     
  9. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,020
    Likes Received:
    4,397
    GPU:
    Asrock 7700XT
    If you want to use 3+ GPUs, it may be worth looking into the passive heatsinks found on Tesla models, like this:
    https://www.bhphotovideo.com/images...1030_0040_100_tesla_gpu_m2090_6gb_1001374.jpg
    Note the complete lack of fans - this is intentional. Normally, these GPUs are used in blade servers, where the chassis is very flat with a row of fans at the front, and the GPUs lie parallel to the motherboard. The air is supposed to pass from the back of the GPU to the front (where by "front" I mean where the I/O ports would be), whereas with "traditional" desktop GPUs, air is pushed into the GPU perpendicularly. Because of this, most desktop GPUs are terrible at cooling properly when the fans are mostly blocked off, and they spew a bunch of heat inside your chassis. If you use a heatsink like the ones on these Teslas, you could probably get a single 80mm or 90mm fan and efficiently cool all 3 GPUs simultaneously. Keep in mind that if you set this up right, 100% of the heat should flow out of the case, so you should never be recycling any heat.

    For many GPUs, the fins on their heatsink run parallel to the length of the GPU, like this one:
    https://3.bp.blogspot.com/-4dbuJIRPXic/V30krN6WOKI/AAAAAAAAFVw/Oyo0DPvYuOoyJ_CQivyBI-IGUdhWBiwzgCLcB/s1600/MSI+GTX+1070+Gaming+X+%2815%29.JPG
    If your heatsinks are like this, you should be able to simply remove the fans and maybe some of the plastic shrouding, which ought to allow air to flow from back to front mostly unobstructed. Does this make sense?

    Also, despite the "leafblower" heatsinks that AMD uses on their reference coolers being loud and somewhat crappy at cooling, what they're especially good at is cooling GPUs in cramped situations like the one you describe. Much like the Tesla heatsinks, these coolers are designed to push air along the length of the GPU rather than into it.
     
