PCIe Gen5 12VHPWR power connector to have 150W, 300W, 450W, and 600W outputs

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 3, 2022.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    47,166
    Likes Received:
    15,890
    GPU:
    AMD | NVIDIA
  2. kcajjones

    kcajjones Active Member

    Messages:
    93
    Likes Received:
    68
    GPU:
    Gigabyte GTX 470 1.28GB
    Is the industry just following USB's example of making this stupid as hell?
    One connector with many possible power ratings is a terrible idea. It's one connector, so it should be able to do everything by design. Make 600W the minimum standard and be done with it.

    Stop duping people with fake/deceptive specs. I can see it already - "PCIe Gen 5 compliant!" - with 150W connectors/cables and the poor guy/girl with a 600W GPU.
     
    Agonist, Keitosha, Maddness and 6 others like this.
  3. H83

    H83 Ancient Guru

    Messages:
    5,099
    Likes Received:
    2,610
    GPU:
    XFX Black 6950XT
    450W and 600W cables just to power GPUs seem excessive to me...

    Companies should be prioritizing energy efficiency, not pushing power draw ever higher.
     
    386SX, Keitosha, mackintosh and 5 others like this.
  4. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,693
    Likes Received:
    4,088
    GPU:
    HIS R9 290
    Agreed. While the wattage label on the connector helps, not everyone is going to know the wattage of a GPU. With the old connectors, you didn't have to know; it either fit or it didn't. This will also either drive up the cost of cheap PSUs, or they could become a potential hazard.

    The average consumer is an idiot. "hUrR dUrR it fits and the computer powers on so surely it works!" - except when they try to put the GPU under any load. I expect a lot of products will be unnecessarily returned because of this.

    Why couldn't this just be a 350W connector? Nothing more, nothing less. The PSU must deliver that wattage or else the connector shouldn't be included. At 350W, it doesn't have any overlap with the old connectors, so each one is still distinctly different. 350W is pretty much the upper limit of what can be dissipated by a dual-slot HSF card. If you need any more than 350W for a GPU, it's a stupid product, so just use a 2nd connector.
     

  5. Krizby

    Krizby Ancient Guru

    Messages:
    2,767
    Likes Received:
    1,346
    GPU:
    Asus RTX 4090 TUF
    A single cable is a better solution than some 320W GPU that comes with 3 power connectors. Are you going to use 3 separate PCIe power cables for this (no piggyback)?
    [Image: Red Devil 6900 XT power connectors]
     
  6. rl66

    rl66 Ancient Guru

    Messages:
    3,775
    Likes Received:
    785
    GPU:
    Sapphire RX 6700 XT
    The most stupid thing is to make a GPU that needs 600W when the whole world is working hard to reduce the watts used everywhere (except crypto farmers)...
    Oops... my bad... those GPUs might be made for them, at least for those that are left: if you want a sports car, a luxury sofa or even a well-placed house (everything that is bling bling), right now they sell everything again...
     
    schmidtbag likes this.
  7. H83

    H83 Ancient Guru

    Messages:
    5,099
    Likes Received:
    2,610
    GPU:
    XFX Black 6950XT
    That might be true, but the problem is that the industry is trying to take advantage of that ignorance instead of helping people by making things easier.

    A normal person shouldn't be forced to spend hours on forums and tech sites in order to make a simple, educated guess about what to buy...

    And the same way some guys are ignorant about tech-related stuff, I'm ignorant about other stuff, and I also don't like being confused by products that should all be the same but instead have small differences that in the end make all the difference!...
     
    Airbud and schmidtbag like this.
  8. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,693
    Likes Received:
    4,088
    GPU:
    HIS R9 290
    Did you not read what I said? I said this connector should be strictly 350W, because it would solve issues precisely like the example you provided.

    I'm fine with the connector existing. What I don't like is how it's going to 450W+ and supports such a wide range of wattages.
     
  9. Krizby

    Krizby Ancient Guru

    Messages:
    2,767
    Likes Received:
    1,346
    GPU:
    Asus RTX 4090 TUF
    Why would you want 2 cables when 1 cable with thicker wiring can do the job?

    The current PCIe spec says each 8-pin PCIe connector is rated for 150W, but in fact a high-quality PCIe cable can handle >300W, so piggybacking is perfectly safe (using 2x 8-pin connectors from the same cable).

    So yeah, the new PCIe spec just makes it clearer by specifying the max wattage per cable. GPU manufacturers can just put an advisory - "make sure you have a 450W or above power cable for this GPU" - instead of people having to use 2 or 3 power cables unnecessarily (sure it's safer, but a clunky-looking setup).
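    A rough sanity check of the ">300W per cable" claim, assuming typical Mini-Fit Jr style terminals on 16AWG wire (the per-pin current rating below is an assumption, not a spec value):

    ```python
    # Ballpark of what an 8-pin PCIe power connector can physically carry versus
    # the 150W the spec guarantees. The 8A-per-pin figure is an assumed,
    # conservative terminal rating, not something taken from the PCIe CEM spec.
    VOLTS = 12.0        # +12V rail
    PINS_12V = 3        # an 8-pin PCIe connector has three +12V pins
    AMPS_PER_PIN = 8.0  # assumed conservative per-terminal rating

    physical_ballpark_w = VOLTS * PINS_12V * AMPS_PER_PIN
    print(f"physical ballpark: {physical_ballpark_w:.0f}W vs. spec guarantee: 150W")
    # physical ballpark: 288W vs. spec guarantee: 150W
    ```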
     
    Last edited: Mar 3, 2022
  10. nevcairiel

    nevcairiel Master Guru

    Messages:
    863
    Likes Received:
    362
    GPU:
    4090
    Assuming you still want the same wattages, you would be stuck with 4 different connectors, though, and your PSU either has to have a whole load of extra cables or adapters - and adapters are generally not ideal, as any connection point is a point of failure and also of power loss.
    And just enforcing support for the full 600W obviously doesn't work, or you are automatically killing all PSUs with lower total power.

    Even if you enforce only 350W as suggested above, it still means you need another low-power connector, as a guaranteed 350W is still too much for e.g. a 500W PSU, which is plenty for a small system with something like a 150W GPU in it.
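    To put illustrative numbers on that point (the non-GPU figure here is a rough assumption, not from the article):

    ```python
    # If the single connector had to guarantee 350W, a 500W PSU would have almost
    # nothing left over once the rest of the system is budgeted for.
    psu_w = 500
    connector_guarantee_w = 350
    rest_of_system_w = 150   # assumed rough budget for CPU, board, drives, fans

    headroom_w = psu_w - connector_guarantee_w - rest_of_system_w
    print(f"headroom: {headroom_w}W")  # headroom: 0W -> a lower-power tier is needed
    ```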
     

  11. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,693
    Likes Received:
    4,088
    GPU:
    HIS R9 290
    Again: are you not reading what I said? I'm saying it should be a single 350W connector.
     
  12. Krizby

    Krizby Ancient Guru

    Messages:
    2,767
    Likes Received:
    1,346
    GPU:
    Asus RTX 4090 TUF
    Your particular usage scenario might not fit others; there are plenty of AIB 6900 XTs and 3090s out there that already use >420W at stock settings.
     
  13. WeRoRa

    WeRoRa Member

    Messages:
    46
    Likes Received:
    25
    GPU:
    GTX 1070 8gb
    Wait, this connector is actually interesting, because modded lower-tier GPUs won't run into a power bottleneck from having only one 6-pin anymore: with this connector they can pull up to 600 watts through a single plug. You just need to ground the 2 sense pins. Quite exciting.
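    For context on the sense-pin remark: the 12VHPWR sideband uses two sense pins whose grounded/open state tells the card how much power the cable/PSU combination is rated for. The mapping below is the commonly reported one and is only a sketch; check the actual PCIe CEM 5.0 / ATX 3.0 documents before modding anything.

    ```python
    # Commonly reported 12VHPWR sense-pin mapping (treated here as an assumption).
    # Each sense pin is either tied to ground (True) or left open (False).
    SENSE_TO_WATTS = {
        (True,  True):  600,  # SENSE0 grounded, SENSE1 grounded
        (True,  False): 450,  # SENSE0 grounded, SENSE1 open
        (False, True):  300,  # SENSE0 open,     SENSE1 grounded
        (False, False): 150,  # SENSE0 open,     SENSE1 open
    }

    def initial_power_limit(sense0_grounded: bool, sense1_grounded: bool) -> int:
        """Return the initial permitted power (watts) the card may draw."""
        return SENSE_TO_WATTS[(sense0_grounded, sense1_grounded)]

    print(initial_power_limit(True, True))  # 600 -> both sense pins grounded, as in the post
    ```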
     
  14. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,693
    Likes Received:
    4,088
    GPU:
    HIS R9 290
    I'm well aware. I think such situations are stupid and should be avoided. If you need more than 350W, then you should use additional power connectors. If a single connector for 450W+ systems is really that desirable, then make a new connector.

    Chip manufacturers should be motivated to attain performance levels within a certain power envelope. If they just say "screw it" and move the goal posts, that doesn't drive innovation; it promotes laziness among the engineers and it's a cop-out for whichever company claims to have the fastest GPU. A GPU being the fastest isn't impressive when it consumes more power than every other electronic device in the room combined.
     
    H83 likes this.
  15. Krizby

    Krizby Ancient Guru

    Messages:
    2,767
    Likes Received:
    1,346
    GPU:
    Asus RTX 4090 TUF
    Well, if the 600W GPU has the same efficiency (FPS/watt) as the 300W one, why should you care? I can't handle a >400W GPU either because I live in a tropical climate, but who am I to tell others not to use a >400W GPU when they live in Alaska, for example :D?

    Hell, I have seen lots of "enthusiast" quad SLI/Xfire setups that used 1000W in total in the past.
     

  16. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,688
    Likes Received:
    959
    GPU:
    GTX 1070
    I personally hate it, but to each their own. I've also done SLI/Xfire, and living in Texas during the summer it would suck - I needed an air conditioner added to my room to make it bearable. I'm not looking to do that again, so I plan to keep my GPUs under 250 watts going forward.
     
  17. Stairmand

    Stairmand Master Guru

    Messages:
    371
    Likes Received:
    184
    GPU:
    RTX3090 Aorus Maste
    A fine solution to a problem we didn't really have.
     
    Agonist likes this.
  18. Astyanax

    Astyanax Ancient Guru

    Messages:
    16,316
    Likes Received:
    6,882
    GPU:
    GTX 1080ti
    Except the problem has existed for years; you've just been ignorant of it.
     
  19. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,693
    Likes Received:
    4,088
    GPU:
    HIS R9 290
    It isn't the same efficiency. Processor efficiency is not linear with clock speed. Processors tend to really lose efficiency once you push them past a certain point; they also lose efficiency when you underclock them below a certain point. Architectures have different "sweet spots" for efficiency. That's why Intel struggles to compete with ARM's performance-per-watt and vice versa. They both have the same exact problem, except x86 scales down poorly and ARM scales up poorly.
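    To make that non-linearity concrete, here is a rough sketch using the textbook dynamic-power relation P ≈ C·V²·f; the voltage and frequency figures are invented for illustration, but they show why chasing the last few hundred MHz costs disproportionate power:

    ```python
    # Illustration of why efficiency falls off past the "sweet spot": dynamic power
    # scales roughly with C * V^2 * f, and higher clocks need higher voltage, so
    # power grows faster than performance. All numbers below are invented.
    def dynamic_power(cap: float, volts: float, freq_ghz: float) -> float:
        return cap * volts ** 2 * freq_ghz  # arbitrary units

    sweet_spot = dynamic_power(cap=1.0, volts=0.90, freq_ghz=2.0)
    pushed     = dynamic_power(cap=1.0, volts=1.10, freq_ghz=2.4)

    print(f"+{(2.4 / 2.0 - 1) * 100:.0f}% clock costs "
          f"+{(pushed / sweet_spot - 1) * 100:.0f}% more power")
    # +20% clock costs +79% more power
    ```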
    If you want a 400W+ GPU, fine, but the underlying point is standards should not be bent to satisfy either a niche need or to be an excuse for lazy product releases. It's not that big of a deal to have a 2nd power connector. It is a big deal when companies make excuses to ignore power consumption.

    The quad SLI/Xfire setups would require many power connectors too, because they're all separate GPUs. Clearly, enthusiasts didn't have much of a problem with that back then.
     
  20. Krizby

    Krizby Ancient Guru

    Messages:
    2,767
    Likes Received:
    1,346
    GPU:
    Asus RTX 4090 TUF
    Nah, as long as it's a monolithic design, a bigger GPU will have almost the same efficiency as a smaller chip from the same uarch.

    Like here
    [Chart: GPU energy efficiency comparison]

    Except for Ampere (well, due to how inefficient GDDR6X is), the biggest chips like TU102 and Navi 21 actually have superior efficiency compared to the smaller chips.

    Also, GPU power consumption is not whole-PC power consumption, which is the only thing that matters. Let's say a PC with a 3080 that uses 500W total is 30% faster than one with a 3070 that uses 400W - can you really say that the PC with the 3080 is inefficient?
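    Working that example through with the same (hypothetical) numbers shows the faster build actually has the better whole-system performance per watt:

    ```python
    # Whole-system FPS per watt for the hypothetical comparison above.
    baseline_fps, baseline_watts = 100.0, 400.0  # 3070-class build (assumed 100 FPS)
    faster_fps,   faster_watts   = 130.0, 500.0  # 3080-class build, +30% FPS (assumed)

    print(f"baseline: {baseline_fps / baseline_watts:.3f} FPS/W")  # 0.250
    print(f"faster:   {faster_fps / faster_watts:.3f} FPS/W")      # 0.260
    ```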

    If you worry about power consumption that much, maybe consider switching to a Steam Deck?
     
    Last edited: Mar 3, 2022
