Here is the new 12-pin NVIDIA PCIe power connector in more detail, much smaller than you think it is

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 27, 2020.

  1. Cplifj

    Cplifj Active Member

    Messages:
    87
    Likes Received:
    22
    GPU:
    290X
Sure, that is the reason they used 2x 8-pins in the past. Nvidia is just throwing out safety guidelines to save a buck. But yeah, fanboys... are not electrical engineers.
     
    ocsystem likes this.
  2. jbscotchman

    jbscotchman Ancient Guru

    Messages:
    5,795
    Likes Received:
    4,603
    GPU:
    MSI 1660 Ti Ventus
This will definitely become the standard eventually. It's just like back when we had to use 4-pin Molex to 6-pin PCIe adapters. It was a pain, but look where we are now.
     
  3. alanm

    alanm Ancient Guru

    Messages:
    10,140
    Likes Received:
    2,298
    GPU:
    Asus 2080 Dual OC
Cringe comment :rolleyes:. Pretty sure Nvidia knows more about "safety guidelines" than some random forum poster such as yourself. Also, "fanboys" is a dumb term to use in the context of your comment. It says more about you than the people you are addressing.
     
    sykozis, Noisiv and Aura89 like this.
  4. ocsystem

    ocsystem Member Guru

    Messages:
    148
    Likes Received:
    9
    GPU:
    MSI1070Ti
2x 8-pin wasn't enough, Nvidia? Also, the smaller the connector, the greater the chance of melting.
     

  5. Bobdole776

    Bobdole776 Member

    Messages:
    25
    Likes Received:
    4
    GPU:
    zotac amp extreme 1080ti


Same. It looks small and compact, and we do away with that crap 4-pin add-on that you would pray would push into the socket and seat correctly each time. As long as I get an adapter that works I'll be happy, because I'd hate, hate, hate to drop my 1300W Antec Pro Platinum Plus PSU, which is still top-tier as PSUs go and has plenty of life left in it.
     
  6. Fender178

    Fender178 Ancient Guru

    Messages:
    4,184
    Likes Received:
    207
    GPU:
    GTX 1070 | GTX 1060
Also, from watching a GN video, I heard that the 12-pin connector is for the FE cards only, and the AIB partners can choose whether to use the 12-pin connector or whatever else they want, as long as it is within spec. So the AIB partners can use 2x 8-pin or 3x 8-pin if they desire.
     
  7. Denial

    Denial Ancient Guru

    Messages:
    13,323
    Likes Received:
    2,823
    GPU:
    EVGA RTX 3080
The connector is a Molex Micro-Fit 3.0 connector that's been in use since 2014:

    https://www.molex.com/molex/products/part-detail/crimp_housings/0430251200

    The specification on the connector:

    http://www.farnell.com/datasheets/2363764.pdf
The one Nvidia is using (12-pin) can handle up to 600W of power within spec, far above the 300-350W that Nvidia is looking for. But yeah, it's totally going to melt at half load. I'm sure Nvidia's actual electrical engineers, and not armchair forum enthusiasts, didn't think of that.
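    For what it's worth, here's a minimal back-of-the-envelope sketch of where a figure around 600W can come from, assuming six of the twelve contacts carry +12V and a per-contact rating of roughly 8.5A from the Micro-Fit 3.0 datasheet (the exact rating depends on wire gauge and derating):

    ```python
    # Back-of-the-envelope capacity of the 12-pin Micro-Fit 3.0 connector.
    # Assumptions (not NVIDIA's published spec): 6 of the 12 contacts carry +12V
    # (the rest are returns), and each contact is good for roughly 8.5 A per the
    # Micro-Fit 3.0 datasheet; the real figure depends on wire gauge and derating.
    SUPPLY_VOLTAGE_V = 12.0
    CURRENT_PER_CONTACT_A = 8.5
    POWER_CONTACTS = 6

    max_power_w = SUPPLY_VOLTAGE_V * CURRENT_PER_CONTACT_A * POWER_CONTACTS
    print(f"Theoretical connector capacity: {max_power_w:.0f} W")  # ~612 W
    ```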
     
    Last edited: Aug 27, 2020
    sykozis, rdmetz and Astyanax like this.
  8. geogan

    geogan Master Guru

    Messages:
    821
    Likes Received:
    159
    GPU:
    3070 AORUS Master
It seems it is even better because, instead of having two cables plugged in at the front end of the card in the case like all high-end cards now, this connector stands at 45 degrees facing backwards behind the card. So no horrible power cables on show in front of your GPU for your nice glass case build.
     
  9. icedman

    icedman Maha Guru

    Messages:
    1,083
    Likes Received:
    128
    GPU:
    MSI Duke GTX 1080
I'm all for this new connector, but only if it becomes a new open standard and not some special one-off or proprietary connector. If that's the case, then Nvidia can F$%! off with this.
     
    Loophole35 likes this.
  10. Elder III

    Elder III Ancient Guru

    Messages:
    3,724
    Likes Received:
    327
    GPU:
    6900 XT Nitro+ 16GB
If the connector is rated for 600W, but the two 8-pin connectors going into the cable adapter are only rated for 150W each... wouldn't it still be out of spec if the GPU pulled more than 300 watts through the new 12-pin connector? I know 8-pin PCIe can actually handle more than it's rated for and be "safe"... but it seems like negative publicity at best if reviewers show the GPU pulling more power than the 8-pin connectors are rated for (rough numbers sketched below).

    If down the road this becomes standard and we have a dedicated cable direct from the PSU to the 12-pin connector on the GPU, then that sounds pretty good to me. Until then though, I am still scratching my head.

    I hope @Hilbert Hagedoorn will deep dive into this when he does the first set of 3000 series GPU reviews (next month?). :)
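    A minimal sketch of that adapter budget, assuming the PCIe-spec limits of 150W per 8-pin and 75W from the slot rather than the higher physical capability of the connectors:

    ```python
    # Power budget through a hypothetical 2x 8-pin -> 12-pin adapter, using the
    # PCIe specification limits rather than what the connectors can physically carry.
    PCIE_8PIN_SPEC_W = 150  # per 8-pin connector, per PCIe spec
    PCIE_SLOT_SPEC_W = 75   # delivered through the motherboard slot

    adapter_budget_w = 2 * PCIE_8PIN_SPEC_W                     # 300 W through the 12-pin
    total_board_budget_w = adapter_budget_w + PCIE_SLOT_SPEC_W  # 375 W overall

    print(f"In-spec power through the adapter: {adapter_budget_w} W")
    print(f"In-spec total board power:         {total_board_budget_w} W")
    # Anything above 300 W drawn through the 12-pin would exceed the 8-pin *spec*
    # on the PSU side, even if the physical wiring and contacts can handle it.
    ```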
     

  11. semantics

    semantics Member

    Messages:
    39
    Likes Received:
    4
    GPU:
    N/A
600W is likely not what Nvidia's spec is.

    It really just depends on the power supply. Many of the premium ones already have thick enough wires and are designed well enough that pulling more than 150W on an 8-pin is fine. Many of them do this so that two 8-pins can come off the same wire and daisy-chain back to the power supply. This isn't ideal, because current divides in inverse proportion to resistance, so you might get a slight issue with that; it matters more for overclocking, where you'd want rock-stable voltage. So for an adapter you would just take two 8-pins that come separately from the PSU, and you'd more than likely be safe even on a cheaper PSU.

    The current PCIe 8-pin connector is a Mini-Fit Jr connector, which is rated at 13A per contact; we don't use anywhere near 13A in the specification. If you keep the current pinout of the PCIe 8-pin, you get a max of 468W, way more than 150W. Even if you derate it for continuous load and multiple current-carrying contacts to 10A, it's 360W, again well above the 150W we use.

    This 12-pin connector is rated 8A UL and 5A IEC; for a 12-circuit housing it's derated to 5A, so let's go with 5A. 5A x 12V x 6 = 360W, which would be well within the 8-pin connectors' ability to carry, assuming the wires and PSU behind it can carry it. The connector could do more, but I don't think that's what Nvidia's spec is. Although I haven't seen what their spec actually is, and can't really find anything on it beyond it using six 12V contacts. I'm not even sure that's right.
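    A quick worked version of those numbers, taking the 13A/10A Mini-Fit Jr and the 5A derated Micro-Fit figures quoted above and assuming three current-carrying 12V contacts on the 8-pin and six on the 12-pin:

    ```python
    # Worked numbers from the contact ratings quoted above. Assumes the usual
    # three current-carrying +12V contacts on a PCIe 8-pin and six on the 12-pin.
    V = 12.0

    eight_pin_max_w = 3 * 13 * V      # Mini-Fit Jr at 13 A/contact -> 468 W
    eight_pin_derated_w = 3 * 10 * V  # derated to 10 A continuous     -> 360 W
    twelve_pin_derated_w = 6 * 5 * V  # Micro-Fit derated to 5 A       -> 360 W

    print(f"8-pin at 13 A per contact: {eight_pin_max_w:.0f} W")
    print(f"8-pin derated to 10 A:     {eight_pin_derated_w:.0f} W")
    print(f"12-pin derated to 5 A:     {twelve_pin_derated_w:.0f} W")
    # All comfortably above the 150 W the PCIe spec actually assigns to an 8-pin.
    ```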
     
    Last edited: Aug 27, 2020
    rdmetz and alanm like this.
  12. Segamon

    Segamon Member Guru

    Messages:
    130
    Likes Received:
    17
    GPU:
    N/A
    I guess this leak is fake then :D


[IMG]
     
  13. itpro

    itpro Master Guru

    Messages:
    958
    Likes Received:
    514
    GPU:
    Radeon Technologies
That would be serious power waiting there. :D It would suit a hypothetical 3095 dual-Ampere card at only $5k.
     
  14. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    12,656
    Likes Received:
    647
    GPU:
    MSI 2070S X-Trio
These are only on the reference cards (the Flounders, as far as I know), so it won't bother me, as I always go custom.
     
  15. Ssateneth

    Ssateneth Active Member

    Messages:
    92
    Likes Received:
    16
    GPU:
    EVGA GeForce GTX 980
Y'all need to keep in mind that PCIe power plugs only draw power from two of the +12V pins; the third one is a sense pin. The +2 is similar, where one of the ground pins is a sense pin and does not deliver any current.

    The 600W spec is fine because it won't need any sense pins; all 12 pins can carry current.
     

  16. Great story @Hilbert Hagedoorn

I'd been thinking the same thing. Here's my stance: unless there's a feature set or toolkit that, as a developer, you absolutely need NVIDIA for and will be adopting Ampere anyway, I'll hold off another few years. Chances are AMD won't be adopting this, and while it's a "bet", my "bet" is this won't become an industry standard across AMD and Intel too, just NVIDIA. If you have a known-good power supply already, why change more parts?
     
    Last edited by a moderator: Aug 28, 2020
  17. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,084
    Likes Received:
    255
    GPU:
    RTX3090 GB GamingOC
It's just a single 375W socket, which should be more than enough even for a 3090. If the 3080 uses this socket too, then that's bad, I feel, as a single 8-pin + the PCIe slot should be enough for most cards at 225W. But even my GTX 980 G1 has two 8-pins, which is 375W delivered, while only about 180W is used for the 980.

    I hope the 3080 is less than 200W, or I might have to go lower when I really wanted a 3080. Even a 3070 would be a massive upgrade for me. For some reason I thought Ampere was gonna be less power hungry than the RTX 20xx series, but I'm now seriously doubting that's the case.
     
  18. DannyD

    DannyD Ancient Guru

    Messages:
    2,039
    Likes Received:
    1,535
    GPU:
    MSI 2080ti
I've a 650W Platinum PSU and am looking towards the 3080; I'm sure I'll have no need to OC the thing for a while, and also...
    These new cards may not be as power friendly as previous generations, but they are gonna smoke the boots off the previous ones, especially the 20 series.
    I tested a 2070S for 10 days and tbh wasn't too impressed; chatter on the web is saying the 3070 will be in reach of a 2080 Ti when OCed.
    I wish the 3070 was coming out at the same time as the 3090/3080; maybe I won't have enough saved and will have to wait for the 3070?
    3080 or 3070, I'd be very happy with either; the thought of 4.3k cores on the 3080 is a little scary to me :D
     
    Last edited: Aug 28, 2020
  19. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,084
    Likes Received:
    255
    GPU:
    RTX3090 GB GamingOC
Well, if the rumors are true and the 3070 still only has 8GB of VRAM, then that's pretty disappointing really. Graphics cards from about 2014 had 8GB of VRAM.
     
  20. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,188
    Likes Received:
    191
    GPU:
    EVGA GTX 1080@2,025
I was thinking the same thing, but while it's an improvement, you still have an ATX12VO 10-pin connector and a 6-pin ATX BRD PWR2 connector that are fairly bulky due to the amount of current they handle. The two plugs, with a small gap between them, take up about as much space as a 24-pin connector. Then you've got more plugs coming off the motherboard for any devices like SATA drives that require less than 12V. Then there is all the real estate taken up by the voltage regulators, which is a lot. Linus got his hands on one recently and posted a video. My concern is all the extra heat, since most of the voltage conversion is done on the motherboard, not the PSU. My other concern is having a potentially large source of EMF right smack on your motherboard.
    One big advantage of the ATX12VO standard is efficiency at idle and low use levels, if you're into that sort of thing.
     
