Nvidia GTX 1080 Ti vs Titan X PCB Explored

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 6, 2017.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    39,454
    Likes Received:
    8,102
    GPU:
    AMD | NVIDIA
    We all know that aside from the memory configuration and a GPU rebrand, there will be little difference between the Titan X Pascal and the upcoming GeForce GTX 1080 Ti. Nvidia did, however, claim a few chang...

    Nvidia GTX 1080 Ti vs Titan X PCB Explored
     
  2. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,759
    Likes Received:
    1,119
    GPU:
    EVGA 1080ti SC
    Interesting. Think it will affect overclocking any?
     
  3. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,362
    Likes Received:
    904
    GPU:
    1080Ti H20
    Nope.

    Reference cards overclock about the same as even the best boards.

    The limitation these days is the silicon lottery.
     
  4. GALTARAUJO

    GALTARAUJO Active Member

    Messages:
    54
    Likes Received:
    0
    GPU:
    2 x GTX980 Strix
    "Board partners will likely add DVI."

    Please, don't.
     

  5. C-Power

    C-Power Member Guru

    Messages:
    114
    Likes Received:
    17
    GPU:
    Msi 2060 Gaming-Z
    I think for the first time I might actually go with a reference/FE card.

    If the OC potential is the same as on the non-Ti 1080, which was pretty much on par with the aftermarket cards, then I'd be happy with a stock cooler that exhausts heat out of the case.

    I've personally found my G1 quite loud under full load as well, and the stock cooler is actually quieter... win/win

    Edit:
    The 1080 Ti PCB looks better than the Titan one.. Interesting, so apart from 1 GB less memory and a slightly narrower memory bus, you can get a better card for half the price.. :banana: Glad I waited a bit and didn't jump the gun on the 1080 :D
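    The "slightly narrower bus" barely matters in practice, by the way. A back-of-the-envelope sketch (assuming the published specs: Titan X Pascal with a 384-bit bus at 10 Gbps GDDR5X, 1080 Ti with a 352-bit bus at 11 Gbps GDDR5X) shows the faster memory more than makes up for the missing 32 bits:

```python
# Peak memory bandwidth: bus width in bytes * per-pin data rate.
# Specs assumed from the launch announcements, not measured.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Theoretical peak bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

txp = bandwidth_gbs(384, 10)  # Titan X Pascal: 480.0 GB/s
ti = bandwidth_gbs(352, 11)   # GTX 1080 Ti:    484.0 GB/s
print(f"TXP: {txp} GB/s, 1080 Ti: {ti} GB/s")
```

    So the cut-down card actually ends up with marginally more bandwidth on paper.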
     
  6. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,759
    Likes Received:
    1,119
    GPU:
    EVGA 1080ti SC
    8 fewer ROPs as well. The TXP may end up being better at 4K at the same clocks.
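    To put a rough number on that: peak pixel fillrate scales with ROP count times clock, so at equal clocks the gap is fixed by the ROP counts alone (96 vs 88 per the published specs; the ~1.58 GHz boost clock below is just an illustrative assumption):

```python
# Peak pixel fillrate: ROPs * clock. The 1.58 GHz figure is an
# assumed boost clock for illustration, not a measured value.
def pixel_fillrate_gpix(rops, clock_ghz):
    """Theoretical peak fillrate in Gpixels/s."""
    return rops * clock_ghz

txp = pixel_fillrate_gpix(96, 1.58)  # ~151.7 Gpix/s
ti = pixel_fillrate_gpix(88, 1.58)   # ~139.0 Gpix/s
deficit_pct = 100 * (1 - ti / txp)   # clock cancels out: 8/96
print(f"1080 Ti fillrate deficit at equal clocks: {deficit_pct:.1f}%")
```

    That works out to about an 8% fillrate deficit, which is where a few percent at 4K could come from.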
     
  7. C-Power

    C-Power Member Guru

    Messages:
    114
    Likes Received:
    17
    GPU:
    Msi 2060 Gaming-Z
    Hmm, true, but we're talking a couple of percent here and there, at half the price :nerd:
     
  8. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,029
    Likes Received:
    1,861
    GPU:
    Zotac GTX980Ti OC
    Some custom cards may be louder, but also run a lot cooler.

    Stock coolers seem to aim for the 80°C mark; imo that's too much - 75°C max.
     
  9. Denial

    Denial Ancient Guru

    Messages:
    13,147
    Likes Received:
    2,636
    GPU:
    EVGA RTX 3080

    Obviously ambient temperature, and it most likely being an open test bed, affect the performance. But it does seem like the stock cooling solution was improved. Either that, or removing the DVI port did far more for temps than I would expect.

    Edit: By comparison, the 180 W GTX 1080 FE loads at ~80°C.
     
    Last edited: Mar 6, 2017
  10. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,759
    Likes Received:
    1,119
    GPU:
    EVGA 1080ti SC
    Very true. No doubt the TXP is obsolete with the 1080 Ti coming. Again, don't be surprised if we see a TXP Black or Titan Ultra with 3840 shader cores, 96 ROPs, 12 GB of 11 Gbps GDDR5X, and a $1200 MSRP.
     

  11. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,029
    Likes Received:
    1,861
    GPU:
    Zotac GTX980Ti OC
     
  12. C-Power

    C-Power Member Guru

    Messages:
    114
    Likes Received:
    17
    GPU:
    Msi 2060 Gaming-Z
    Personally I don't mind that much - if the card is actually made to run at that temp (and it can do it for a couple of years), I honestly don't care.

    Especially with fans that exhaust hot air out the back and not into the case :)
     
  13. Cartman372

    Cartman372 Maha Guru

    Messages:
    1,469
    Likes Received:
    0
    GPU:
    980 FTW
    No DVI also means single slot water cooling, just like the Fury X.
     
  14. tsunami231

    tsunami231 Ancient Guru

    Messages:
    10,937
    Likes Received:
    672
    GPU:
    EVGA 1070Ti Black
    I still use DVI
     
  15. ThundercatMan

    ThundercatMan Member

    Messages:
    15
    Likes Received:
    0
    GPU:
    Msi gaming 980ti
    Definitely going for the FE for the first time ever. I built my first water-cooling loop a few weeks ago with my older kit and my 980 Ti, so I'm confident replacing water blocks. If I don't buy this generation's Ti, it will definitely be the next one, but 100% FE ;)
     

  16. Ricepudding

    Ricepudding Master Guru

    Messages:
    718
    Likes Received:
    204
    GPU:
    1080ti MSI Light
    Can we finally let DVI die off now? At least with these top-end cards - almost no new monitors even have this old connection. Like the cards that kept VGA for way longer than they should have :')

    But seriously, let DVI die. As much as I loved it back in the day, those cables were massive compared to DP or HDMI.
     
  17. angelgraves13

    angelgraves13 Ancient Guru

    Messages:
    2,181
    Likes Received:
    636
    GPU:
    RTX 2080 Ti FE
    There's an adapter for DVI users, so yes we can.
     
  18. icedman

    icedman Maha Guru

    Messages:
    1,052
    Likes Received:
    115
    GPU:
    MSI Duke GTX 1080
    I'm with everyone else on letting DVI die, but wow - when this was the Fury X, everyone went nuts; now Nvidia does it and it's the smartest thing ever.
     
  19. Ricepudding

    Ricepudding Master Guru

    Messages:
    718
    Likes Received:
    204
    GPU:
    1080ti MSI Light
    Well, the Fury (X) did come out in June 2015, coming up on two years ago... though it shouldn't have been on that card either. I can understand DVI on the lower-end models, but not on the top end. I'd rather the space be left for extra venting or another DP/HDMI port.
     
  20. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,759
    Likes Received:
    1,119
    GPU:
    EVGA 1080ti SC
    The thing with the Fury X was that not only did it lack DVI, it also lacked HDMI 2.0.
     
