Nvidia GTX 1080 Ti vs Titan X PCB Explored

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 6, 2017.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,535
    Likes Received:
    18,841
    GPU:
    AMD | NVIDIA
     We all know that aside from the memory configuration and a GPU rebrand, there is little difference between the Titan X Pascal and the upcoming GeForce GTX 1080 Ti. Nvidia did, however, claim a few chang...

    Nvidia GTX 1080 Ti vs Titan X PCB Explored
     
  2. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
     Interesting. Think it will affect overclocking any?
     
  3. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
    Nope.

    Reference cards overclock the same as even the best boards.

     The limitation these days is the silicon lottery.
     
  4. GALTARAUJO

    GALTARAUJO Active Member

    Messages:
    55
    Likes Received:
    1
    GPU:
    2 x GTX980 Strix
    "Board partners will likely added DVI.

    Please, don't
     

  5. C-Power

    C-Power Member Guru

    Messages:
    128
    Likes Received:
    34
    GPU:
    Inno3d 4090 White
    I think for the first time I might actually go with a reference/FE card.

     If the OC potential is pretty much the same as the aftermarket cards, as it was with the non-Ti 1080, then I'd be happy with a stock cooler that exhausts heat out of the case.

     I personally have found my G1 quite loud under full stress as well, and the stock cooler is actually quieter... win/win

     Edit:
     The 1080 Ti PCB looks better than the Titan one.. Interesting, so apart from 1GB less and a slightly gimped memory bus, you can get a better card for half the price.. :banana: Glad I waited a bit and didn't jump the gun on the 1080 :D
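     For anyone who wants rough numbers behind that "slightly gimped memory bus": a quick back-of-the-envelope sketch below, using the publicly listed memory specs (384-bit at 10 Gbps for the Titan X Pascal, 352-bit at 11 Gbps for the 1080 Ti), shows the narrower bus is roughly offset by the faster GDDR5X.

     ```python
     # Rough memory comparison based on the published spec sheets.
     # bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (Gbps per pin)
     cards = {
         "Titan X (Pascal)": {"bus_bits": 384, "gbps": 10.0, "vram_gb": 12},
         "GTX 1080 Ti":      {"bus_bits": 352, "gbps": 11.0, "vram_gb": 11},
     }

     for name, spec in cards.items():
         bandwidth = spec["bus_bits"] / 8 * spec["gbps"]  # GB/s
         print(f'{name}: {spec["vram_gb"]} GB VRAM, {bandwidth:.0f} GB/s')
     ```

     That works out to roughly 480 GB/s for the Titan X and 484 GB/s for the 1080 Ti, so the cut-down 352-bit bus actually ends up with a touch more raw bandwidth thanks to the faster 11 Gbps memory.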
     
  6. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
     8 fewer ROPs as well. TXP may end up being better at 4K at the same clocks.
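     To put a rough number on that: a quick sketch, assuming the published ROP counts (96 vs 88) and, as the post says, identical clocks.

     ```python
     # Peak pixel fill rate scales with ROPs * clock, so at equal clocks the
     # ratio reduces to the ROP counts alone (published: 96 vs 88).
     titan_x_rops = 96
     gtx_1080_ti_rops = 88

     advantage = titan_x_rops / gtx_1080_ti_rops - 1
     print(f"Titan X Pascal peak fill-rate advantage at equal clocks: {advantage:.1%}")
     ```

     That is about a 9% edge in peak pixel throughput, which matters most at 4K, and only when the workload is actually ROP-bound rather than shader- or bandwidth-bound.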
     
  7. C-Power

    C-Power Member Guru

    Messages:
    128
    Likes Received:
    34
    GPU:
    Inno3d 4090 White
    Hmm true, but we're talking a couple % here and there at half the price :nerd:
     
  8. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
     Some customs may be louder, but they also run a lot cooler.

     Stock seems to aim for the 80C mark, imo too much - 75C max.
     
  9. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
     [IMG]

     Obviously the ambient temperature, and it most likely being an open test bed, affect the performance. But it does seem like the stock cooling solution was improved. Either that, or removing the DVI port did far more for temps than I would expect.

     Edit: By comparison, the 180W GTX 1080 FE loads at ~80C.
     
    Last edited: Mar 6, 2017
  10. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
     Very true. No doubt the TXP is obsolete with the 1080 Ti coming. Again, don't be surprised if we see a TXP Black or Titan Ultra with 3840 shader cores, 96 ROPs, 12GB of 11GHz GDDR5X and a $1,200 MSRP.
     

  11. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
     
  12. C-Power

    C-Power Member Guru

    Messages:
    128
    Likes Received:
    34
    GPU:
    Inno3d 4090 White
    Personally I don't mind that much - if the card is actually made to run at that temp (and it can do it for a couple years) I honestly don't care.

    Especially with the fans that exhaust hot air at the back and not into the case :)
     
  13. Cartman372

    Cartman372 Maha Guru

    Messages:
    1,469
    Likes Received:
    0
    GPU:
    EVGA 1660Ti
    No DVI also means single slot water cooling, just like the Fury X.
     
  14. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,750
    Likes Received:
    1,868
    GPU:
    EVGA 1070Ti Black
    I still use DVI
     
  15. ThundercatMan

    ThundercatMan Guest

    Messages:
    15
    Likes Received:
    0
    GPU:
    Msi gaming 980ti
     Definitely going for the FE for the first time ever. I did my first water cooling loop a few weeks ago with my older kit and my 980 Ti, so I'm confident in swapping water blocks. If I don't buy this generation's Ti it will definitely be the next one, but 100% FE ;)
     

  16. Ricepudding

    Ricepudding Master Guru

    Messages:
    872
    Likes Received:
    279
    GPU:
    RTX 4090
     Can we finally let DVI die off now? At least with these top-end cards; almost no new monitors even have this old connection. It's like the cards that kept VGA for way longer than they should have :')

     But seriously, let DVI die. As much as I loved it back in the day, those cables were massive in comparison to DP or HDMI.
     
  17. icedman

    icedman Maha Guru

    Messages:
    1,300
    Likes Received:
    269
    GPU:
    MSI MECH RX 6750XT
     I'm with everyone else on letting DVI die, but wow - when the Fury X did this everyone went nuts, and now Nvidia does it and it's the smartest thing ever.
     
  18. Ricepudding

    Ricepudding Master Guru

    Messages:
    872
    Likes Received:
    279
    GPU:
    RTX 4090
     Well, the Fury (X) did come out in June 2015, coming up on two years ago... though DVI shouldn't have been on that card either. I can understand DVI on the lower-end models, but not on the top end. I'd rather the space be left for extra venting or another DP/HDMI port.
     
  19. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
     The thing with the Fury X was that not only did it lack DVI, it also lacked HDMI 2.0.
     
  20. DeskStar

    DeskStar Guest

    Messages:
    1,307
    Likes Received:
    229
    GPU:
    EVGA 3080Ti/3090FTW
     It will be there in lieu of a "dongle" for people who need access to a DVI connection.
     
