The GTX 1080-Ti Thread

Discussion in 'Videocards - NVIDIA GeForce' started by XenthorX, Sep 18, 2016.

Thread Status:
Not open for further replies.
  1. Paulo Narciso

    Paulo Narciso Guest

    Messages:
    1,226
    Likes Received:
    36
    GPU:
    ASUS Strix GTX 1080 Ti

    Adaptive sync is a nice technology, but Nvidia charges 200 bucks more for the same monitor compared to FreeSync, which is basically the same thing without the need for the overpriced hardware.
    It's time for that BS to stop and for Nvidia to adopt an open standard.
     
  2. StarvinMarvinDK

    StarvinMarvinDK Maha Guru

    Messages:
    1,374
    Likes Received:
    119
    GPU:
    Inno3D 4070Ti 12GB
    But until then - G-SYNC baby! :D
     
  3. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
    G-Sync is arguably superior to FreeSync in many ways.
    Regardless of whether or not one believes it's worth the premium (you can find a ton of G-Sync monitors on sale for 200 below MSRP), the chance of FreeSync support coming anytime soon is close to none.
     
  4. Emille

    Emille Guest

    Messages:
    785
    Likes Received:
    27
    GPU:
    1080 Ti Aorus Extreme
    How do you unlock core voltage control on a 1080 Ti Aorus Extreme?

    I'm using the 378.92 driver for HDR compatibility reasons. Do I have to use a newer driver, or is the fix unrelated to driver version? I read a review of my card, and with the same clocks and increased voltage they got a higher sustained boost clock.
     

  5. PapaJohn

    PapaJohn Master Guru

    Messages:
    410
    Likes Received:
    141
    GPU:
    Asus 6700XT OC 12Gb
    Which G-SYNC monitor did you get, Darren? I'm currently saving up to go balls-out on a new beast rig, and a top-of-the-line card plus a good-quality monitor is a certainty.
     
  6. Emille

    Emille Guest

    Messages:
    785
    Likes Received:
    27
    GPU:
    1080 Ti Aorus Extreme
    I never really screwed around with my card... I always OC to the max, but with this card I wanted a really high factory-OC card so I didn't have to worry much about overclocking it. After trying to see if I could squeeze more out of it, I saw that basically every core overclock yielded a higher boost clock but lower scores in the Last Light Redux benchmark and also in Heaven 4.0.

    As much as a higher boost clock looks good... it doesn't always correlate to better performance.
     
  7. Emille

    Emille Guest

    Messages:
    785
    Likes Received:
    27
    GPU:
    1080 Ti Aorus Extreme
    *Fingers crossed*

    Well, after finding my absolute core limit, where the display shut off at +55 core over my card's default 1607 MHz, I tried +45 and +50 core with +300 memory... not sure of the card's default memory frequency. Then I tried +45 and +50 with +450 MHz memory. It passed the Heaven benchmark, and then I went back to playing Mass Effect Andromeda, which I had been playing with +30 core and +300 memory for many hours, and I am now seeing that magic 60 fps number and seeing it stay there maybe half the time, instead of the 50-58 fps variation I was seeing before.

    I think I might have gained 4-5 fps on average with just +15 more core and +150 MHz more memory, which seems crazy.
     
  8. 5thElement

    5thElement Guest

    Messages:
    2
    Likes Received:
    0
    GPU:
    16gb
    Still NO full support for the 1080 Ti in Afterburner 4.3. Screw MSI! The voltage slider is STILL disabled. Finally got my hands on 4.4 Beta 10 and am sharing it for everyone. Screw you, MSI!

    https://drive.google.com/file/d/0B_zD3Swh4bi2WjBaUlRBcFFWZkU/view?usp=sharing
     
  9. Icanium

    Icanium Ancient Guru

    Messages:
    1,624
    Likes Received:
    136
    GPU:
    ASUS 4090TUF
  10. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,730
    Likes Received:
    2,699
    GPU:
    Aorus 3090 Xtreme
    Also, you can enable it in the non-beta:
    http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti

    You could always write your own util if none suit you.
    After all, you must be superior.
    Which makes me wonder why you can't use Google.
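    If anyone genuinely wants to roll their own util, here is a minimal monitoring sketch, assuming the Python NVML bindings (the nvidia-ml-py / pynvml package) are installed; as far as I know NVML does not expose core voltage on GeForce cards, so this only logs clocks, temperature and board power.

    # Minimal GPU logger sketch using the NVML Python bindings (pip install nvidia-ml-py).
    # Assumes a single NVIDIA GPU at index 0. NVML does not expose core voltage on GeForce
    # cards, so this only reports clocks, temperature and board power draw.
    import time
    import pynvml

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

    try:
        while True:
            core = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)    # MHz
            mem = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_MEM)          # MHz
            temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU) # deg C
            watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0                     # NVML reports mW
            print(f"core {core} MHz | mem {mem} MHz | {temp} C | {watts:.0f} W")
            time.sleep(1.0)
    except KeyboardInterrupt:
        pass
    finally:
        pynvml.nvmlShutdown()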
     

  11. Terepin

    Terepin Guest

    Messages:
    873
    Likes Received:
    129
    GPU:
    ASUS RTX 4070 Ti
    Oh no! You discovered a secret version that no one was supposed to know about!

    In other news, you're an idiot.
     
  12. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,221
    Likes Received:
    1,540
    GPU:
    NVIDIA RTX 4080 FE
    Sorry for the late reply: I bought an ASUS ROG Swift PG279Q and I've been absolutely loving using it for the past two weeks. G-SYNC completely transforms the gaming experience, such that I wonder how I did without it for so long. Seeing how much headroom my GTX 1080 Ti has over 60 Hz/60 fps is quite eye-opening, even in games that run at 3840x2160 via DSR; e.g. The Witcher 3 at 4K with EVERYTHING maxed, including HairWorks, runs at 50+ fps, but those dips do not feel stuttery or jerky or anything other than silky-smooth, even during heavy combat with lots of enemies. Really, really impressive stuff IMO.
     
  13. PapaJohn

    PapaJohn Master Guru

    Messages:
    410
    Likes Received:
    141
    GPU:
    Asus 6700XT OC 12Gb
    Thanks for the reply, sounds fantastic. Did you buy the monitor locally or online?
     
  14. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,221
    Likes Received:
    1,540
    GPU:
    NVIDIA RTX 4080 FE
    Online, from Amazon here in the U.K.
     
  15. eGGroLLiO

    eGGroLLiO Master Guru

    Messages:
    241
    Likes Received:
    108
    GPU:
    EVGA 3080ti FTW3
    Hey Darren,
    Sort of a sidejack, but would you mind listing your whole technical setup, as far as the Nvidia Control Panel goes, and what software you are using for your gaming setup? I've been tinkering, and at the moment I am using the 382.53 driver. I've set the power management mode globally to Adaptive and then, in each individual game profile, I set it to maximum performance. I'm using RTSS to limit frames to 140 fps. Yes, I have a 144 Hz G-SYNC Acer. On my system that results in a 215 MHz desktop clock and still varying GPU clocks in each game (even though I have maximum selected per profile). I'm not sure why I can't get to 144 MHz on the desktop, but like I said, Nvidia drivers are a mess IMHO.

    I just feel like it's a terribly complicated configuration to make it all work. I've read your posts, but I'd like to know your exact approach and software settings. Also, do you feel like your hardware is running optimally? Thanks in advance.
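    In case it helps with comparing notes, here is a rough sanity-check sketch, assuming the pynvml Python bindings are installed (my assumption, not part of anyone's setup here): it polls the performance state (P0 = full 3D clocks, P8 = idle) and the graphics clock, so you can see on the desktop versus in a game whether the per-profile power setting is actually holding the clocks up.

    # Rough sanity check using the NVML Python bindings (pip install nvidia-ml-py):
    # sample the performance state (P0 = full 3D clocks, P8 = idle), graphics clock and
    # GPU load once a second, e.g. once on the desktop and once while a game is running.
    import time
    import pynvml

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

    for _ in range(30):  # sample for roughly 30 seconds
        pstate = pynvml.nvmlDeviceGetPerformanceState(gpu)
        clock = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        load = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu
        print(f"P{pstate} | {clock} MHz | {load}% GPU load")
        time.sleep(1.0)

    pynvml.nvmlShutdown()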
     

  16. Emille

    Emille Guest

    Messages:
    785
    Likes Received:
    27
    GPU:
    1080 Ti Aorus Extreme
    Well, I installed the latest MSI Afterburner, and higher voltage didn't seem to help anything; the forum at overclockers.net seemed to be divided as to whether the higher clocks that the increased voltage allows actually improve performance. The minimum boost clock increased from 2025 to 2063 MHz, but instability began, and it is never worth it for unstable clocks.

    In my experience, going from +50 core to +55 core decreased performance in both the Heaven and Metro benchmarks.

    So I decreased core and memory from +45 core and +450 memory for stability's sake and reduced the core voltage to +0 mV.

    I had a random crash in Mass Effect Andromeda after gaming for maybe 5 hours, which made me question stability, and then the Metro benchmark crashed. Fingers crossed that it stays stable, which it seems to be after reducing clocks and voltage.
     
    Last edited: Jun 18, 2017
  17. pstlouis

    pstlouis Master Guru

    Messages:
    404
    Likes Received:
    3
    GPU:
    RTX 2080Ti Aorus Xt
    Just bought an EVGA GTX 1080 Ti SC2. Do any of you have that video card, and what is your evaluation?
     
  18. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,221
    Likes Received:
    1,540
    GPU:
    NVIDIA RTX 4080 FE
    Over the last few days I have discovered one of the perils of using G-SYNC during a spell of hot weather (it is currently 25-27 C where I am in the UK): my temperatures in games have soared from sub-75 C while gaming at 60 Hz/60 fps max to as high as 86 C while playing over the weekend at 165 Hz/100+ fps. While the temperature stabilised at 86 C with the core clock dropping to compensate, it was still alarming to see on an EVGA GTX 1080 Ti SC2, a third-party card with dual-fan cooling. Capping the refresh rate to 60 Hz with G-SYNC, even at 4K, brought temperatures down significantly though, so I expect this is completely normal, even though I have always considered my Cooler Master HAF 932 Advanced case to have decent cooling (the CPU never comes close to 60 C, for example, and the motherboard temperature remains sub-40 C).

    I'm wondering if I should stick with 60 Hz during hot spells rather than risk running my card at 86 C for long periods. I don't have much headroom with the GPU fans either, as they are already running at 75-85% (though noise isn't an issue).

    Is it safe to run the card at these temperatures for extended periods?
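    Not an answer on long-term safety, but if you want to confirm whether the card is actually pulling clocks back for thermal reasons rather than the power limit, here is a small sketch, assuming the pynvml Python bindings are installed (again my assumption, not part of your setup): it reads NVML's reported throttle reasons alongside the temperature.

    # Small sketch using the NVML Python bindings (pip install nvidia-ml-py):
    # read the temperature and the driver's reported clock throttle reasons, to tell
    # a thermal slowdown apart from an ordinary power-limit cap.
    import time
    import pynvml

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

    REASONS = {
        pynvml.nvmlClocksThrottleReasonSwPowerCap: "power limit",
        pynvml.nvmlClocksThrottleReasonSwThermalSlowdown: "thermal slowdown",
        pynvml.nvmlClocksThrottleReasonHwSlowdown: "hardware slowdown",
        pynvml.nvmlClocksThrottleReasonGpuIdle: "idle",
    }

    for _ in range(60):  # sample once a second for a minute while the game runs
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        mask = pynvml.nvmlDeviceGetCurrentClocksThrottleReasons(gpu)
        active = [name for bit, name in REASONS.items() if mask & bit] or ["none"]
        print(f"{temp} C | throttle: {', '.join(active)}")
        time.sleep(1.0)

    pynvml.nvmlShutdown()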
     
  19. slickric21

    slickric21 Guest

    Messages:
    2,458
    Likes Received:
    4
    GPU:
    eVGA 1080ti SC / Gsync
    Quite what 'hot weather', as you call 25-27 C (which would probably make a lot of users on an international forum laugh), has to do with G-Sync is beyond me.

    Also, I think the G-Sync discussion would perhaps be better suited to another thread, as it's not really anything specific to the 1080 Ti.
    G-Sync is awesome and it has always been awesome - long before the 1080 Ti came out.

    Regarding the temps, the EVGA 1080 Ti SC cooler is pretty poor IMO. I was pretty disappointed with mine; if you want decent cooling you need to ramp up the fan speed... which starts to resemble a jet engine at anything over 45%.
    Putting an AIO cooler on it is a must for quiet gaming and stable high overclocks.
     
  20. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,221
    Likes Received:
    1,540
    GPU:
    NVIDIA RTX 4080 FE
    Do I really have to explain? G-SYNC allows much higher framerates, up to 165 fps vs. 60 fps on my previous setup (higher framerates = more heat), which, combined with ambient temperatures that are 10 C warmer than this time last week, leads to much higher temperatures during gaming than I was expecting to see on a third-party dual-fan cooler.

    My post wasn't about discussing G-SYNC anyway; it was just an observation based on the timing of warmer weather and my decision to buy a monitor that would allow me to play games at over 60 fps without screen tearing after years of gaming at 60 Hz with v-sync.
     