The GTX 1080-Ti Thread

Discussion in 'Videocards - NVIDIA GeForce' started by XenthorX, Sep 18, 2016.

Thread Status:
Not open for further replies.
  1. endbase

    endbase Maha Guru

    Messages:
    1,073
    Likes Received:
    181
    GPU:
    TUF OC RTX 3080
Nah, from reading about it, it doesn't really give a performance hit, but correct me if I'm wrong
     
  2. slickric21

    slickric21 Ancient Guru

    Messages:
    2,458
    Likes Received:
    4
    GPU:
    eVGA 1080ti SC / Gsync
    Not really sure what you mean if I'm honest.

    I've got a 1080p monitor but run most if not all my games at 1440p, 1620p or even a few at 2160p.

Why wouldn't you, if you have performance to spare? The IQ benefits are tremendous.
     
  3. XenthorX

    XenthorX Ancient Guru

    Messages:
    4,225
    Likes Received:
    2,301
    GPU:
    3090 Gaming X Trio
    1080-Ti or Vega, my brain is split
     
  4. endbase

    endbase Maha Guru

    Messages:
    1,073
    Likes Received:
    181
    GPU:
    TUF OC RTX 3080
Isn't DSR about using higher resolutions?

Nice to hear. Maybe I should read some more about resolutions. I have a 144 Hz 1080p monitor and tried higher resolutions, but I have bad reading eyes at my age, so 1080p keeps the text readable for me <--- old man hehe

I tried 4K, but everything became too small for me on the desktop. I know games will be different, but hey, I'm quite happy as it is :)
     
    Last edited: Mar 6, 2017

  5. slickric21

    slickric21 Ancient Guru

    Messages:
    2,458
    Likes Received:
    4
    GPU:
    eVGA 1080ti SC / Gsync
You wouldn't want to run your desktop at a non-native resolution, as you'll find it blurry.

For games, enable DSR in the control panel, then enable 1.78x, 2.25x and 4.0x (or whatever factors you want); this makes 1440p, 1620p and 2160p resolutions available.
Set smoothness to about 10-15%.

Now, in games where you want better IQ, in terms of less aliasing, better texture filtering and a clearer, more stable picture, select the higher resolution.
Obviously, the higher the resolution, the better the IQ, but the greater the performance hit you'll see.
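
    If you're wondering where those odd resolutions come from: a DSR factor multiplies the total pixel count, so each axis scales by the square root of the factor. A quick sketch of the arithmetic in Python, my own illustration rather than anything from NVIDIA's tooling:

    ```python
    import math

    # A DSR factor multiplies the total pixel count, so each axis scales
    # by sqrt(factor). Note: the control panel's "1.78x" is really
    # (4/3)^2 = 1.777..., which is why it lands exactly on 2560x1440.
    def dsr_resolution(native_w, native_h, factor):
        scale = math.sqrt(factor)
        return round(native_w * scale), round(native_h * scale)

    for factor in (16 / 9, 2.25, 4.0):  # shown as 1.78x, 2.25x and 4.00x
        w, h = dsr_resolution(1920, 1080, factor)
        print(f"{factor:.2f}x -> {w}x{h}")

    # From a 1080p native resolution:
    # 1.78x -> 2560x1440 (1440p)
    # 2.25x -> 2880x1620 (1620p)
    # 4.00x -> 3840x2160 (2160p)
    ```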

    Can't wait to get my 1080 Ti so I can ramp up the resolution in newer games.
     
  6. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    13,546
    Likes Received:
    6,376
    GPU:
    2080Ti @h2o
    Let me help you mate: Own a Gsync monitor? Halves your options quickly :D
     
  7. HittriX

    HittriX Member Guru

    Messages:
    154
    Likes Received:
    0
    GPU:
    MSI 1080Ti Gaming X
    It sure does :D
     
  8. Hammie

    Hammie Banned

    Messages:
    703
    Likes Received:
    9
    GPU:
    760 SMG 28"
Nice, looks like it's about 20 percent faster than the GTX 1080.
     
  9. HittriX

    HittriX Member Guru

    Messages:
    154
    Likes Received:
    0
    GPU:
    MSI 1080Ti Gaming X
    Stop trolling bro Lol
     
  10. Turanis

    Turanis Ancient Guru

    Messages:
    1,780
    Likes Received:
    489
    GPU:
    Gigabyte RX500

  11. XenthorX

    XenthorX Ancient Guru

    Messages:
    4,225
    Likes Received:
    2,301
    GPU:
    3090 Gaming X Trio
lol, nah, I've got a 4K PB279Q now. I dropped G-Sync; it was really underwhelming for the excessive extra price.

Wow, the Physics score is like day and night oO
     
    Last edited: Mar 8, 2017
  12. Behelit

    Behelit Master Guru

    Messages:
    344
    Likes Received:
    2
    GPU:
    EVGA GTX 1080Ti SC2 Hydro
  13. kx11

    kx11 Ancient Guru

    Messages:
    3,786
    Likes Received:
    839
    GPU:
    RTX 3090

very funny benchmark


4-core CPU vs 8-core CPU :infinity::infinity:
     
  14. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    13,546
    Likes Received:
    6,376
    GPU:
    2080Ti @h2o
Well, once I had it I couldn't let go of Gsync anymore, but I'm also running a 144Hz 1440p display, so mine is easier to feed than your 4K screen. What do you do when you drop below 60fps in games at 4K? Just curious :)

This may sound stupid, but how close is that physics benchmark to how it actually works in games? I have no firm understanding of how those physics numbers compare to game performance (it's not like comparing fps here to fps in games).



Of course they do, because they're in the same price range. If the quad core were cheaper, it would probably still be the better buy. But yes, AMD, moar coares! :bugeye: :D
     
  15. isidore

    isidore Ancient Guru

    Messages:
    6,276
    Likes Received:
    58
    GPU:
    RTX 2080TI GamingOC
It's not about that. It's about the fact that it doesn't bottleneck the 1080 Ti at default clocks. Like all the ****ing reviews you saw of the GTX 1080 with the Ryzen CPUs.
     
    Last edited: Mar 9, 2017

  16. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    13,546
    Likes Received:
    6,376
    GPU:
    2080Ti @h2o
If anybody's interested (I've still got to check it out myself), an unboxing and a little talk about the PCB from der8auer:

    https://www.youtube.com/watch?v=rPV9HazW4aQ



I might have missed that; where was a review of Ryzen that showed a 1080 Ti bottlenecked (even though the card wasn't released...)? Sounds like you're referring to the issues with 1080p performance and whatever reason they found for it. I personally haven't seen a REVIEW of the 1080 Ti with a Ryzen CPU running it. Could you link me to one that isn't a leak, please? I can't imagine Ryzen not being fast enough to feed a single Titan / 1080 Ti.
     
    Last edited: Mar 9, 2017
  17. isidore

    isidore Ancient Guru

    Messages:
    6,276
    Likes Received:
    58
    GPU:
    RTX 2080TI GamingOC
I wasn't referring specifically to the 1080 Ti. Sorry I didn't specify; I was referring to the GTX 1080. Seeing that the GTX 1080 Ti will be faster than... you get it. Anyway, I'm really glad it's not the case. :)
     
  18. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    13,546
    Likes Received:
    6,376
    GPU:
    2080Ti @h2o
Ah yes, I understand. Still, I don't think you can't game on a Ryzen rig :D
     
  19. XenthorX

    XenthorX Ancient Guru

    Messages:
    4,225
    Likes Received:
    2,301
    GPU:
    3090 Gaming X Trio

For the framerate, I'm clearly lowering the graphics. I don't mind playing some games in the 40s range; otherwise I drop a couple of settings here and there, or play around with 'resolution scaling' too.

As for the physics benchmark, it's not a direct indication of in-game performance gains, at least not yet. Game engines were heavily optimized for 4 cores, but those benchmarks clearly show there's headroom left on AMD's side, which is a great sign (see the little back-of-the-envelope sketch at the end of this post)!

It's natural for software to lag a bit behind on pieces of technology as complex as game engines. It'll eventually pay off now that more cores are going mainstream; it should already pay off in scalable game engines like Frostbite.

And of course it also greatly depends on the type of game. Open-world games are clearly where the CPU headroom will be relevant: NPC handling, lots of physics, etc., all of which runs on the CPU.
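
    To put that headroom idea into rough numbers, here's a toy Amdahl's-law estimate (my own back-of-the-envelope Python, not taken from any benchmark): if an engine only parallelizes a fraction p of its per-frame work, going from 4 to 8 cores barely helps until p gets high, which is exactly why the better-threaded engines should show the gains first.

    ```python
    # Toy Amdahl's-law estimate: theoretical speedup on n cores when a
    # fraction p of the frame work is parallelizable (the rest stays
    # single-threaded).
    def speedup(p, n):
        return 1 / ((1 - p) + p / n)

    for p in (0.50, 0.75, 0.95):
        print(f"p={p:.2f}: 4 cores -> {speedup(p, 4):.2f}x, "
              f"8 cores -> {speedup(p, 8):.2f}x")

    # p=0.50: 4 cores -> 1.60x, 8 cores -> 1.78x  (4 -> 8 gains almost nothing)
    # p=0.75: 4 cores -> 2.29x, 8 cores -> 2.91x
    # p=0.95: 4 cores -> 3.48x, 8 cores -> 5.93x  (extra cores finally pay off)
    ```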
     
    Last edited: Mar 9, 2017
  20. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    13,546
    Likes Received:
    6,376
    GPU:
    2080Ti @h2o
How does it work for you, having 40fps on a 4K display without Gsync? Shouldn't that bring up stuttering or tearing? Or am I so used to Gsync already that I've forgotten how it worked? (Honestly, I kind of forgot! :bang: )

Well, to be fair, don't automatically say everything above 4 cores is AMD now... I'm kind of fed up with people forgetting that the only thing AMD did was bring cheaper and better-performing hexa- and octa-core CPUs to the game than there were before. They're not the first, and it's not their first iteration of that stuff either. Intel's CPUs are expensive, but they've been around performing well with more than four cores for a long time. Just a friendly reminder to those people repeating AMD's marketing BS over and over again, no offense intended (you're running a hexacore too, after all, and it's performing well, and it's not AMD).
     