The GTX 1080-Ti Thread

Discussion in 'Videocards - NVIDIA GeForce' started by XenthorX, Sep 18, 2016.

Thread Status:
Not open for further replies.
  1. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    Apparently they will show the 1080 Ti in March @ PAX? So around the same time as the 980 Ti.
     
  2. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,750
    Likes Received:
    9,641
    GPU:
    4090@H2O
    Yes I read that too. Can't wait tbh... got the unnecessary upgrade itch.
     
  3. ddelamare

    ddelamare Guest

    Messages:
    224
    Likes Received:
    5
    GPU:
    Inno3D GTX 1070 X4
    As much as I would like to believe that, I don't think SLI/CF will die off completely, as screen technology is far outpacing GPU power; I think it will always survive to a certain degree. We are already seeing screens close to production that are 8K in resolution, so....

    I hope it does get better, and in fact I am sure it will, but there will always be extreme pros and cons to it, much as there are today (price vs. performance scaling vs. power draw).

    I think that 4K should be much more common by the end of this year, but as always, game engines will probably still employ heavy processing techniques to make sure your 1080 Ti is being slapped around at 4K/60fps.

    As an example, even now, some games are so processing-heavy or poorly optimized that my 1070 with vsync off produces fps only marginally above 60 (personal testing). :)
     
  4. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    As I said, it could go either way.

    Many games/engines will have switched to DX12/Vulkan in the next few years, so in the traditional sense it's inevitable that both will die off.

    Whether DX12 MGPU is supported by many devs remains to be seen. MS were supposed to be working on a quick-and-dirty approach that would work for all, but I haven't heard much about that recently, while Vulkan currently has no support at all.

    People might want 4K/120, but two 1080 Tis will not be close to enough if one card is sitting idle.
     

  5. wrathloki

    wrathloki Ancient Guru

    Messages:
    2,134
    Likes Received:
    318
    GPU:
    EVGA 3080
    I think people would love to see 4K60. 4K120 is so far off it's not even worth thinking about right now.
     
  6. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    I would rather like to see a hybrid GPU mode, iGPU + dGPU.. So far it's only in one tech demo, the UE4 Elemental demo, still. Wow.. :rolleyes: :p
     
  7. wrathloki

    wrathloki Ancient Guru

    Messages:
    2,134
    Likes Received:
    318
    GPU:
    EVGA 3080
    What's hybrid GPU mode?
     
  8. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    You know, like Lucid Virtu multi-GPU in the old days.


    https://blogs.msdn.microsoft.com/di...p-dormant-silicon-and-making-it-work-for-you/
     
  9. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    Easily done with older games; I finished Batman AC last year and (via DSR) held a solid 4K/60 throughout, so those with better hardware could go much newer than that.
    Only handy if, like me, you have a backlog of games that stretches back years, though.

    As for hybrid, I thought AOTS also supported it.
     
  10. wrathloki

    wrathloki Ancient Guru

    Messages:
    2,134
    Likes Received:
    318
    GPU:
    EVGA 3080
    Well yeah that's a given. I was able to play through Inside at 4k60 and that's a new game, just not very demanding. I want to play new AAA games at 4k60 though. Only when that's possible will we have really arrived at 4K gaming.
     

  11. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    Yeah, I was just meaning that an older game running 4K/120 could be a more impressive experience than a newer one packed full of more modern but subtle visual effects.

    The only modern AAA games that will be doable soon are probably Doom and Gears of War 4.

    The good thing (for me anyway) is that I'm unsure whether going even higher framerate-wise is noticeable, and I'm a lot more confident that going beyond 4K is a waste of time on anything other than, say, a projector.
     
  12. wrathloki

    wrathloki Ancient Guru

    Messages:
    2,134
    Likes Received:
    318
    GPU:
    EVGA 3080
    Well, we are a very long way from 8K, so I'm sure we'll get to high 4K frame rates before then, and honestly I think 8K is a much harder sell than 4K. I can say from experience with my G-Sync monitor, however, that frame rates higher than 60 are noticeable: the higher you go, the less blurry an image in motion will look. You can easily test this difference in a game such as Diablo 3, where you move on only 2 axes and can do so fairly quickly. Cap your frame rate at 120 and move around and see how it looks, then drop it to 60 and see the difference.
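    A quick numeric sketch of the sample-and-hold blur effect described above: on a typical LCD, each frame is held on screen for the full frame time while your eye tracks the moving object, so the perceived smear is roughly the distance the object travels per frame. The pan speed below is just an illustrative number, not a measurement:

```python
def persistence_blur_px(pan_speed_px_per_s: float, fps: float) -> float:
    """Approximate perceived smear (in pixels) on a sample-and-hold
    display: the eye tracks the object while each frame stays static,
    so the smear is roughly the distance travelled per frame."""
    return pan_speed_px_per_s / fps

# An object panning at 3840 px/s (one 4K screen width per second):
print(persistence_blur_px(3840, 60))   # 64.0 px of smear at 60 fps
print(persistence_blur_px(3840, 120))  # 32.0 px at 120 fps, half the blur
```

    This is only a first-order model (it ignores pixel response time and strobing/ULMB modes), but it captures why doubling the frame rate visibly halves motion blur.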

    As for Doom and Gears 4, I'm reasonably certain a 1080 Ti should be able to manage 4K60; the same goes for The Witcher 3. Unfortunately those games are in the minority of extremely well optimized games.
     
    Last edited: Jan 11, 2017
  13. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,750
    Likes Received:
    9,641
    GPU:
    4090@H2O
    Personally I thought that M$ is NOT working on any mGPU implementation; I was under the impression that it was ALL dev work now, completely handed over to the game makers instead of the API itself. So no... I don't think we will see lots of mGPU support until each engine is updated with it.

    It all comes down to a simple statement: we have the horsepower for 4K/120, it just doesn't work properly to use more than one GPU to get there (could be two-, three-, or four-way CFX/SLI). That's just how I see it, and it's a shame, actually, that in 2016 we still didn't see a new API get the features that DX11 had for years.
     
  14. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    MS were reportedly working on an api solution for devs without the experience or the resources to support it properly.

    That's from memory, will try and find the article in a bit.

    DX12 actually has better support for MGPU than DX11, provided the devs put the effort into it, but we know how that works.
     
  15. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,750
    Likes Received:
    9,641
    GPU:
    4090@H2O
    Ah, well I missed that one then.
    Indeed, as you say, we know how that works, and what it's worth if nobody puts it into their games. A shame, really, but it was clear that handing the transition and implementation of mGPU over to devs was a bad idea from an mGPU user's point of view (and probably a dev's too).
     

  16. ScoobyDooby

    ScoobyDooby Guest

    Messages:
    7,112
    Likes Received:
    88
    GPU:
    1080Ti & Acer X34
    I still don't understand why people have such hard-ons for 4K. Like, I play games, and games already look awesome at 1440p, but it's like people just can't be satisfied and NEED the absolute best resolution, as if they are working in graphics and media or something. From a strictly gaming standpoint, why bother with 4K/60fps when 21:9 G-Sync @ 100fps+ is much more attainable from a hardware standpoint and will play so much better?

    It just doesn't seem like it's a viable gaming resolution at this point, since it's too hard to drive with current tech... so yeah, I totally get what you are saying. I just don't understand why people think it's the be-all-end-all resolution right now, and I read this position A LOT.
     
  17. wrathloki

    wrathloki Ancient Guru

    Messages:
    2,134
    Likes Received:
    318
    GPU:
    EVGA 3080
    4K eliminates aliasing for the most part; what's left is pretty minor, and SMAA basically removes it completely at that point. 1080p is a jagged, crawly mess, and there often aren't good AA choices, and if there are, they're crazy intensive. Also, 4K makes everything look so sharp. As for 1440p, my TV only does 1440p at 30Hz, and it looks blurry because its pixel count doesn't divide evenly into 4K. This is the case for most (all?) 4K TVs. Basically it's like running 4x SSAA, but with all the added extra detail and sharpness from actually having 4 times as many pixels.
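    The "doesn't divide evenly" point is simple arithmetic: 1080p fits a 3840x2160 panel at an exact 2x integer scale (each source pixel becomes a clean 2x2 block), while 1440p lands at 1.5x, so the TV's scaler has to interpolate, hence the blur. A minimal sketch:

```python
panel_w, panel_h = 3840, 2160  # a UHD "4K" TV panel

for src_w, src_h in [(1920, 1080), (2560, 1440)]:
    scale = panel_w / src_w  # equals panel_h / src_h for 16:9 sources
    kind = "integer (crisp 2x2 blocks)" if scale.is_integer() \
        else "fractional (interpolated, blurry)"
    print(f"{src_w}x{src_h} -> {scale}x: {kind}")
# 1920x1080 -> 2.0x: integer
# 2560x1440 -> 1.5x: fractional
```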

    If you haven't seen 4K games in action, it's hard to understand why we have a hard-on for it. It just looks so amazing. It's like the jump from SD to HD. Note that this isn't really the case for movies and TV, because those don't have to contend with aliasing; the jump there is much more subtle than SD to HD.
     
  18. ScoobyDooby

    ScoobyDooby Guest

    Messages:
    7,112
    Likes Received:
    88
    GPU:
    1080Ti & Acer X34
    That's the thing, I have seen 4K and played at that res numerous times... the discernible difference between it and an ultrawide 3440x1440 with SMAA just does not merit the performance hike and loss of frames IMO.

    Again, from a gaming standpoint, it's confusing to me why any gamer would choose 4K YET. Once newer cards drop that can get 4K/60 as a minimum in 95% of games, I can see it becoming more widely sought after.

    Up till now though, it's just odd to me.
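    For a rough sense of the performance hike being weighed here, raw pixel counts give a first-order estimate. This assumes GPU load scales roughly linearly with pixel count, which real games only approximate:

```python
resolutions = {
    "1440p 16:9": (2560, 1440),
    "ultrawide":  (3440, 1440),
    "4K UHD":     (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, n in pixels.items():
    print(f"{name}: {n:,} pixels")

# 4K pushes ~1.67x the pixels of 3440x1440, so to a first
# approximation expect around 60% of the ultrawide frame rate.
print(round(pixels["4K UHD"] / pixels["ultrawide"], 2))  # 1.67
```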
     
  19. wrathloki

    wrathloki Ancient Guru

    Messages:
    2,134
    Likes Received:
    318
    GPU:
    EVGA 3080
    You can't buy a 1440p TV. I for one quite enjoy gaming on my 55" TV sitting back in a comfy chair. I do have a 1440p G-Sync monitor, but I only play competitive games and games that don't play well with a controller on that. Even then, there's still considerably more aliasing than at 4K.
     
  20. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,750
    Likes Received:
    9,641
    GPU:
    4090@H2O
    Nvm, better not to say anything to that, beyond leaning more towards Scooby's view of not getting why everybody wants to go 4K.
     
    Last edited: Jan 12, 2017