New Rumors: GeForce GTX 1180, 2080

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 13, 2018 at 8:08 AM.

  1. Fox2232

    Fox2232 Ancient Guru

    Messages:
    6,564
    Likes Received:
    425
    GPU:
    Fury X +AW@240Hz
    nVidia does not need 7nm with its current GPUs. On 7nm, they would not clock significantly better than they do now. They would only cost more and eat less power.
    So why should nVidia pay more for GPUs that save clients money on their electricity bills? Maybe for miners? They sure love a good performance/watt ratio.
     
  2. ubercake

    ubercake Member Guru

    Messages:
    157
    Likes Received:
    29
    GPU:
    Asus GTX 1080 FE
    1080 was such a big leap over the 980. Most of us could go from using two cards in an SLI setup to one card for a 1440p monitor.

    That's probably why Nvidia has taken a while getting to the next generation. They just haven't had to, and I'm not sure the demand is there yet. Are there any games in which you can't exceed 60fps on a 1440p monitor at full details with a GTX 1080? If there is something, is it worth playing?

    Running everything at my monitor's 144Hz with 144 fps in G-Sync is a cool idea, but once you get past around 90Hz/90 fps, it's hard to tell the difference.
     
  3. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    2,860
    Likes Received:
    311
    GPU:
    HIS R9 290
    4K users demand more than what a 1080Ti offers.
     
  4. ubercake

    ubercake Member Guru

    Messages:
    157
    Likes Received:
    29
    GPU:
    Asus GTX 1080 FE
    There aren't that many... yet.
     

  5. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    2,860
    Likes Received:
    311
    GPU:
    HIS R9 290
    They are on the rise though. 4K displays are steadily becoming more affordable, and most modern applications are built to be compatible with such resolutions. Besides, that's a bit of a chicken and egg situation - are there not many 4K users because there's not enough good/playable 4K content, or is there not enough 4K content because there aren't enough users with 4K displays?
    Back around 2015, the latter was definitely true. As of today, I'd say the former is true.

    The point is: there is actually a demand for better GPUs.
     
    chispy, Maddness and fantaskarsef like this.
  6. JamesSneed

    JamesSneed Master Guru

    Messages:
    250
    Likes Received:
    59
    GPU:
    GTX 1070
    Another explanation is that 4K has a low adoption rate because, to get frame rates high enough for a very smooth experience, you have to spend $800+ on a GPU. I went with 2K (1440p) for that very reason.
     
    Maddness likes this.
  7. D3M1G0D

    D3M1G0D Master Guru

    Messages:
    818
    Likes Received:
    289
    GPU:
    2 x GeForce 1080 Ti
    I dunno, 4K monitors are still prohibitively expensive, especially the ones with G-Sync. Most of them are also limited to 60 Hz (this is the main reason why I did not buy a 4K monitor). I would only consider going 4K when 144Hz models are out and prices on them come down (so not for a very long time). 4K content has very little to do with it.
     
    Maddness likes this.
  8. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    2,860
    Likes Received:
    311
    GPU:
    HIS R9 290
    From what I've noticed, 1080Tis have still been in somewhat high demand, despite their price. I personally wouldn't be willing to spend $800+ on a GPU (I find it hard to justify spending more than $300) but I'm not everyone else. I would also like to point out that I myself don't have a 4K display and don't intend to get one for a while.
    For your specific needs, yes, 4K is ludicrously expensive. But 4K displays in general definitely are not prohibitively expensive. These are very reasonable prices:
    https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100167585 600474748 4814
    Sure, monitors are more expensive, but the pixel density is higher. That being said, these prices aren't that outrageous either, especially when compared to 1440p:
    https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100160979 601305587 4814
    As for anything beyond 60Hz... why? A 1080Ti often doesn't even reach 60 FPS in a lot (if not most) modern games. Getting a 4K display higher than 60Hz with no hardware to take advantage of it is pointless. That's like buying a racecar and never taking it on the track. So before we can start demanding 4K @ 144Hz displays, we first need GPUs that can reliably deliver 60 FPS at 4K. And that comes full circle to my original point: there is in fact a demand for better hardware.
     
    Last edited: Jun 13, 2018 at 9:13 PM
  9. D3M1G0D

    D3M1G0D Master Guru

    Messages:
    818
    Likes Received:
    289
    GPU:
    2 x GeForce 1080 Ti
    I've been using a 144Hz monitor for a while and I don't want to go back to 60 if I can help it. GPUs will eventually become fast enough to render 60+ FPS @ 4K, and I don't want to be stuck with a 60Hz monitor when that happens. It'll be the same with 6K/8K monitors (I'll probably skip the 60Hz ones). Just my personal preference.
     
    Maddness likes this.
  10. -Tj-

    -Tj- Ancient Guru

    Messages:
    14,966
    Likes Received:
    491
    GPU:
    Zotac GTX980Ti OC
    Maddness and fantaskarsef like this.

  11. alanm

    alanm Ancient Guru

    Messages:
    7,763
    Likes Received:
    351
    GPU:
    1070 AMP!
    Not sure everyone is willy-nilly bound to max settings, especially if they are unaware how much of a visual difference each setting actually makes. Many settings are just overkill with little to show for it visually. I have owned several high-refresh monitors since 2012 (1080p and 1440p) and chose to ditch them all for 4K 60Hz, even on a GTX 1070. Granted, I have to turn down settings and use a slightly lower custom res (3840x1620 for 21:9), but I prefer this to the 1440p 120Hz monitor I still have as backup. I'm not into fast-paced competitive games, which of course may alter the nature of the argument.

    The amount of tweaking you can do to balance performance vs IQ is astonishing, really. All games run to my satisfaction, even AAA titles, and look better than they ever could maxed out on smaller, lower-res screens. I am also tolerant of dips under 60fps when they occur. I guess my views in this regard are in line with Hilbert's "A word about FPS" preface to the GPU benchmarks in the reviews.

    This is of course aside from the many older games that run far more easily at 4K and still look good today. Then there is the matter of priorities: since 90% of my PC time is outside of gaming (browsing, movies, productivity), that just seals the deal. I cannot go back to small, low-res screens anymore. For those who live and breathe fast competitive gaming, I can understand the wish to stick with small 1080p high-refresh screens.
     
    Embra and sverek like this.
  12. tensai28

    tensai28 Master Guru

    Messages:
    850
    Likes Received:
    116
    GPU:
    1080TI aorus xtreme
    4K performance leaves a lot to be desired on a 1080 Ti, even with overclocking. What I'd suggest 4K display owners do while Nvidia is being shady/taking advantage is to set a custom resolution of 3584x2016 in the NVCP and use it for all the games where you can't reach 60fps at ultra in 4K. This resolution looks great on a 4K screen and is better than lowering settings. You can also try 3456x1944 for extremely demanding games like Rise of the Tomb Raider and FFXV.
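
    For anyone wondering how much rendering work those custom resolutions actually save compared to native 4K, here is a rough arithmetic sketch (the resolutions are the ones mentioned above; the Python snippet itself is just an illustration, not something from the thread):

        # Pixel-count comparison of the suggested custom resolutions vs native 4K.
        # Fewer pixels to shade roughly means higher frame rates, all else equal.
        NATIVE_4K = (3840, 2160)
        CUSTOM = [(3584, 2016), (3456, 1944)]

        native_pixels = NATIVE_4K[0] * NATIVE_4K[1]
        for width, height in CUSTOM:
            share = (width * height) / native_pixels
            print(f"{width}x{height}: {share:.0%} of native 4K pixels, "
                  f"aspect ratio {width / height:.3f}")
        # 3584x2016 -> ~87% of the pixels, 3456x1944 -> ~81%, both still 16:9.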
     
  13. tensai28

    tensai28 Master Guru

    Messages:
    850
    Likes Received:
    116
    GPU:
    1080TI aorus xtreme
    I game on a 4K TV and use it as my monitor, so making the jump to 4K wasn't nearly as expensive.
     
  14. dfsdfs1112

    dfsdfs1112 Member

    Messages:
    46
    Likes Received:
    1
    GPU:
    480
  15. sverek

    sverek Ancient Guru

    Messages:
    3,817
    Likes Received:
    565
    GPU:
    NOVIDIA -0.5GB
    Visually maxed-out games are a really bad benchmark in my opinion. Games might be poorly optimized in the first place (look at Dishonored 2, for example).
    Hell, I wouldn't be surprised if a recent game had fps drops at 1080p@60fps with a 1080 Ti.

    If a consumer must adjust their GPU so it runs the latest games on MAX settings flawlessly, they would have to change GPUs whenever a new flagship GPU becomes available... only to run at 1080p.
    We shouldn't have to buy a better GPU just because devs might be too lazy to optimize their engine and instead tell consumers to get a better GPU to enjoy all the eye candy.

    We have stable benchmarks like 3DMark and others to tell how well a GPU performs.
     
    Maddness likes this.

  16. Fox2232

    Fox2232 Ancient Guru

    Messages:
    6,564
    Likes Received:
    425
    GPU:
    Fury X +AW@240Hz
    Games have quite poor per-pixel quality. While increased resolution allows for higher detail, it still has the same poor per-pixel quality; there are just more pixels.
    Native 1440p vs. downsampling from 1440p to 1080p does not make enough of a difference to justify paying extra or sacrificing screen refresh rate. The same goes for native 4K vs. downsampling from 4K to 1440p.

    Yes, downsampling and injecting extra shader effects take the user's time, and native is just a bit better. But one does not need to go to a higher resolution before the content has all the features required.

    It is like watching in-game video from upcoming games. You open the news post and see it in that small embedded YouTube window, and you are like:
    "Wow, such fidelity, such quality per pixel, both in effects and geometry."
    But then you put it into fullscreen, and you see all those places where they had to take a step back on shaders, geometry, ... And it does not matter that I run a 1080p screen; even when the video was recorded in 4K and I play it in 4K, it still does not reach high enough per-pixel quality to make me think about a 1440p screen.
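
    (For readers unfamiliar with the term: downsampling here just means rendering or capturing at a higher resolution and scaling the result down to the display resolution. A minimal offline illustration with Pillow follows; the file name frame_4k.png is only a placeholder, not something from this thread:)

        # Minimal downsampling illustration: scale a 4K frame down to 1440p
        # with a high-quality resampling filter. File names are placeholders.
        from PIL import Image

        frame = Image.open("frame_4k.png")                        # 3840x2160 source
        downsampled = frame.resize((2560, 1440), Image.LANCZOS)   # scale to 1440p
        downsampled.save("frame_1440p.png")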
     
  17. Fox2232

    Fox2232 Ancient Guru

    Messages:
    6,564
    Likes Received:
    425
    GPU:
    Fury X +AW@240Hz
    My monitor does 240Hz via HDMI, and FreeSync works with it too, across the full range. :)
    Those "standards" are there for bandwidth. Resolution vs. refresh rate is just a consequence of what the manufacturer of the screen electronics decides to support.
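
    To put rough numbers on the bandwidth point, here is a back-of-the-envelope sketch of the raw pixel-data rate for a few resolution/refresh combinations at 24 bits per pixel (blanking intervals and link encoding overhead are ignored, so real-world link requirements are higher; the combinations are just examples):

        # Rough uncompressed pixel-data rate for a few resolution/refresh modes.
        # Blanking and link encoding overhead are ignored, so actual link
        # bandwidth requirements are somewhat higher than these figures.
        BITS_PER_PIXEL = 24  # 8 bits per channel, RGB

        modes = [
            (1920, 1080, 240),   # 1080p @ 240 Hz
            (2560, 1440, 144),   # 1440p @ 144 Hz
            (3840, 2160, 60),    # 4K    @ 60 Hz
            (3840, 2160, 144),   # 4K    @ 144 Hz
        ]

        for width, height, refresh in modes:
            gbps = width * height * refresh * BITS_PER_PIXEL / 1e9
            print(f"{width}x{height} @ {refresh} Hz: ~{gbps:.1f} Gbit/s of pixel data")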
     
  18. alanm

    alanm Ancient Guru

    Messages:
    7,763
    Likes Received:
    351
    GPU:
    1070 AMP!
    Have you tried 3840x1620? That reduces pixels by 25% and gives you 21:9 UW while maintaining 1:1 scaling (since basically you are just cropping the height). Much easier to run that way and looks great on large screens.
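
    A quick sanity check on those numbers (assuming a native 3840x2160 panel, since the crop keeps the full horizontal resolution):

        # Cropping 3840x2160 down to 3840x1620 keeps 1:1 horizontal pixel
        # mapping and cuts the rendered pixel count by a quarter.
        native = 3840 * 2160    # 8,294,400 pixels
        cropped = 3840 * 1620   # 6,220,800 pixels

        print(f"pixels saved: {1 - cropped / native:.0%}")            # 25%
        print(f"aspect ratio: {3840 / 1620:.2f} (64:27, i.e. ~21:9)")  # 2.37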
     
  19. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,162
    Likes Received:
    160
    GPU:
    R9 290 AC 1150/1500


    Let’s look at The Witcher 3 for example. There is a 16% difference turning on HairWorks on the GeForce GTX 1080 Ti at 1440p. However, there is only a 10% difference turning on HairWorks on the AMD Radeon RX Vega 64 at 1440p using the same settings. The AMD Radeon RX Vega 64 takes less of a performance hit turning on HairWorks. Another example is Watch Dogs 2: on the GTX 1080 Ti the difference with PCSS Shadows is 27%, but on the Radeon RX Vega 64 it is 23%, so it is more efficient. In The Division the difference is exactly the same.

    This shows that no, GameWorks features are not harming AMD GPU performance; in fact, the Radeon RX Vega 64 seems to be better at rendering them sometimes, or just the same at other times.

    :eek: :eek:
    --
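
    For clarity, a "difference" percentage in that kind of comparison is normally the relative FPS drop from enabling the feature; a quick sketch with made-up FPS numbers (not taken from the quoted article):

        # How a feature's "performance hit" percentage is typically computed:
        # relative FPS drop with the feature enabled. FPS values are made up.
        def feature_hit(fps_off: float, fps_on: float) -> float:
            return (fps_off - fps_on) / fps_off

        print(f"{feature_hit(100.0, 84.0):.0%}")  # 100 -> 84 fps is a 16% hit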
     
    Maddness likes this.
  20. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,162
    Likes Received:
    160
    GPU:
    R9 290 AC 1150/1500
    hmm.. I wonder if "AMD optimized" tessellation setting has anything to do with that ^^
     
    Maddness likes this.
