NVIDIA Confirms G-Sync Ultra HD HDR monitors will launch this month

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 16, 2018.

  1. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,697
    Likes Received:
    9,574
    GPU:
    4090@H2O
    I have to say, since I got a 144Hz screen I'm suddenly able to see the difference between 144Hz and 60Hz, and I even notice that 30Hz looks different (especially with YouTube videos until the buffer catches up, etc.)

    I'm still fairly happy with my 144Hz / 1440p screen. When 4K at 144Hz with all the eye candy becomes possible on a single GPU, I'll be willing to switch over. Right now it's more of a niche scenario, so I'll stay at "2K" and have more than 60fps in games instead.

    And I still like to play Overwatch at 144Hz with all the eye candy... I guess I'm a graphics whore that fuels the GPU industry without mining.
     
  2. Witcher29

    Witcher29 Ancient Guru

    Messages:
    1,708
    Likes Received:
    341
    GPU:
    3080 Gaming X Trio
    Nobody is going to buy this at 27 inches, too small for 4K!
    34 inches and up for true 4K.

    I have a 27 inch Full HD monitor and that's the perfect size for me, not too small and not too big either, and extremely sharp image quality as well because it's a Samsung.
     
  3. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,697
    Likes Received:
    9,574
    GPU:
    4090@H2O
    I couldn't use that anymore. The pixels are too large at that size (it was mine before I upgraded to 1440p). I'm spoiled now! :(
     
  4. oxidized

    oxidized Master Guru

    Messages:
    234
    Likes Received:
    35
    GPU:
    GTX 1060 6G
    I'm not saying they don't feel smoother; I'm saying there's no real use for that increased smoothness in those games. In CS:GO, on the other hand, you could gain a decent advantage compared to 60Hz.

    The higher you go in size while keeping the same resolution, the lower the PPI gets. For you, 27" Full HD may be enough; I find it totally unusable, the pixels are way too sparse and the image ends up pretty low quality. Hell, even my 23" Full HD feels bad.
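    The size-vs-resolution-vs-PPI relationship described here can be sketched in a few lines of Python (the panel sizes below are just illustrative examples, not from the post):

    ```python
    import math

    def ppi(h_res: int, v_res: int, diagonal_inches: float) -> float:
        """Pixels per inch: diagonal pixel count divided by diagonal size."""
        return math.hypot(h_res, v_res) / diagonal_inches

    # Same 1920x1080 resolution on a growing panel: PPI drops.
    for size in (23, 27, 34):
        print(f'{size}" Full HD: {ppi(1920, 1080, size):.0f} PPI')

    # 4K at 27" for comparison.
    print(f'27" 4K: {ppi(3840, 2160, 27):.0f} PPI')
    ```

    So 27" Full HD lands around 82 PPI versus roughly 163 PPI for 27" 4K, which is the gap the posters are arguing about.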
     

  5. mgilbert

    mgilbert Member

    Messages:
    46
    Likes Received:
    7
    GPU:
    16 GB DDR3
    Glad I'm running an AMD video card, because I don't want to have to give up a kidney to pay for this monitor. Even the Freesync version is likely to be over $1,000, if it is ever released.
     
  6. nhlkoho

    nhlkoho Guest

    Messages:
    7,755
    Likes Received:
    366
    GPU:
    RTX 2080ti FE
    What? Have you ever used a 144Hz display?
     
  7. Wrinkly

    Wrinkly Active Member

    Messages:
    73
    Likes Received:
    44
    GPU:
    Radeon 290 / 4GB
    Very happy with a 27" 1440p/60Hz IPS panel paired with a 1080Ti. Higher refresh rates just result in higher temps and more noise.
     
  8. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,631
    Likes Received:
    1,125
    GPU:
    4090 FE H20
    It doesn't matter if it's Age of Empires; high refresh rates benefit all games. Even when the extra frames don't matter competitively, they still give a better experience and a better perception of smoothness.


    Personally, I prefer to keep my minimum FPS at the refresh rate, especially in competitive titles.

    A 1080Ti can't even do 4K for Fortnite, Overwatch, and on and on...

    You'd need a GPU at minimum twice as fast as the 1080Ti.
     
  9. bigfutus

    bigfutus Master Guru

    Messages:
    535
    Likes Received:
    59
    GPU:
    MSI 3080 VENTUS 10G
    I have a 27" 1440p/144Hz TN, because most of the 1440p IPS monitors had atrocious backlight bleed problems, so I really hope they've learned from that. Anyway, I run it at only 100Hz with FPS capped at 95. The most important thing for me was to check whether G-Sync is worth it to me or not, and oh my god, it is. I'm now playing Dark Forces II capped at 48fps and it's pretty smooth anyway. If I disabled G-Sync, my eyes would melt at that framerate.
     
  10. ruthan

    ruthan Master Guru

    Messages:
    573
    Likes Received:
    106
    GPU:
    G1070 MSI Gaming
    I would prefer 1440p too.

    4K at 144fps? Maybe with a Titan you could play Quake 1 at such settings...
     

  11. Prince Valiant

    Prince Valiant Master Guru

    Messages:
    819
    Likes Received:
    146
    GPU:
    EVGA GTX 1080 ti
    I hope they consider making a 2560x1440 version for much less.

    Tying the zones to HDR seems like a bad decision, where did you hear about it?

    I'm more worried about the excessive glow those panels have.
     
  12. tensai28

    tensai28 Ancient Guru

    Messages:
    1,543
    Likes Received:
    414
    GPU:
    rtx 4080 super
    Yeah, I never personally cared for 60Hz+ refresh rates either, which is why I'm fine with just a 6600K, since I always lock everything to 60fps with vsync. I was just making a point and somewhat venting my frustration about how long NVIDIA is taking to release new cards.

    I've heard this a lot. I guess once you get used to it, there's no going back. I'm glad I never tried it because that would be just one more thing costing me money. It's expensive enough maintaining 60fps at 4k.
     
    Last edited: May 16, 2018
  13. slick3

    slick3 Guest

    Messages:
    1,867
    Likes Received:
    234
    GPU:
    RTX 2070 +85/1200
    That is very inaccurate.

    I agree that if you want to play competitively, you need high refresh rates, just like you need a mouse/keyboard over a controller.

    However, that doesn't mean games such as The Witcher 3, the Dirt racing games, Killing Floor 2, The Forest, etc. don't benefit from higher refresh rates. The difference is night and day. Heck, I'm usually hovering at 80fps+ in TW3 and even then the difference over 60Hz is noticeable.

    Also, you don't 'need' 144fps to see the difference at 144Hz; anything over 80fps and the difference is there.
     
    Solfaur likes this.
  14. Solfaur

    Solfaur Ancient Guru

    Messages:
    8,008
    Likes Received:
    1,530
    GPU:
    GB 3080Ti Gaming OC
    Oh yes, I thought this was BS myself before I had a 144Hz monitor, but ever since I got my Dell S2716DG I can guarantee that The Witcher 3, for example, is a way, way better gaming experience than at 60Hz. High refresh rate is great for everything, a no-brainer IMO, BUT it requires a LOT of horsepower. A 1080Ti is great NOW for 1440p/144Hz; in 1-2 years, probably not. 4K/144Hz is a fantasy, at least for any modern game. Not to mention these new monitors will be obscenely priced, and I for one would be very, very wary of the actual quality of the panels...
     
    fantaskarsef likes this.
  15. Legacy-ZA

    Legacy-ZA Master Guru

    Messages:
    271
    Likes Received:
    203
    GPU:
    ASUS RTX 3070Ti TUF
    Personally, I will skip these until OLED HDR monitors are mainstream and affordable.
     

  16. Moonbogg

    Moonbogg Master Guru

    Messages:
    306
    Likes Received:
    212
    GPU:
    GTX1080Ti@2.1GHz
    I was going to say this. I believe these new monitors will be very nice and everything, but they are half-assing the true HDR benefits because there is no OLED or micro-LED option yet. The backlight zones are spaced roughly 1 inch apart. How can you dim those zones to give text and other small features more contrast than a regular monitor? How can that be better for fine detail? I don't think it can. I expect these zones to offer better contrast only for larger contrasting screen areas, the kind that can be resolved with backlight LEDs spaced about 1 inch apart. Also, the screen will be brighter, but most of us turn monitor brightness way down, to something like 30/100, since brighter hurts the eyes and just annoys. I'm not sure these will be the ideal HDR implementation. OLED is needed for that, where each pixel is the light source and the HDR can be perfectly accurate on a per-pixel basis, not with this huge 1" gap between light sources.

    If you are wondering where I came up with the 1" gap, I did some basic math. 21/9 = 2.3, which means 2.3 units across for every 1 unit down. Multiply by, oh say, 15: 2.3 x 15 = 34.5 units across. Now do the vertical ones: 1 x 15 = 15. Multiply 34.5 x 15 and you get 517.5, which is pretty close to the number of zones they claim. Just count 34 points across a 34" monitor, and by eyeballing it, it appears to be about an inch of separation. That's not close enough IMO, and we need OLED to do the job right, especially at the prices they will surely charge.
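    The zone-spacing arithmetic above can be checked with a few lines of Python (the 2.3 ratio, 15 rows, and 34" width are the poster's own rough assumptions, not spec-sheet figures):

    ```python
    # Back-of-the-envelope check of the FALD zone-spacing estimate above.
    # All inputs follow the post's rough assumptions, not a spec sheet.
    aspect = 2.3            # 21:9 rounded: units across per unit down
    rows = 15               # assumed number of vertical zone rows
    cols = aspect * rows    # 34.5 zone columns
    zones = cols * rows     # 517.5, near the claimed zone count
    spacing_in = 34 / cols  # inches per zone, eyeballing ~34" of width

    print(f"{cols:.1f} columns x {rows} rows = {zones:.1f} zones")
    print(f"~{spacing_in:.2f} inch between zone centers")
    ```

    The spacing comes out just under 1 inch per zone, which is the "1-inch gap" the post is built on.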
     
    Last edited: May 16, 2018
    Solfaur likes this.
  17. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    I would love an OLED monitor, but they'd have to prove to me that they've solved the image retention/burn-in issues for a period longer than 5 years, especially with static content like a desktop scenario.

    Currently, an LG C7 burns in after 6 weeks at 20 hours per day, which is obviously extreme, but 6 weeks isn't long at all.
     
    fantaskarsef, Solfaur and maikai like this.
  18. maikai

    maikai Maha Guru

    Messages:
    1,337
    Likes Received:
    68
    GPU:
    NVIDIA RTX 4070
    Umm, how about this: until OLED image burn-in is fixed, I wouldn't touch it. There's nothing like trying to play a game and still seeing your desktop burned into the screen. Yeah, hard pass.
     
    Denial likes this.
  19. maikai

    maikai Maha Guru

    Messages:
    1,337
    Likes Received:
    68
    GPU:
    NVIDIA RTX 4070
    lol you posted right as I did, exactly right
     
  20. fagoatse

    fagoatse Guest

    Messages:
    140
    Likes Received:
    0
    GPU:
    nvidia
    27" and 4K? What a horrible mistake.

    OLED is a dead end. Those issues cannot be fixed. Micro-LED is the only hope now but it's years away.
     
