NVIDIA Confirms G-Sync Ultra HD HDR monitors will launch this month

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 16, 2018.

  1. sverek

    sverek Guest

    Messages:
    6,069
    Likes Received:
    2,975
    GPU:
    NOVIDIA -0.5GB
Oh boy, I'm wondering how I'm supposed to tell the difference on my office TN monitor. Blessings.
     
  2. vazup

    vazup Guest

    Messages:
    333
    Likes Received:
    26
    GPU:
    r9 280X
Can people stop stroking OLED's shaft already? Yes, it's fantastic in a pitch-black room. But if you're viewing content in a medium to very bright room, OLED won't even come close to a high-end FALD TV, because it's just too dim.
And as far as I know, people still don't recommend turning the OLED light past 40 if you don't want to risk burn-in, so again, not ideal for gaming in medium to very bright rooms.
But for those who only enjoy screen time in a very dark room, there is no better screen than OLED until micro-LED comes out.
     
    Last edited: May 17, 2018
  3. nhlkoho

    nhlkoho Guest

    Messages:
    7,754
    Likes Received:
    366
    GPU:
    RTX 2080ti FE
    lol
     
  4. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Oh boy.

Well, OLED brightness has improved significantly over the last several generations and continues to improve. As I said (and people agreed), they'll have to prove they've fixed the burn-in issues with static content before anyone buys into it.

Also, I've been running my C7 at 100% brightness for over a year now with no issues, but I don't game on it, just movies.
     

  5. anxious_f0x

    anxious_f0x Ancient Guru

    Messages:
    1,908
    Likes Received:
    616
    GPU:
    ASUS TUF RTX 4090
I feel like people get too hung up on peak brightness; it's not like every piece of HDR content is mastered to include 1000-nit highlights in every scene. The brightness of LCD/LED can be amazing at times, but I found local dimming to be hit and miss when I was looking for a new TV. It could be quite noticeable when zones dimmed, which was very distracting, and you get more bloom/light bleed around bright highlights in dark scenes as well, which I really didn't like.

    That was over a year ago though so the situation may have improved somewhat since then.

I prefer OLED myself, far more natural looking, and I haven't had any burn-in after over a year of gaming on my LG B7.

    I’ll wait for reviews on these new monitors before pulling the trigger.
     
  6. Koniakki

    Koniakki Guest

    Messages:
    2,843
    Likes Received:
    452
    GPU:
    ZOTAC GTX 1080Ti FE
Same scene from a "4K" BD YouTube video on my cheap Turkish-panel Vestel 65" 4K Telefunken. Not fully tweaked atm.

    https://s7.**********/80pyg8qyj/youtu.be-n_Smio6l_MPYA.png
     
  7. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Taking pictures of your screen isn't going to show anything. The sensor on the camera, the DSP processing on the SoC and the reproduction on the monitor of each person viewing it is going to completely alter the image. You can't physically see the benefits of HDR in a picture.
     
    anxious_f0x likes this.
  8. Moonbogg

    Moonbogg Master Guru

    Messages:
    306
    Likes Received:
    212
    GPU:
    GTX1080Ti@2.1GHz
OK, I found something. It shows why the 1" spacing is an issue VERY CLEARLY. It's because you still have an LCD panel, just like the one you have right now, but with a bunch of widely spaced backlights behind it. These panels have backlight bleed just like any other LCD, and oh boy, does it show when you have a bright object on a dark background. Watch this video and lol all the way to the bank, because no one will be spending that much money on tech like this. I TOLD you guys that fine details can't work with 1"-spaced backlights. You laughed. Keep laughing, and go ahead and spend $2k on this thing.
Can't believe people are defending this tech. As if 1"-spaced, super-bright backlights behind an LCD panel weren't going to be an issue? Maybe this is why it was delayed for so long and they had to give up and admit they did the best they could? Maybe this is also why panel manufacturers have announced that better tech is coming in less than two years? I think so.

    Fast forward to 4:02 to see the issue.

     
  9. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
No one in this thread was arguing with you about whether or not the backlight spacing is an issue. It's obviously an issue; it's present in essentially all zoned LED TVs, so I'm not even sure why you're bringing it up like it's some revelation. The argument was whether HDR is real or manufactured/fake/makes no difference. Also, my first post in this thread said I'm not spending this much money on the monitor.
     
  10. Moonbogg

    Moonbogg Master Guru

    Messages:
    306
    Likes Received:
    212
    GPU:
    GTX1080Ti@2.1GHz
Well, if anything, the backlights should provide really good uniformity, which is something current displays have a hard time with, so at least there's that. I can't see myself getting something like this until it's more of a standard, maybe in 2020 or something.
     

  11. maikai

    maikai Maha Guru

    Messages:
    1,337
    Likes Received:
    68
    GPU:
    NVIDIA RTX 4070
Right, we all know the price on these will be absurd, but the tinfoil-hat-wearing people thinking the tech is made up is just as absurd.
     
  12. RealNC

    RealNC Ancient Guru

    Messages:
    5,121
    Likes Received:
    3,395
    GPU:
    4070 Ti Super
Don't know about the others, but I'm just having a bit of fun with the whole thing. They're selling a relatively minor improvement at a completely inflated price.
     
  13. ttnuagmada

    ttnuagmada Master Guru

    Messages:
    270
    Likes Received:
    145
    GPU:
    3090 Strix
This monitor is going to be the biggest disappointment since Vega. 1000 nits + IPS = the worst blooming in the history of LCD displays.
     
  14. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,561
    Likes Received:
    18,880
    GPU:
    AMD | NVIDIA
  15. ttnuagmada

    ttnuagmada Master Guru

    Messages:
    270
    Likes Received:
    145
    GPU:
    3090 Strix
Yes, really. 384 only seems like a lot. When you factor in that this monitor will take up a much larger field of view than a TV, on top of the fact that it has 5x worse static panel contrast than any FALD TV (they all use VA panels), and then crank it to 1000 nits, you're going to see some of the worst blooming you've ever seen.

Imagine sitting 5 feet from a 75" TV; that's the kind of FOV this monitor will take up. The array will only be something like 16x24 backlighting zones. That's fine for a 65" TV that's 10 feet away and masked by 5x the static contrast, but this is a completely different story. The blooming will be everywhere in HDR content on this thing.
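For scale, here's a quick back-of-the-envelope sketch of what those numbers imply. The 384-zone count comes from the thread; the 16x24 grid split and the 27" 4K panel size are assumptions for illustration, not confirmed specs:

```python
import math

# Assumed figures: 27" 4K panel, 384 zones split as 24 wide x 16 tall.
diag_in = 27.0
px_w, px_h = 3840, 2160
zones_x, zones_y = 24, 16          # 24 * 16 = 384 zones

# Physical panel width/height from the diagonal and a 16:9 aspect ratio.
w_in = diag_in * 16 / math.sqrt(16**2 + 9**2)
h_in = diag_in * 9 / math.sqrt(16**2 + 9**2)

zone_w_in = w_in / zones_x          # roughly 1" per zone horizontally
zone_h_in = h_in / zones_y
px_per_zone = (px_w // zones_x) * (px_h // zones_y)

print(f"zone size: {zone_w_in:.2f}\" x {zone_h_in:.2f}\"")
print(f"pixels sharing one backlight zone: {px_per_zone}")
```

Under these assumptions, a single backlight zone covers on the order of twenty thousand pixels, which is why a small bright object (like a cursor) lights up a visibly larger patch than itself.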
     

  16. Moonbogg

    Moonbogg Master Guru

    Messages:
    306
    Likes Received:
    212
    GPU:
    GTX1080Ti@2.1GHz
I have my tinfoil-hat reservations, but really I'm more curious about the technology than anything. I can't wait to go to Newegg (down the street from me) and see them on display, provided they decide to showcase them there. I would only buy the ultrawide variant, if anything, since I can't go back to 16:9 now. I have some burning questions about HDR tech in general on the PC. Basically, is it worth it with gaming being the primary thing I do? I suppose Netflix HDR could be pretty awesome.
One thing is for certain: the reviews will tell a lot about these, but once enthusiasts get their hands on them, THAT'S when you will know exactly what all the downsides are. I predict an absolute mountain of complaints about the haloing effect. The backlights will bleed through the panel as they attempt to brighten up fine details, such as a mouse cursor. I think it will drive a lot of people absolutely insane, and tons of panels will be shipped back and RMA'd, especially since they cost $2,000. Almost no one will accept even the faintest defect (real or perceived) at a price like that. I think a ton of panels will be getting returned to the store for all sorts of stuff, lol. Just watch the forums at overclock.net. Wait for the owners' thread to explode with RMA RMA RMA RMA RMA. Haha. I'm sure the monitors are very beautiful, but not beautiful or perfect enough for $2,000.
     
  17. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
  18. thesebastian

    thesebastian Member Guru

    Messages:
    173
    Likes Received:
    53
    GPU:
    RX 6800 Waterblock
I've been using a 28" 4K G-Sync monitor (60Hz) for a long time.

28" 4K is fine for working, multimedia, etc.
But for gaming in 4K I highly recommend at least 32". So I wouldn't buy a 120Hz 4K 28" just for gaming; it's a bit better than 1440p, but not as good as 32" or more.
     
    Last edited: May 19, 2018
  19. fry178

    fry178 Ancient Guru

    Messages:
    2,078
    Likes Received:
    379
    GPU:
    Aorus 2080S WB
@ttnuagmada
All TVs/FALD use VA panels?
Either you don't know what you're talking about, or you're trying to mislead people.
Most UHD panels LG and Sony use are IPS, jfyi.

And reading some other comments:
Local dimming has NOTHING to do with the signal/source or connection.
It's controlled internally by the TV/monitor.

Not to mention that Sony's Z9 series isn't even limited by "zones" (the TV has individual control over every single LED), which shows you don't need OLED to produce an OLED-like picture (quality).

Anyone waiting for OLED monitors, have fun growing a 30ft beard while waiting.
No company should be stupid enough to make a monitor out of panels that suffer heavily from static images.
The LG OLED in our store hasn't even been up for a year and has already burned in the LG logo that shows when the TV is turned on (once a day).
Meanwhile, the Sony (based on the same OLED panel) does NOT have that problem, even though it's been on the wall for about three months longer.

And gaming on a UHD screen does NOT require four Titans in SLI; that's what G-Sync is for, so a constant 60 fps isn't needed.

And gaming at higher Hz/fps will feel smoother, but mainly because our eyes see less of a difference between each frame as the numbers go up.
E.g. what looks like smooth motion to us, a hand moving very fast trying to smash a fly, looks like stop motion to the fly, since flies can perceive higher frame rates before it blurs into "motion".
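That diminishing-difference point can be put in rough numbers. A small sketch (the refresh-rate pairs are just illustrative):

```python
# The absolute time gap between successive frames shrinks non-linearly
# as the refresh rate climbs, so each doubling buys less than the last.
for hz_a, hz_b in [(30, 60), (60, 120), (120, 240)]:
    gap_ms = 1000 / hz_a - 1000 / hz_b
    print(f"{hz_a} -> {hz_b} Hz: each frame arrives {gap_ms:.1f} ms sooner")
```

Going from 30 to 60 Hz cuts about 16.7 ms per frame, 60 to 120 Hz only about 8.3 ms, and 120 to 240 Hz about 4.2 ms.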

And to anyone complaining about backlight zones/the distance between LEDs:
does your current monitor even have that? Right...
     
    Last edited: May 18, 2018
    maikai likes this.
  20. ivymike10mt

    ivymike10mt Guest

    Messages:
    226
    Likes Received:
    12
    GPU:
    GTX 1080Ti SLI
That's what I'm afraid of here: IPS + 1000 nits. I don't know if VA panels are suitable for a 144Hz gaming monitor, but I do know that even a medium-quality VA panel should give much less over-glow.
In the video above, they showed a relatively big double pointer, which may over-expose that over-glow anyway.
"All the games I'm currently playing I can easily push past 100fps at 4K"
I respect your personal opinion, but I suspect your graphics settings were far from Ultra, and only in some well-optimized games.
For example, have you tried Kingdom Come: Deliverance, ELEX, FF XV, FC5, Assassin's Creed Origins, Deus Ex: Human Revolution, Dishonored 2, Mirror's Edge Catalyst, or Quantum Break in 4K at the Very High / Ultra preset?
I'm asking because many people buy top-end GPUs to see games the way the programmers/artists wanted to present them, or just to play at better quality than consoles can currently offer.
For movies, an OLED TV is better than (today's) LCD, even if LCD produces significantly higher peak brightness.
Videophiles and cinema maniacs should focus on more basic aspects like contrast, black quality, and colors; the HDR wow effect should come second. Movies probably won't hurt 2017+ OLED screens, even if I've already seen some defects on a B7.
The problems start when we want to use an OLED with a desktop PC, or even a console, or when we display scenes with big white areas (like snow scenes).
Or, say, when enlarging a white window in Windows: the overall brightness drops significantly. That's annoying and doesn't suit some professional usage.
Another, less obvious aspect is durability: desktop usage = afterimages or burn-in.
We know LG itself does not recommend using OLED for desktop work, and their on-paper lifespan is very, very subjective.
So my conclusion is simple: right now, LCD and OLED panels serve two different target groups.
Personally, I'd like to use an OLED screen with a PC: do some work, play games, occasionally watch a movie, or look at my pictures.
I think that's the most typical desktop PC usage! I will only be satisfied with the pile of money spent on my TV if it handles that without any issues.
My problem is the high price and too many compromises.
I agree that even lower Sony models will have a much higher contrast ratio and, in effect, less of the over-glow effect.
One of the most important parameters in current LCD technology is the native contrast ratio; second to that should be the number of zones.
The software algorithm driving the array is also very important.
     
    Last edited: May 18, 2018
