ASUS Releases TUF Gaming VG279QL1A gaming monitor

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 3, 2020.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    40,012
    Likes Received:
    8,684
    GPU:
    AMD | NVIDIA
    The 27-inch Full HD (1920 x 1080) IPS gaming LCD is compatible with AMD FreeSync Premium and is VESA DisplayHDR 400 compliant. Furthermore, it supports a wide color gamut of 125%...

    ASUS Releases TUF Gaming VG279QL1A gaming monitor
     
  2. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    4,018
    Likes Received:
    881
    GPU:
    EVGA 1070 FTW
    Review, HH; it seems this is a swansong monitor for 1080p?

    Many boxes ticked here.
     
  3. jbscotchman

    jbscotchman Ancient Guru

    Messages:
    5,243
    Likes Received:
    3,982
    GPU:
    MSI 1660 Ti Ventus
    Asus? I'm sure it's way overpriced.
     
  4. itpro

    itpro Master Guru

    Messages:
    771
    Likes Received:
    410
    GPU:
    Radeon Technologies
    HDR400 is a scam. All new games like RDR2 and Gears 5 assume a minimum HDR peak of 500/600 nits. I think sites should mention this in every article. False marketing is bad; 400 nits was never HDR.
     

  5. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    4,018
    Likes Received:
    881
    GPU:
    EVGA 1070 FTW
    I stand to be corrected here, but isn't "HDR" listed as compatibility with a colour-space size/spread by the games company?

    So - like a mastering process?
     
    itpro likes this.
  6. itpro

    itpro Master Guru

    Messages:
    771
    Likes Received:
    410
    GPU:
    Radeon Technologies
    Not really; you're both wrong and right. It is a WCG space, beyond sRGB. But monitors also have an absolute maximum brightness, whether they have local dimming or not, and the bright areas an app or game intends to show have a corresponding nits level. Meaning, if you ask for more nits than a monitor's brightness can deliver, you lose image clarity: detail loss, white crush, contrast and shadow errors, etc. For example, on my 600-nit monitor I can safely choose anything from roughly 500~700 nits with no worries.
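    (A minimal sketch of that clipping in Python; the panel peak and highlight levels are made-up numbers for illustration, not figures from this post:)

    # A panel can only reach its own peak brightness, so any brighter level an
    # app or game asks for collapses to the same white and highlight detail is lost.
    panel_peak_nits = 600.0                # hypothetical panel maximum
    requested_nits = [100, 450, 650, 800]  # hypothetical highlight levels from a game
    displayed = [min(level, panel_peak_nits) for level in requested_nits]
    print(displayed)  # [100, 450, 600.0, 600.0] -> 650 and 800 become the same white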
     
  7. RavenMaster

    RavenMaster Maha Guru

    Messages:
    1,218
    Likes Received:
    145
    GPU:
    1x RTX 3080 FE
    1080p and HDR 400 lol...
     
  8. gx-x

    gx-x Maha Guru

    Messages:
    1,437
    Likes Received:
    144
    GPU:
    1070Ti Phoenix
    Well, it is now. Look up HDR 400 :(
    Besides, you don't want 400 nits blasting you while "browsing". A calibrated screen would be set to 120 cd/m2, gamma 2.2, and a 6500 K color temperature. Then, if you have real HDR, you would enable it for movies and such.
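    (To put numbers on that calibration target, a minimal Python sketch; the pure power-law gamma is a simplification and the signal levels are arbitrary examples:)

    # On a display calibrated to a 120 cd/m2 white point with gamma 2.2, a video
    # signal level (0..1) maps to luminance roughly as L = 120 * signal**2.2.
    white_point = 120.0  # cd/m2, the calibration target mentioned above
    for signal in (0.25, 0.5, 0.75, 1.0):
        luminance = white_point * signal ** 2.2
        print(f"signal {signal:.2f} -> {luminance:6.1f} cd/m2")
    # A 50% signal comes out around 26 cd/m2, not 60: gamma keeps mid-tones darker
    # so the full range is used without the image looking washed out.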
     
  9. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    4,018
    Likes Received:
    881
    GPU:
    EVGA 1070 FTW
    So - the games company doesn't have to do anything to switch on HDR for you - it's automatic for the monitor?
     
  10. itpro

    itpro Master Guru

    Messages:
    771
    Likes Received:
    410
    GPU:
    Radeon Technologies
    No. In Windows 10 you set it at around 50% for it to look right. So, if my HDR brightness is low, I will not enjoy everything. It is not worth switching HDR on and off in the OS if you've got a nice monitor.

    Quite the opposite: it is not automatic. You must manually switch on HDR from the OS, the video application, or the game. Then, to get a calibrated image, you must edit settings everywhere, from the GPU vendor's panel to the in-game menu. Only HDR10/DV movies are ready to play correctly out of the box.
     

  11. gx-x

    gx-x Maha Guru

    Messages:
    1,437
    Likes Received:
    144
    GPU:
    1070Ti Phoenix
    Look, it's simple: there are two ways to properly calibrate a display: 1. for Windows, 2. for Mac. Both will use 120 cd/m2; neither will use more.
    You like what you like, that's fine, but that's not color accurate.
     
  12. itpro

    itpro Master Guru

    Messages:
    771
    Likes Received:
    410
    GPU:
    Radeon Technologies
    It is beyond accurate. HDR usually uses a WCG (DCI-P3) space, which is wider and more accurate than sRGB/Adobe RGB. What are you talking about? cd/m2 means nothing. Nobody can live with a monitor dimmer than a mobile phone. Eyes hurt more from lower brightness than from higher.
     
  13. gx-x

    gx-x Maha Guru

    Messages:
    1,437
    Likes Received:
    144
    GPU:
    1070Ti Phoenix
    that's just not true. It's not how that works at all.
     
    Deleted member 213629 likes this.
  14. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    4,018
    Likes Received:
    881
    GPU:
    EVGA 1070 FTW
    So what you are saying is, the marketing BS speak from games companies, monitor makers, and movie studios is contradictory and we shouldn't worry about it?
     
  15. gx-x

    gx-x Maha Guru

    Messages:
    1,437
    Likes Received:
    144
    GPU:
    1070Ti Phoenix
    If a monitor does 400 nits while its darkest black is 0.4 nits, you are looking at a washed-out picture. Calibration is required, and usually after calibration the monitor will not be showing 400 nits anymore (or whatever the marketed value is; think "dynamic contrast" marketing, it's just that, marketing) but will have darker blacks, proper contrast and colors, and an overall much better, more vivid picture than when it was doing 400 nits for the sake of doing 400 nits.
    Of course, some monitors will show 400 nits (again, insert whatever value) and still have good blacks, so you know, it varies really.
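    (For reference, those example numbers work out to 400 / 0.4 = 1000:1 static contrast, which is ordinary SDR IPS-panel territory rather than anything HDR-like.)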

    As was said, there is a "pin" (flag) in HDR content that tells the display to turn on HDR mode. In, say, the latest Doom you can turn HDR on/off. Movies should carry a "pin" in the signal to always tell the display that the movie is HDR and to negotiate whether it should play as HDR or SDR. But that is an entirely different topic. :)
     
    Deleted member 213629 likes this.

  16. EspHack

    EspHack Ancient Guru

    Messages:
    2,621
    Likes Received:
    95
    GPU:
    ATI/HD5770/1GB
    LCD... 1080p... It's starting to feel like 14nm++++++ here
     
  17. It's the opposite. The muscles surrounding your eyes that help with focus strain more, and thus feel more sore, at lower brightness. Your optic nerve, iris, cornea, etc. will suffer from higher brightness over extended periods, in addition to anything too much, too fast, too bright (impact flash).
     
  18. itpro

    itpro Master Guru

    Messages:
    771
    Likes Received:
    410
    GPU:
    Radeon Technologies
    It depends on exposure time. It is worse to strain your eyes for 8 hours at low brightness, trying hard to distinguish anything, than to catch an occasional 8 seconds of peak brightness here and there.
     
    Deleted member 213629 likes this.
  19. Ahh shall we call it a draw then? Well fought! :p:)

    Just don’t go staring into any monitors now, kid, you’ll poke ur eye out!
     
    itpro likes this.
  20. itpro

    itpro Master Guru

    Messages:
    771
    Likes Received:
    410
    GPU:
    Radeon Technologies
    Haha, I await the day we'll wear sunglasses to watch things indoors. Just remember to mention me. :p :D
     
    Deleted member 213629 likes this.
