ASUS and Acer UHD G-Sync HDR Monitors Forced to Use Color Compression at 120/144 Hz

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 18, 2018.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,541
    Likes Received:
    18,853
    GPU:
    AMD | NVIDIA
  2. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
I haven't studied HDR yet, but I assumed it was as simple as another 8 bits per pixel acting as a kind of exponent for the RGB bits of each pixel.
Or an HSL expansion, where one would need even fewer additional bits. There are many methods that can easily achieve a very wide HDR range.

I guess they used something more bandwidth-hungry.
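For what it's worth, the "extra 8 bits as an exponent" idea is essentially the Radiance RGBE encoding. A minimal Python sketch of that scheme (function names are made up for illustration):

```python
import math

def rgb_to_rgbe(r, g, b):
    """Pack linear float RGB into four bytes: three 8-bit mantissas plus a
    shared 8-bit exponent (the Radiance .hdr / RGBE scheme)."""
    m = max(r, g, b)
    if m < 1e-32:                      # effectively black
        return (0, 0, 0, 0)
    _, e = math.frexp(m)               # m = f * 2**e with 0.5 <= f < 1
    scale = 256.0 / 2.0 ** e           # maps the largest channel into [128, 256)
    return (int(r * scale), int(g * scale), int(b * scale), e + 128)

def rgbe_to_rgb(r8, g8, b8, e8):
    """Inverse mapping back to linear float RGB."""
    if e8 == 0:
        return (0.0, 0.0, 0.0)
    f = 2.0 ** (e8 - 128) / 256.0
    return (r8 * f, g8 * f, b8 * f)
```

The shared exponent buys an enormous dynamic range for only 8 extra bits per pixel, at the cost of relative precision in the darker channels.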
     
  3. Timothy D Smith

    Timothy D Smith Member

    Messages:
    12
    Likes Received:
    4
    GPU:
    Nvidia gtx 1080ti
I can't believe that the interconnect is STILL the limiting factor. People are paying tons of money for GPUs and monitors, and they are bottlenecked by a cheap cable. I would love to get a 4K, HDR, high-refresh-rate monitor, but I will wait.
     
  4. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,759
    Likes Received:
    9,647
    GPU:
    4090@H2O
    Early adopter syndrome... this happens when display tech is ahead of other factors (here, connections).

    Couldn't USB-C work for this? I don't know the numbers tbh
     

  5. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
    Didn't they say "visually lossless compression" was supported by 1.4?

    Anyways, it's a good idea to skip this generation and wait for DP 1.5 or Gsync with HDMI 2.1 support.
     
    fantaskarsef and eGGroLLiO like this.
  6. Silva

    Silva Ancient Guru

    Messages:
    2,051
    Likes Received:
    1,201
    GPU:
    Asus Dual RX580 O4G
I'll wait for a 16:9, 23/24'' FreeSync 2 monitor below 200€. Thank you.
     
    HitokiriX likes this.
  7. GameLord

    GameLord Member

    Messages:
    21
    Likes Received:
    5
    GPU:
    GTX 980 G1 / 4 GB
4:2:2 looks very bad on my UHD TV. Only 4:4:4 or RGB is the way to go.
     
    HonoredShadow likes this.
  8. WhiskeyOmega

    WhiskeyOmega Master Guru

    Messages:
    291
    Likes Received:
    26
    GPU:
    RTX 3090 MSI GXT
It's not even a 10-bit panel. It's an 8-bit panel with processing.
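That "processing" is FRC (frame rate control): the panel flickers a pixel between two adjacent 8-bit levels over successive frames so the time-average approximates a 10-bit level. A rough Python sketch of the idea (the helper name is invented):

```python
def frc_frames(v10, n=4):
    """Approximate a 10-bit level (0-1023) on an 8-bit panel by temporal
    dithering: show the base 8-bit level, plus one extra step on just
    enough of the n frames that the time-average hits the target."""
    base, frac = divmod(v10, 4)        # 10-bit value = 8-bit base * 4 + 2-bit fraction
    return [min(base + (1 if i < frac else 0), 255) for i in range(n)]
```

For example, the 10-bit level 513 becomes the 8-bit frame sequence [129, 128, 128, 128], which averages to 128.25, i.e. 513/4.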
     
  9. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
    Yes we all know you want the very best for pennies.

    Anyways. There are already Freesync 1 monitors for that price.

Why do you want FS 2? FS2 adds HDR support and forces manufacturers to use LFC (which you can also find on FreeSync 1 monitors).

    Do you think you'll get a high refresh rate with HDR support for cheap?

You'll be waiting a long time, probably indefinitely. Even if they were that cheap, I can guarantee they would be junk monitors, as monitors at that price point generally are.
     
  10. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    Yes, early adopter syndrome and "smart" hardware manufacturers selling Ferraris without wheels. :D

USB-C could be a solution, but it's another huge mess by itself (keep a couple of Tylenol at hand... just in case):

    http://blog.fosketts.net/2016/10/29/total-nightmare-usb-c-thunderbolt-3/

    https://www.digitaltrends.com/computing/usb-c-implementation-messy-and-unclear/

HDMI 2.1 is the proper solution for TVs and monitors, and it won't arrive before Q3 2018...
     
    fantaskarsef likes this.

  11. H83

    H83 Ancient Guru

    Messages:
    5,510
    Likes Received:
    3,036
    GPU:
    XFX Black 6950XT
Selling 2,500€ monitors with these kinds of issues is a very bad joke :(
     
  12. Brit90

    Brit90 Member Guru

    Messages:
    124
    Likes Received:
    53
    GPU:
    R390X 8GB
Most 4K HDR movies stream at no more than 60 Hz, so there is a reason to their madness. I don't agree with what they did, but still: early tech is doing what it said it would.
I am sure the specs mention it as well (although I haven't checked).
Everything is always in the details, but if the specs say 4K HDR @ 144 Hz, then people will have a case against them.
     
  13. Pale

    Pale Member

    Messages:
    18
    Likes Received:
    10
    GPU:
    2 x GTX 1080 Ti
As mentioned above, isn't the panel actually an 8-bit panel with processing (8-bit + FRC)? Doesn't that mean you actually cannot see the difference between an 8-bit and a 10-bit image on this monitor? If that is the case, DP 1.4 supports up to 4K 120 Hz at 8-bit, whereas 10-bit is limited to 98 Hz, as the article says. But if you cannot see the difference on this panel, due to it being 8-bit + FRC and not true 10-bit, then I was thinking I'd just run it at 120 Hz and leave it at that.
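Those refresh-rate limits check out with back-of-the-envelope math: DP 1.4 (HBR3) carries about 25.92 Gbit/s of payload after 8b/10b line coding, and the uncompressed pixel stream needs roughly width × height × refresh × bits-per-pixel. A quick sketch (approximate, ignoring blanking overhead):

```python
def pixel_rate_gbps(w, h, hz, bpc, channels=3):
    """Approximate uncompressed video bandwidth in Gbit/s,
    ignoring blanking intervals (reduced blanking shrinks those anyway)."""
    return w * h * hz * bpc * channels / 1e9

DP14_PAYLOAD = 25.92   # HBR3: 32.4 Gbit/s raw minus 8b/10b coding overhead

print(pixel_rate_gbps(3840, 2160, 120, 8))    # 4K 120 Hz  8-bit RGB -> ~23.9, fits
print(pixel_rate_gbps(3840, 2160, 98, 10))    # 4K  98 Hz 10-bit RGB -> ~24.4, fits
print(pixel_rate_gbps(3840, 2160, 144, 10))   # 4K 144 Hz 10-bit RGB -> ~35.8, does not
```

Dropping to 4:2:2 halves the chroma samples (effectively 20 bits per pixel at 10 bpc), which is how these monitors squeeze 144 Hz through the link.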
     
  14. Solfaur

    Solfaur Ancient Guru

    Messages:
    8,013
    Likes Received:
    1,533
    GPU:
    GB 3080Ti Gaming OC
It was just a matter of time; I'd be surprised if there weren't more issues to come.
     
    fantaskarsef likes this.
  15. FeDaYin

    FeDaYin Guest

    Messages:
    72
    Likes Received:
    10
    GPU:
    MSI GTX 780 ref.
4K Blu-rays are 4:2:0, and I've never seen anyone upset about that.
Now 4:2:2 is not enough?
     

  16. Chess

    Chess Guest

    Messages:
    390
    Likes Received:
    57
    GPU:
    ASUS GTX1080Ti Stri
Am I naïve in thinking that... well, just use two cables from GPU to monitor?
I get that surround/multi-monitor enthusiasts will be left out in the cold, but it'd be a good temporary solution, I think?

I remember that Dell 8K monitor used 2 or even 4 DP cables.
     
  17. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,541
    Likes Received:
    18,853
    GPU:
    AMD | NVIDIA
It's a different level of source material. Apples versus oranges and the like. 4:2:2 and 4:2:0 are indeed commonly applied to Blu-rays (H.265 as well). However, movie material suffers little from it, as it's always a bit more washed out from compression anyway and thus the effect is less likely to be noticed. The fine detail you see in things like thin lines, filled polygons and text in a game, for example, will just look much fuzzier and more washed out due to color compression.

I have yet to see it for myself though, and the fact that these monitors are 27" will make the effect less visible due to the pixel density. But the fact that people who purchased such a monitor actually noticed it and complained about it when switching to 120 or 144 Hz says enough.

What's worse, the display manufacturers simply did not include this info in their spec sheets, while it clearly is a compromise they proactively made.
     
    yasamoka, Solfaur and fantaskarsef like this.
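The point about thin lines and text can be seen directly in how 4:2:2 works: luma is kept per pixel, but each horizontal pair of pixels shares one chroma sample, so a one-pixel-wide colored detail bleeds into its neighbour. A toy Python sketch over one scanline of (Y, Cb, Cr) tuples (names invented for illustration):

```python
def subsample_422(row):
    """4:2:2 on one scanline of (Y, Cb, Cr) pixels: keep every luma sample,
    replace each horizontal pair's chroma with the pair average."""
    out = []
    for i in range(0, len(row), 2):
        pair = row[i:i + 2]
        cb = sum(p[1] for p in pair) / len(pair)
        cr = sum(p[2] for p in pair) / len(pair)
        out.extend((p[0], cb, cr) for p in pair)
    return out

# A 1-pixel-wide "red" detail next to a neutral pixel: after 4:2:2 both
# pixels end up with the same averaged chroma -- the colored edge smears.
row = [(200, 128, 220), (200, 128, 128)]
print(subsample_422(row))
```

Since film is already soft from lens blur and lossy encoding, this averaging is barely visible there; on the hard single-pixel edges of rendered text and UI it is exactly the fuzziness described above.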
  18. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,759
    Likes Received:
    9,647
    GPU:
    4090@H2O
Thanks for the reads. Yes, it looks like, once again, an industry that doesn't stick to a standard kills it off right away.
     
  19. GxCx

    GxCx Guest

    Messages:
    17
    Likes Received:
    0
    GPU:
    burned
GTX can output just 8-bit; Quadros are for 10-bit display.
Even with SDI, you can only display 8-bit with a GTX ))
     
  20. Memorian

    Memorian Ancient Guru

    Messages:
    4,021
    Likes Received:
    890
    GPU:
    RTX 4090
GTX can output 10-bit colour in DirectX applications. You need a Quadro for 10-bit in OpenGL.

    As for the limitation of the monitors and their price..

     

Share This Page