8-bit vs 10-bit Displays

Discussion in 'Computer Monitor Forum' started by Reddoguk, Apr 8, 2021.

  1. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,660
    Likes Received:
    593
    GPU:
    RTX3090 GB GamingOC
    I was looking at the 34" ultrawides, but I really don't like/want a janky res.

    I had a monitor before with an odd resolution and it was annoying sometimes, because certain games didn't support that res and I had to put up with black bars or stretch the image, which just looked wrong.

    Is 3440 x 1440 widely supported? Do older games work at this res? What is watching movies like on a 34" 3440 x 1440 screen, and what res do you even watch movies at?
     
  2. Undying

    Undying Ancient Guru

    Messages:
    25,332
    Likes Received:
    12,743
    GPU:
    XFX RX6800XT 16GB
    Most games have 21:9 support today. If not, they usually get it patched in later on, like Medium did. I really want that ultrawide; it makes games so much more immersive, and normal monitors look so square afterwards. Once you see it you can't go back, but maybe that's just me.

    As for movies, one should have a TV for that. ;)
     
    Last edited: Apr 12, 2021
  3. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,660
    Likes Received:
    593
    GPU:
    RTX3090 GB GamingOC
    I'm gonna buy a 4K TV, so I shouldn't really worry about movies on a gaming monitor.

    Nearly all the 4K TVs I've looked at in the size/price bracket I want are only 50Hz panels, but I imagine that's more than fine for movies/TV.

    I'll have to research TVs too, because really I'm a monitor/TV noob whose best-ever monitor is an iiyama 1080p TN 24" 75Hz.

    And yeah, I came into a bit of money recently and I'm spoiling myself, but who'd have thought monitors are like that thing where you have a pain somewhere, start googling it, and the next thing you know you're convinced you have cancer and are dying.

    That's what monitors are like: the more you read, the more you realise monitors are crap atm, and choosing one without actually seeing it in the flesh is more akin to a lottery.

    If you get one without any of the common faults then you did well, but after a lot of reading it seems like most monitors have at least one or two downsides.

    I wonder if the Boss/HH has a gaming monitor, because he's probably seen a lot of different monitors in the flesh, but I'm guessing he has an OLED TV/monitor.
     
  4. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,691
    Likes Received:
    2,671
    GPU:
    Aorus 3090 Xtreme
    lol.

    They will be 60Hz.
    Some support 1440p120, my Q9FN does, a nice compromise.
     
    Last edited: Apr 13, 2021

  5. vf

    vf Ancient Guru

    Messages:
    2,184
    Likes Received:
    306
    GPU:
    ATi Radeon™
    These ultrawide screens are so amazing for open-world single-player games. Another display I'd like to get some day.
     
    Undying likes this.
  6. rm082e

    rm082e Master Guru

    Messages:
    717
    Likes Received:
    259
    GPU:
    3080 - QHD@165hz
    I finally picked up this 32" Dell on sale yesterday and spent about 6 hours playing Assassin's Creed with it. I was previously using a 32" (flat, 60Hz) BenQ. Even though it's a very similar display, it still feels like a significant upgrade:
    • I didn't think flat vs. curved mattered that much, even though I sit very close to the screen. I was wrong. Curved definitely feels better.
    • Even though it's a 10-bit VA panel like the BenQ, it's a better panel with more intensity to the colors.
    • I tried turning on HDR in Windows and playing with the HDR settings in Assassin's Creed - HDR looks terrible. It's brighter, but loses the contrast - blacks turn gray. I turned it off after 2 minutes and the image is much better without it.
    • High frame rates never seemed "worth it" to me in the past, given I play single-player games. But I haven't been able to shake the feeling that I'm missing out. I'm playing more Fortnite with my son these days, and running at 120-164Hz for a couple of hours yesterday did feel better than 60, for whatever that's worth.
     
  7. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,691
    Likes Received:
    2,671
    GPU:
    Aorus 3090 Xtreme
    You may need to calibrate the display when in HDR mode.
    Looks like brightness is too high.
    It should use independent settings for SDR and HDR modes.
     
  8. rm082e

    rm082e Master Guru

    Messages:
    717
    Likes Received:
    259
    GPU:
    3080 - QHD@165hz
    Okay, digging into the "Adjust desktop color settings" menu in the Nvidia Control Panel and futzing with the sliders there did get a better picture out of HDR. However, it still overall has the effect of cranking the brightness. Basically I had to crank the contrast to max to get the blacks back in line. Playing Assassin's Creed for an hour, it did look different, but I wouldn't say it's better. I also started to get a bit of a headache. I game in a dark room, so I feel like HDR may be a bit too much at this point.

    I'm going to roll with it turned off for now. I feel like this would be a different story if I were on an OLED panel, but I don't want a 4K TV at this point.
     
  9. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,660
    Likes Received:
    593
    GPU:
    RTX3090 GB GamingOC
    I just spotted this OLED TV that has my interest. https://www.argos.co.uk/product/8461359

    A 4K 55" 100Hz OLED TV for under £1000. I'm not sure if it's any good for full-time gaming though, and it's too big for close desk gaming, but it might be ideal for when I want to relax and play with a wireless controller from my sofa.

    Still want a high refresh rate monitor though, in 27" or 32".
     
  10. Chrysalis

    Chrysalis Master Guru

    Messages:
    373
    Likes Received:
    90
    GPU:
    RTX 3080 FE
    I feel like we're never going to see mainstream-market OLED monitors. To me, sticking a 50-inch screen on my desk for my PC is just, well, umm, weird. Not quite sure why people are buying 50-inch CX TVs for their PCs o_O.

    The problem seems to be a few things.

    1 - Current monitors are selling fine, which means manufacturers don't have a market incentive to jump to better tech.
    2 - Limited OLED manufacturing capacity, and the panels have primarily been used in very large TVs that people are willing to spend four figures on; as long as manufacturers can make that kind of money, that's where the panels will continue to go.
    3 - Burn-in. Quite how bad an issue this is in 2021 I don't know; the problem is that it takes time for people to discover whether it really is one. What I do know is that on my OnePlus 6 I enabled the artificial anti-notch feature for a year, then out of curiosity turned it off briefly, and there is very visible burn-in after only one year. Imagine that for the taskbar, and then seeing that burn-in whenever you play full-screen media or a game. Bear in mind that a phone screen is also off most of the time. So burn-in is a real issue, but as I said, it's possible they have working mitigations for it now.

    If a miracle does happen and we start seeing sub-£500, sub-30-inch OLED screens, it will likely unite the VA and IPS camps, as OLED combines the best of both. Currently IPS and VA users are split on which traits they consider the worst; I value viewing angles and hate black smear, so I use IPS and just accept IPS glow, and because I run my screens at low brightness the glow is mitigated anyway. As for TN, I cannot even tolerate it: on my laptop, which is only a small screen, it's impossible to view the entire screen without colour/contrast shift, the viewing angle is that bad. I have to choose whether the shift lands at the top or the bottom of the screen depending on where I position my head.
     
    Undying likes this.

  11. Skinner

    Skinner Master Guru

    Messages:
    589
    Likes Received:
    58
    GPU:
    Gigabyte RTX 4090
  12. Undying

    Undying Ancient Guru

    Messages:
    25,332
    Likes Received:
    12,743
    GPU:
    XFX RX6800XT 16GB
  13. Skinner

    Skinner Master Guru

    Messages:
    589
    Likes Received:
    58
    GPU:
    Gigabyte RTX 4090
    Yeah, it's also too much for me for the time being, but the image quality should be awesome, I can imagine.
     
  14. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,660
    Likes Received:
    593
    GPU:
    RTX3090 GB GamingOC
    It's true though, why would they charge that much for a 27" OLED when a 55" is a third of the cost? Even $1500 would be more fair, but 3000, wow, just wow.
     
  15. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    If you place it on your desk, a 55" flat panel will be too much for gaming; the sides will be far out.
    Checked the RTINGS comparisons yet?

    The 32" Odyssey G7 seems to be on top; alternatively you could look at the Gigabyte G32QC, ASUS TUF's 32" and the like.

    I have the G32QC and prefer it over the PG348Q and the 43" Philips Momentum.
    I find that the PG348Q looks somewhat milky (IPS); besides that, its 21:9 format gets too squeezed at only 34".
    The 43" Momentum is actually nice when mounted on an arm, but games that get hit hard at 4K don't look good when set to lower resolutions, and it can be a bit of an exercise to play shooters with the sides going out that far on the desk.
    The G7 should be better.
    Whatever you go for, make sure to buy your new monitor from a store where it is easy to return.

    EDIT
    I tried HDR on the two of them; I don't really like how it looks on the desktop and with SDR content, too bright and with fewer colors, unlike my TV, which handles it like a champ.
     
    Last edited: Apr 24, 2021

  16. Chrysalis

    Chrysalis Master Guru

    Messages:
    373
    Likes Received:
    90
    GPU:
    RTX 3080 FE
    I've seen an Asus monitor reviewed on YouTube, 32-inch 4K, with lots of dimming zones and really high peak brightness on IPS. The problems are that it has a 3k price o_O, which is insane, and also that even at that price it's still only HDMI 2.0.

    It looks like affordable high-quality screens are years away; for some reason the monitor manufacturers are really dragging this out.
     
  17. TheDigitalJedi

    TheDigitalJedi Ancient Guru

    Messages:
    3,958
    Likes Received:
    3,092
    GPU:
    2X ASUS TUF 4090 OC
    I'm currently using the LG C9 and CX. Gaming on an OLED is a visual blessing. The size is huge, but it really isn't that bad once the TV is mounted to the wall or if you have sufficient distance. OLED televisions are indeed costly, but I'm keeping these sets for years.

    The C9 has 4 full HDMI 2.1 ports with 48 Gbps of bandwidth, but Vincent Teoh of HDTVTest states the panel is 10-bit. I can choose 4K 120Hz at 12-bit, but I honestly can't tell the difference. The CX is almost the same as the C9 barring a few small differences.
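    For context on those numbers, here is a rough back-of-the-envelope check of how much uncompressed video data 4K 120Hz needs at each bit depth (a minimal sketch; blanking intervals and FRL encoding overhead are ignored, so real link usage is somewhat higher):

    ```python
    # Raw pixel data rate for a 4K 120Hz RGB signal at different bit depths,
    # compared against the HDMI 2.1 link maximum of 48 Gbps.
    # Blanking intervals and FRL encoding overhead are ignored here.

    def pixel_data_rate_gbps(width, height, refresh_hz, bits_per_channel):
        """Uncompressed RGB pixel data rate in Gbit/s."""
        return width * height * refresh_hz * bits_per_channel * 3 / 1e9

    for bpc in (8, 10, 12):
        rate = pixel_data_rate_gbps(3840, 2160, 120, bpc)
        print(f"4K 120Hz RGB {bpc}-bit: ~{rate:.1f} Gbit/s")

    # -> ~23.9, ~29.9 and ~35.8 Gbit/s, all within the 48 Gbps link, which is
    #    why the C9 can accept a 12-bit signal even though the panel is 10-bit.
    ```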

    In my opinion OLEDs deliver the best visuals for gaming.
     
  18. Undying

    Undying Ancient Guru

    Messages:
    25,332
    Likes Received:
    12,743
    GPU:
    XFX RX6800XT 16GB
    Yes, but it's a TV and can't fit on the desk. Until we see OLED in monitor form with a higher refresh rate, adaptive sync and ultrawide, I'm not interested. Also, HDR on OLED is very dim; brightness is its weakness.
     
  19. Radical_53

    Radical_53 Ancient Guru

    Messages:
    4,358
    Likes Received:
    212
    GPU:
    EVGA RTX 3080 FTW3
    It would normally be bright enough, but the size really is still an issue.
    I went with 144Hz 10-bit instead of 165Hz 8-bit and HDR really looks amazing.
    Of course, blacks and contrast aren't on par with OLED (direct comparison to my TV), but it's still the nicest graphical upgrade I've had in gaming for years.
    It's a necessity to calibrate though; at least brightness/gamma has to be set to get a nice picture. Windows is a mess compared to Xbox/TV.
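    That 165Hz 8-bit vs 144Hz 10-bit choice is typically a link-bandwidth tradeoff rather than a panel limit. A rough sketch, assuming a 3440 x 1440 panel on DisplayPort 1.4 without DSC (the exact resolution and connection aren't stated in the post):

    ```python
    # Why an OSD offers "165Hz 8-bit or 144Hz 10-bit": raw pixel data rate vs.
    # what a DisplayPort 1.4 link can carry (~25.92 Gbit/s payload, HBR3 x4).
    # Assumes 3440x1440 RGB without DSC; blanking overhead (roughly 5-15%) is
    # ignored here, and it is exactly that margin which pushes 165Hz 10-bit over.

    def pixel_rate_gbps(width, height, refresh_hz, bits_per_channel):
        return width * height * refresh_hz * bits_per_channel * 3 / 1e9

    for hz, bpc in ((165, 8), (144, 10), (165, 10)):
        print(f"{hz}Hz {bpc}-bit: ~{pixel_rate_gbps(3440, 1440, hz, bpc):.1f} Gbit/s")

    # -> ~19.6, ~21.4 and ~24.5 Gbit/s. The first two leave room for blanking
    #    under 25.92; the last one doesn't, so the driver makes you choose.
    ```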
     
  20. itpro

    itpro Maha Guru

    Messages:
    1,364
    Likes Received:
    735
    GPU:
    AMD Testing
    There is no difference "8-bit vs 10-bit". The panel matters, not the signal input/output. I use 240Hz 8-bit and it is exactly the same as 144Hz 10-bit.
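    For reference, the signal-side difference is easy to quantify, even if the panel and the content decide whether you can actually see it. A quick sketch:

    ```python
    # Per-channel levels and total RGB colours at each signal bit depth.
    # Whether the extra precision is visible depends on the panel (true 10-bit
    # vs. 8-bit + FRC dithering) and the content, as the post argues.

    for bits in (6, 8, 10, 12):
        steps = 2 ** bits        # levels per colour channel
        colours = steps ** 3     # total RGB combinations
        print(f"{bits}-bit: {steps:>4} steps/channel, {colours:,} colours")

    # 8-bit  ->   256 steps/channel, ~16.8 million colours
    # 10-bit -> 1,024 steps/channel, ~1.07 billion colours
    # The difference shows up mainly as banding in smooth gradients,
    # especially with HDR content.
    ```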

    OLED is as bright as any other panel, IF YOU DISABLE all the "eco" modes and the adaptive brightness meant to guard against burn-in. It gets really bright.
     
