8bit vs 10bit Displays

Discussion in 'Computer Monitor Forum' started by Reddoguk, Apr 8, 2021.

  1. Reddoguk, Ancient Guru (GPU: RTX3090 GB GamingOC)
    So boys and gals, I just dropped £4300 on a new PC because it was the only way to get decent hardware in these tough times.

    These are the PC's components, all picked because they had some "builders stock" or were in normal stock.

    Corsair 5000D Airflow Tempered
    8Pack Team Group Edition 32GB CL16 kit (2x16) 3600MHz
    Corsair MP400 4TB Gen 3 M.2
    Noctua NH-D15 Pure Black
    Gigabyte RTX 3090 Gaming OC <-- wanted a 3080 but not available.
    Gigabyte X570 Aorus Ultra
    AMD Ryzen 7 5800X <-- wanted a 5900X but again not available.
    Seasonic PX-850 80 Plus Platinum
    Noiseblocker NB eLoop X-B14 Black Edition x2
    Noiseblocker NB eLoop X-B12-PS Black Edition x1
    Thermal Grizzly Kryonaut Thermal Paste 5.5g/1.5ml

    It's a lot of money, granted, but at least with a build like this you get a 3-year C&R warranty and cable management too. It's probably £700 overpaid, mostly because the Build Stock 3090 was £2200.

    Since I was forced to go with a 3090, I'm in need of a decent monitor. That's where it gets hard for me, because I know 0 about monitors apart from buzzwords like local dimming and quantum dot or Nano dot.

    So let's rule out 4K 60Hz, I don't want that, and 4K 144Hz is out of my price range. So it's 1440p and high refresh (144-240Hz), but should I go HDR10 or HDR400/600?

    I think 32" is too big, so it's 27". Most monitors I see in these specs are low nits, like 350, but still rated HDR10. I thought HDR needed 400+ nits?

    TN is out, so it's IPS or VA or SSIPS. The more I read, the more I think HDR 10-bit is not worth it, and is it even real 10-bit or fake 10-bit? People buy 10-bit monitors and switch them to 8-bit for gaming? So what's the point in buying an HDR10 monitor if you only enable it for movies or content creation?

    I've looked at 27" monitors for days now and my budget is £400-£600, but I'm lost on what to get. There are the G5s and G7s from Samsung, and the LG UltraGears; both look decent, but they are older tech with no local dimming and such.

    I really need someone with decent previous experience of owning these types of monitors to help me out, since I'm going from a Maxwell GTX 980 to an RTX 3090 lol.
     
    Last edited: Apr 8, 2021
  2. DannyD, Ancient Guru (GPU: MSI 2080ti)
    Solid build my man, love that PSU (and that GPU :p). :)
     
  3. Reddoguk, Ancient Guru (GPU: RTX3090 GB GamingOC)
    Thanks man, but I still have 20 working days until it's delivered, so I have a while to decide on the monitor to go with this beast.

    I was pretty sold on the 27" G7 with its HDR600, but it has a lot of 1-star reviews because of some flickering issues. So I'm back to square one.

    I realise this forum is probably hardly read, but I had to try asking someone.
     
  4. DannyD, Ancient Guru (GPU: MSI 2080ti)
    I lose 10-bit colour when I OC my monitor from 60Hz to 72Hz; I do miss the 10-bit though.
    Choosing a monitor has got to be the most difficult and stressful part of a build, so many issues like dead pixels etc.
    I'd go for a nice Samsung Odyssey from Amazon or something similar, but make sure it's from Amazon, seeing as it's a monitor.
    Best returns policy in the industry imo.
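    Losing 10-bit at a higher refresh rate is usually a link-bandwidth limit: more pixels per second times more bits per pixel eventually exceeds what the cable standard can carry. A back-of-the-envelope Python sketch (the blanking-overhead factor and the DP 1.2 payload figure are rough approximations, not exact mode timings):

```python
# Rough link-bandwidth check: why a monitor can drop from 10-bit to 8-bit
# colour when the refresh rate is pushed up. Illustrative numbers only.

def video_bitrate_gbps(width, height, refresh_hz, bits_per_channel,
                       blanking_overhead=1.2):
    """Approximate uncompressed video bitrate in Gbit/s.

    blanking_overhead is a rough factor for horizontal/vertical
    blanking intervals (real timings vary per mode).
    """
    bits_per_pixel = 3 * bits_per_channel          # RGB, no chroma subsampling
    pixels_per_sec = width * height * refresh_hz * blanking_overhead
    return pixels_per_sec * bits_per_pixel / 1e9

# DisplayPort 1.2 (HBR2) carries roughly 17.28 Gbit/s of video payload.
DP12_GBPS = 17.28

for bpc in (8, 10):
    rate = video_bitrate_gbps(2560, 1440, 144, bpc)
    verdict = "fits" if rate <= DP12_GBPS else "does NOT fit"
    print(f"1440p144 @ {bpc} bpc: ~{rate:.1f} Gbit/s -> {verdict} in DP 1.2")
```

    With these assumptions, 1440p144 at 8 bpc fits in DP 1.2 while 10 bpc does not, which is the same squeeze that forces a bit-depth drop when overclocking 60Hz to 72Hz on a link that was already near its limit.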
     

  5. Mufflore, Ancient Guru (GPU: Aorus 3090 Xtreme)
    Specs for HDR brightness are HDR400, 600, 1000...
    I don't recommend anything less than HDR1000 unless you get an OLED.
    My TV is HDR2000 and is superb. My next TV will be HDR4000 or higher.
    Many films are mastered at HDR2000 or higher.
    PS: the HDR max brightness is only for small HDR objects like a sun, lightbulb, brightly lit object etc.; it isn't an overall screen brightness.
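    To put numbers on "mastered at HDR2000": HDR10 uses the PQ curve (SMPTE ST 2084), which maps 10-bit code values to absolute luminance, up to 10,000 nits. A quick Python sketch of the standard EOTF formula (constants straight from ST 2084):

```python
# SMPTE ST 2084 (PQ) EOTF: HDR10 code values -> absolute luminance in nits.
M1 = 2610 / 16384              # 0.1593...
M2 = 2523 / 4096 * 128         # 78.84375
C1 = 3424 / 4096               # 0.8359375
C2 = 2413 / 4096 * 32          # 18.8515625
C3 = 2392 / 4096 * 32          # 18.6875

def pq_to_nits(code, bits=10):
    """Convert an n-bit PQ code value to luminance in cd/m2 (nits)."""
    n = (code / ((1 << bits) - 1)) ** (1 / M2)
    return 10000 * (max(n - C1, 0) / (C2 - C3 * n)) ** (1 / M1)

# Roughly: half the code range is only ~92 nits, ~75% is ~1000 nits,
# and full code is 10,000 nits - the scale is very top-heavy.
for code in (0, 512, 769, 1023):
    print(f"10-bit code {code:4d} -> {pq_to_nits(code):8.1f} nits")
```

    This is why peak-nits numbers like 1000 or 2000 are meaningful absolute targets for HDR highlights, unlike SDR's relative gamma brightness.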

    HDR10 is not a brightness spec, it's the number of bits used per colour,
    i.e. a future spec is HDR12. In some situations 8-bit with RGB can be used (instead of 10-bit with YCbCr); I've used this on Windows 7 on an HDR10 TV so I didn't have to keep switching between 8/10-bit and RGB/YCbCr. 8-bit might show some banding though.
    In Windows 10 it doesn't appear simple to use HDR 8-bit RGB; it will default to 10-bit YCbCr when HDR is enabled (Windows 7 doesn't have an HDR enable feature, it is set per app and often won't work).
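    To make the bit-depth difference concrete: 8-bit gives 256 levels per channel, 10-bit gives 1024, so a subtle gradient gets roughly 4x as many distinct steps at 10-bit, which is exactly where 8-bit banding shows. A small illustrative Python sketch:

```python
# 8-bit vs 10-bit per channel: count the distinct steps each bit depth
# produces across a subtle 5% brightness ramp (e.g. a dark sky gradient).

def quantize(value, bits):
    """Quantize a 0.0-1.0 signal to an n-bit code and back."""
    levels = (1 << bits) - 1        # 255 for 8-bit, 1023 for 10-bit
    return round(value * levels) / levels

print(1 << 8, "levels per channel at 8-bit")
print(1 << 10, "levels per channel at 10-bit")

ramp = [0.10 + 0.05 * i / 999 for i in range(1000)]
for bits in (8, 10):
    distinct = len({quantize(v, bits) for v in ramp})
    print(f"{bits}-bit: {distinct} distinct steps across the ramp")
```

    Fewer distinct steps across the same ramp means each step is a bigger, more visible jump: that's the banding.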

    An 8-bit non-HDR display will not be able to perform as an HDR10 display without trickery, some pain, and some luck that it's good enough.
    If you want the HDR experience, get a 10-bit+ HDR display; 10-bit is the current HDR standard.
    There is no "worth it" or not.

    A 10-bit HDR display can be 8-bit with dithering. It will still be classed as an HDR10 display because it simulates 10 bits.
    There aren't many of these now, so it's not really worth caring about.
    Although some displays would benefit from dithering, because they have occasional banding even though they are 10-bit.
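    The dithering trick (often sold as "8-bit + FRC") can be sketched in a few lines: alternate between the two nearest 8-bit codes over successive frames so the time-average lands on the 10-bit target. Real panel FRC hardware is more sophisticated; this Python sketch just shows the principle:

```python
# Temporal dithering (FRC) sketch: an 8-bit panel fakes a 10-bit level by
# flickering between the two nearest 8-bit codes at the right ratio.

import random

def frc_frame(target_10bit):
    """Pick an 8-bit code for one frame so frames average to the target."""
    base, frac = divmod(target_10bit, 4)   # 4 ten-bit steps per 8-bit step
    # Emit the higher code with probability frac/4, else the lower one.
    return min(base + (random.random() < frac / 4), 255)

target = 513                               # a 10-bit level between 8-bit codes
frames = [frc_frame(target) for _ in range(100_000)]
avg_as_10bit = 4 * sum(frames) / len(frames)
print(f"8-bit frames average to ~{avg_as_10bit:.1f} (target {target})")
```

    The eye averages the flicker, so the perceived level sits between the two 8-bit codes, which is why such panels can legitimately be driven as 10-bit.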

    You don't switch a display to 8 or 10-bit; it is set in the OS/video driver. The display does as it is told.

    HDR isn't just for content creation and video; many games use it and are better for it.
    Some games have fake HDR that doesn't look good; these are better with HDR turned off.
    Older games also don't use HDR, so they will display in 8-bit colour.
    My Samsung TV has its own fake HDR mode that is really very good; I use it for everything that doesn't have real HDR, including TV. It is almost permanently on.
    It's not as good as true HDR done well, but it's better than SDR, to my eyes at least ;)
     
    Last edited: Apr 8, 2021
  6. Mufflore, Ancient Guru (GPU: Aorus 3090 Xtreme)
    Forgot to add, I use the Samsung 49CRG9 ultrawide for gaming.
    HDR on this is very bright, but as you pointed out, the small number of dimming zones is not so good.
    Black brightness with HDR is not very good, but it can still give a decent HDR experience.

    If I didn't have an HDR TV for movies I would be miffed though, because some movies don't fill the height; the player appears to default to 1080p height, meaning not only black bars left/right (which is OK imo) but black bars above and below as well!
    I found zooming in seems to work well, but it's a pita.
     
  7. Reddoguk, Ancient Guru (GPU: RTX3090 GB GamingOC)
    I like the look of this monitor; it has good reviews and isn't too expensive. Granted it's from 2019, but it has decent specs and HDR10.

    LG 27GL850-B 27" 2560x1440 Nano IPS 144Hz 1ms FreeSync/G-Sync Compatible Widescreen LED Gaming Monitor.

    This is the only monitor I like the looks of overall. Plus it's come down from £440 to £379. I think I'll just buy a 4K 50/60Hz TV for movies and either this monitor or the Samsung G5/G7 27" for gaming.

    It's still a massive upgrade for me, going from a 24" 1080p 75Hz monitor run by a 2600X and a poor old GTX 980 G1 from 2014. 7 years and still going, but it could hardly do 75fps @ 1080p anymore in some games.

    Now I need to read up on DLSS and RTX stuff, because I've never had an interest in ray tracing before now. I even had to search for RT titles online, and I'm not impressed one bit by the size of the list: just 27 RT games, of which I'll play about 4. Hopefully new games like F1 2021 and others all have it day one.
     
  8. metagamer, Ancient Guru (GPU: Palit GameRock 2080)
    In general, HDR on PC monitors is pretty awful in comparison to proper HDR/DV done by OLED TVs, and that includes DisplayHDR 1000. This is coming from someone who has a DisplayHDR 1000 monitor and an OLED TV.

    DisplayHDR 400/600 is just something I would ignore. Even DisplayHDR 1000 on my monitor (Samsung Odyssey G9) is something I've tried and quickly switched off. We're talking about a £1000 monitor that has a handful of dimming zones, a fraction of what TVs offer, and then there's OLED, which is just on another level. And that is one of the best, if not the best, HDR monitors on the market, and it's still a joke.

    Where in the UK are you? I have this monitor all boxed up in the back of my wardrobe doing nothing. If you're close to me, I could hop in my car and let you have it on the cheap, like £250. I have no use for it. This is the monitor: AOC AGON Gaming AG271QG - 27 Inch QHD Monitor, 165Hz, 1ms, IPS, G-Sync, AMD FreeSync, USB Hub, Ergonomic stand, Speakers (2560x1440 @ 165Hz, 350 cd/m², HDMI/DP/USB 3.0): Amazon.co.uk: Computers & Accessories

    Anyway, my advice would be to make sure you buy the largest monitor you can possibly fit on your desk, and take it from there, considering all the options in that size.

    EDIT: @Mufflore do you have any links to these HDR4000 panels?
     
    Last edited: Apr 9, 2021
  9. Reddoguk, Ancient Guru (GPU: RTX3090 GB GamingOC)
    That AOC monitor is nice, but it's DP 1.2 and I want one with DP 1.4 and HDMI 2.0. Plus that monitor is 4ms according to Amazon, not 1ms.

    The 1ms in the title is incorrect or misleading, so you'll want to remove that if you intend on selling it.
     
    Last edited: Apr 10, 2021
  10. Chrysalis, Member Guru (GPU: RTX 3080 FE)
    I don't think HDR is worth it on PCs at the moment. To me colour accuracy is important; I don't want, e.g., to be watching a movie and have someone wearing a red coat look like it's glowing due to over-saturated colour.

    This is because Windows doesn't have proper colour management yet, so sRGB content isn't automatically mapped correctly on HDR screens.

    About the only content where I enjoy it is probably anime/cartoony stuff.
     

  11. Mufflore, Ancient Guru (GPU: Aorus 3090 Xtreme)
    HDR isn't used unless the media being played is HDR,
    and then it won't look as you suggest.
    Non-HDR movies will look exactly as they should normally.

    The colour-accuracy argument stands in favour of HDR.
    Displays without it are missing a whole spectrum of brighter colours that are present in real life, which is why those displays cannot show HDR movies correctly.
     
  12. Mufflore, Ancient Guru (GPU: Aorus 3090 Xtreme)
  13. metagamer, Ancient Guru (GPU: Palit GameRock 2080)
    That must be some marketing mumbo jumbo; that panel has a peak brightness in a 2% window of roughly 1700 nits. Your standard current-gen Samsung QLEDs will be hitting close to 1500 nits, so it's not really anything groundbreaking.

    EDIT: My bad, I was looking at the Q900, not the Q950. According to what I've seen, the Q950 can hit 3800 nits. That's crazy, but I don't believe it, as the 55" Q950R is £1499, which is the regular price of a QLED TV, so it's weird that it would be that much brighter.

    EDIT2: I haven't found a review from a respectable source for the Q950R, but from the comparison videos against a standard OLED (LG CX) in HDR content, OLED is miles better. In fact, a 4-year-old OLED like a C6 is better than this Q950R.
     
    Last edited: Apr 10, 2021
  14. Undying, Ancient Guru (GPU: Aorus RX580 XTR 8GB)
    What about gaming? I've heard Doom is amazing and has good HDR support. And what about 500 nits, is that even considered capable?
     
  15. metagamer, Ancient Guru (GPU: Palit GameRock 2080)
    On a PC monitor, no. It will look worse than HDR off, most definitely.

    I think Mufflore seems to think that peak brightness is the be-all and end-all. I understand where he's coming from: the more peak brightness, the better. But in reality, OLEDs with 600 nits still look miles better than these 4000-nit QLEDs in HDR. The thing is, when you have an OLED that can produce real blacks, brightness only matters so much. From what I've seen of these high-peak-brightness Q950R TVs, they deliver bright highlights, but they often blow them out, and they don't have the blacks to complement the brightness. OLED is just so much better. Honestly, I'll say this: OLED must be seen to be believed. It puts a smile on your face. @Undying you'll hate me for saying this, but it just works (like Nvidia said about ray tracing).

    My 1000-nit monitor (Odyssey G9) is nowhere near my OLED TV, which has under 700 nits peak brightness. It's not even close; it shouldn't even be called the same thing.

    HDR on OLED looks ridiculous, and when I say HDR, I mean HDR10 and, more importantly, Dolby Vision. It's that good.

    Imagine watching a film on an OLED in a dark room, and the film has a transition with a couple of seconds of just black between two scenes. You're watching the film, and when it goes black, your TV disappears in front of your eyes, because all of the pixels switch off. Has to be seen to be believed, but it's insane.

    OLED is just wow. Shame I still don't trust OLEDs as a daily PC monitor. I'll keep using my G9 for a couple of years, maybe up to 3-4 years, and by then the anti-burn-in technology will most probably make me confident enough to use an OLED as my main monitor. To use an OLED as a TV, I'm already super confident that I won't get burn-in, knowing how to prevent it. I just don't quite trust it as a PC monitor. But one day, sure, why not.
     
    Last edited: Apr 11, 2021

  16. Mufflore, Ancient Guru (GPU: Aorus 3090 Xtreme)
    You assume too much.
    The following is meant for anyone reading this, not aimed at you.

    Black level and uniformity are very important, but that doesn't mean displays without OLED capability are not worth using.
    OLED has a major drawback: lack of max brightness.
    This means you need a dark room to get the best from it, and even then only a few displays get near HDR1000, and all OLEDs have very strong limiters to prevent near-peak brightness being used for long.
    That's not to say OLED displays with lesser max brightness are not a good experience; it depends what you are OK with.
    Colour volume also matters a lot; the high-end displays these days all have a handle on this.

    My Q9FN TV is wonderful; HDR2000 (or as close as) is worth having when done well.
    It won't suffer burn-in, so if that matters to you, a QLED is worth considering.
    It has a lot of dimming zones, though more is always better. Yet I am not looking to replace it with an OLED, simply because it is so close to all I want.
    What I am holding out for are the new miniLED and (a bit later) microLED displays.
    These will have much higher peak brightness while not burning in, plus great black levels.
    The holy grail.

    Current PC displays can't give the best HDR experience, but it can still be a good one.
    For example, I won't play Cyberpunk without HDR on, despite the black levels on my CRG9 ultrawide not being that great.
    Be aware of this though:
    a lot of bad HDR experiences with Cyberpunk and other HDR games were caused by the Nvidia drivers at the time of the 30xx card releases leaving HDR black as grey. It took some time for them to fix this (or you could roll back to an earlier driver); it was one reason I gave Cyberpunk a break.

    It pays to go and see some HDR displays for yourself and decide what matters to you.
    Don't write off PC HDR unless you are sure it's not for you; you can use a decent TV or choose a monitor well.
    Everyone is different, and it's a shame to waste HDR when it could be just what you want.
    And bear in mind, if you buy an HDR PC display, you don't need to use it for all HDR games; use it for those it works well with.
    There is no downside if you don't enable HDR, other than not having HDR: colours render correctly as if it isn't an HDR display.
     
  17. metagamer, Ancient Guru (GPU: Palit GameRock 2080)
    @Mufflore You say I assume too much, but I don't. It'll come down to how bright your room is. That's one of the first things anyone should consider. If you are watching your TV in a dark room, OLED will blow away a QLED every time. If you need a TV for a brightly lit room, QLED will be better.

    I personally will always pick an OLED for my viewing needs, because closing the curtains is dead easy and I'll get a better image. Also, I live in the UK, so it's not like we get a lot of sunny days, plus my work schedule means I often watch TV when it's dark outside. OLEDs are less bright, yes, everyone knows that, but the dynamic range is through the roof because of the infinite contrast of OLEDs.

    I find it weird that you're telling me "if I buy a HDR PC display, I don't need to use it for all HDR games". I do have an OLED TV and I do have an Odyssey G9. And I have tried HDR in a few games on the Odyssey G9 and I never turned it back on again, because it looks like dog. And yes, I tried Cyberpunk with HDR on and I switched it off right away. Maybe I'm f*cking weird, I don't know.

    I still struggle to understand what you're trying to say. I never said LCD TVs are not worth using. I just don't see myself needing a TV with a peak brightness of 4000 nits or more. No chance, unless I lived in Spain and used it outside.

    That being said, Mini LED looks great, but yeah, the price. It'll take years for it to become competitive in the mainstream.

    For now, I'll stick to my OLED for HDR content and use my Odyssey G9 hardware-calibrated to 6500K, 120 nits and 2.2 gamma. Happy days.

    BTW, if you have any suggestions for games that will look great in HDR on my Odyssey G9, I'm all ears. I've tried a few, including Cyberpunk, but maybe there are some games I've missed, so I'd appreciate your input.

    I don't mean to sound like an arse btw, and after reading over this post, I find that perhaps I'm coming across a bit aggressive; that is not my intention. I just wanted to put my point forward. It's obvious that we value different things and use our TVs differently, which is perfectly fine.
     
  18. Mufflore, Ancient Guru (GPU: Aorus 3090 Xtreme)
    Yes, you did.
     
  19. metagamer, Ancient Guru (GPU: Palit GameRock 2080)
    Not really, no.
     
  20. Mufflore, Ancient Guru (GPU: Aorus 3090 Xtreme)
    I stated "The following is meant for anyone reading this, not aimed at you."
    Part of your reply is
    "I find it weird that you're telling me "if I buy a HDR PC display, I don't need to use it for all HDR games""
    I wasn't telling you how to think, I was presenting my view to everyone.
    You are after a battle.

    It looks like you only see OLED as the solution.
    I presented a way for those that want to see for themselves, and what I use as the solution.
    There's nothing wrong with OLED as long as you can manage the damage a PC can do.
    I cannot recommend a PC user buy an OLED display unless they are capable of managing it well.

    I'm not opposed to this if you understand the situation clearly enough; I used a Panasonic plasma for a long time with my PC and it suffered no damage.
    I passed it on to cousins who are still using it with no problems.
    But, from what I have read, it's far easier to damage an OLED than a late plasma TV.
     
