Since when did nVidia allow 10 bit colour on non-pro cards?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Neo Cyrus, Aug 15, 2015.

  1. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,777
    Likes Received:
    1,388
    GPU:
    黃仁勳 stole my 4090
    Title. Did they recently add this option, is it Windows 10 only, or did I somehow just not notice it and it's been around for a while? It's nice to see nVidia now matching AMD and allowing me to use my damn monitor's full colour range... too bad nearly everything on the internet is 8 bit.
     
  2. edigee

    edigee Active Member

    Messages:
    63
    Likes Received:
    0
    GPU:
    Zotac GTX 960/GT 820M
    It depends on the display, not the OS.
    For instance, I recently purchased a Samsung LT32E210EW TV/monitor, and the old system it's connected to (a GT 630!) sees it as a 12-bit-capable display.
    Driver 353.49, Windows 7 32-bit.
     
  3. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti

    A couple of driver revisions ago... 352.86, I believe. As far as I know, Vista on up.
     
    Last edited: Aug 16, 2015
  4. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,777
    Likes Received:
    1,388
    GPU:
    黃仁勳 stole my 4090
    1 - It was always artificially locked through the drivers in the past regardless of whether or not the hardware was capable. Just as wireframe acceleration for professional work is probably still crippled through drivers.

    2 - I recently switched to Win 10 and only noticed the new option now, so I didn't know if they artificially limited it by OS.

    3 - There's no such thing as a 12-bit colour depth TV; whatever you're thinking of or referring to is not colour depth. It's a TV, so most likely 6-bit + dithering at best. A lot of LCD-type displays in the past were 4-bit + dithering.

    Alright, thanks.
     

  5. Terepin

    Terepin Guest

    Messages:
    873
    Likes Received:
    129
    GPU:
    ASUS RTX 4070 Ti
    He's referring to deep color, which is 10-bit per channel and above. A true 10-bit monitor costs a fortune, so a 12-bit TV is just a marketing gimmick.
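    (For scale: 8 bits per channel is 2^8 = 256 levels, about 16.7 million colours in total; 10 bits per channel is 2^10 = 1024 levels, about 1.07 billion; 12 bits is 2^12 = 4096 levels, roughly 68.7 billion.)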
     
  6. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,777
    Likes Received:
    1,388
    GPU:
    黃仁勳 stole my 4090
    There is no such thing as a TV with more than 10 bits per colour. In fact, I've never even heard of an actual 10-bit TV. "True" 10-bit monitors haven't been a thing in ages. These days they are all 8-bit + FRC, just as the "8-bit" monitors are 6-bit + FRC. Considering it gives the same result, it doesn't matter.
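    As a rough sketch of what FRC (frame rate control, i.e. temporal dithering) does, assuming a 10-bit target shown on an 8-bit panel: the panel alternates between the two nearest displayable steps on successive frames so the time-averaged level approximates the finer value. The function below is purely illustrative, not how any particular panel implements it; the same idea is what lets a 6-bit + FRC panel approximate 8-bit input.

    // Purely illustrative FRC sketch: approximate a 10-bit level (0..1023) on an
    // 8-bit panel (0..255) by alternating between adjacent steps across frames.
    #include <cstdint>

    uint8_t FrcLevelForFrame(uint16_t target10, uint32_t frameIndex)
    {
        uint8_t base     = static_cast<uint8_t>(target10 >> 2); // nearest lower 8-bit step
        uint8_t fraction = target10 & 0x3;                      // leftover 2 bits (0..3)
        if (base == 255)
            return 255;                                         // already at the top step
        // Show "base + 1" on 'fraction' out of every 4 frames and "base" on the rest,
        // so the 4-frame average is base + fraction/4.
        return (frameIndex % 4) < fraction ? base + 1 : base;
    }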
     
    Last edited: Aug 16, 2015
  7. edigee

    edigee Active Member

    Messages:
    63
    Likes Received:
    0
    GPU:
    Zotac GTX 960/GT 820M
    [image: screenshot showing the driver reporting the display as 12-bit capable]
    I know it is definitely not a 10-bit display, just wanted to show the odd situation.
     
  8. Anarion

    Anarion Ancient Guru

    Messages:
    13,599
    Likes Received:
    386
    GPU:
    GeForce RTX 3060 Ti
    We have a similar Samsung at our summer cottage, and the 12-bit option is also there for AMD cards. I guess it may just reduce banding by dithering down to 8-bit (though I didn't have much time to test what the display actually does with 10-bit or 12-bit content).
     
  9. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,095
    Likes Received:
    2,601
    GPU:
    3080TI iChill Black
    I had this option with the early r350.xx drivers on Win 8.1, but then I tried the Quadro driver variant, it glitched something in the GPU, and now I don't have the option anymore, even in Win 10, heh.. :infinity:
     
  10. nevcairiel

    nevcairiel Master Guru

    Messages:
    875
    Likes Received:
    369
    GPU:
    4090
    No, it was not. It was always achievable, just very tricky before the options existed, and not always reliable.

    What is locked to "Pro" cards is 10-bit OpenGL; consumer cards, however, can do 10-bit through Direct3D.
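    As a rough illustration of the Direct3D route (a minimal sketch only; the helper name, window handle, and resolution are assumptions, and whether 10 bits actually reach the display still depends on the driver's output colour depth setting and the monitor), an application can request a 10-bit back buffer like this:

    // Minimal sketch: request a swap chain with 10 bits per colour channel.
    // Link with d3d11.lib; error handling kept to a bare minimum.
    #include <windows.h>
    #include <d3d11.h>

    bool Create10BitSwapChain(HWND hwnd, ID3D11Device** device, IDXGISwapChain** swapChain)
    {
        DXGI_SWAP_CHAIN_DESC desc = {};
        desc.BufferCount       = 2;
        desc.BufferDesc.Width  = 1920;                          // assumed resolution
        desc.BufferDesc.Height = 1080;
        desc.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM; // 10 bits per channel
        desc.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
        desc.OutputWindow      = hwnd;                          // an existing window
        desc.SampleDesc.Count  = 1;
        desc.Windowed          = TRUE;
        desc.SwapEffect        = DXGI_SWAP_EFFECT_DISCARD;

        HRESULT hr = D3D11CreateDeviceAndSwapChain(
            nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
            nullptr, 0, D3D11_SDK_VERSION,
            &desc, swapChain, device, nullptr, nullptr);
        return SUCCEEDED(hr);
    }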

    It's not really relevant how much the TV can display, but what the TV accepts. Many TVs perform a lot of image processing, and feeding them 12-bit instead of 8-bit can help, assuming your system actually outputs 12-bit and not just 8-bit padded with zeros to 12 bits (as most desktop software would).
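    To put the padding point in code (made-up helper names, just for illustration): left-shifting an 8-bit value into a 12-bit container only relabels the same 256 levels, so the extra gradations have to exist in the source material to begin with.

    #include <cstdint>

    // 8-bit value padded with zeros into a 12-bit range: still only 256 distinct steps.
    uint16_t PadTo12Bit(uint8_t v8)
    {
        return static_cast<uint16_t>(v8) << 4;                  // 0, 16, 32, ..., 4080
    }

    // Standard bit-replication expansion to the full 0..4095 range; the source still
    // carries only 256 distinct values, so no banding is actually removed.
    uint16_t ExpandTo12Bit(uint8_t v8)
    {
        return static_cast<uint16_t>((v8 << 4) | (v8 >> 4));
    }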

    PS: UHD TVs will have to be real 10-bit, or they'll look like crap on proper UHD material, as the UHD standards pretty much require it. But I'm not sure where the ones released so far stand; UHD/4K is still too early to judge, with no real content yet.
     

  11. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,777
    Likes Received:
    1,388
    GPU:
    黃仁勳 stole my 4090
    So, in other words, it was artificially locked through the drivers, which is what I said.
    That's the same as saying image quality is not relevant. No amount of image processing is going to actually compensate.
    They won't be 10-bit. They won't be 8-bit. They'll be 6-bit + whatever tricks, just as they are now.
     
  12. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    My Sony Bravia is a true 10-bit panel. But you're correct, most consumer TVs are 8-bit panels. This is a good read on what's going on in the consumer space:
    http://hdguru.com/whats-a-10-bit-tv-and-is-it-better/
     
  13. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    It's incorrect to claim that most TVs currently being sold are 6-bit + FRC panels displaying 8-bit content. Pretty much none are.

    That's a pretty low standard you have for TV panels my friend.

    The majority of the TV market is ruled by IPS/VA panels, and IPS has only had one period in its history where one of its variants (e-IPS) was 6-bit + FRC, a panel type known to be used for cost savings.
     
  14. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,777
    Likes Received:
    1,388
    GPU:
    黃仁勳 stole my 4090
    That's pretty strange considering anything you watch on a TV will be 8-bit anyway. Well, time to use it as a giant monitor. :wanker:
    Dunno how that's a low standard; 6-bit + FRC effectively displays the same thing as an 8-bit panel. Nearly all content is 8-bit. Nearly everything I've seen in ages has used FRC. Even my "10-bit" U2713H is an 8-bit + FRC AH-IPS. When FRC became common, I stopped seeing "true" X-bit panels of any kind.

    I'd say that's a relatively high standard; most TVs (unless it's changed in the last few years or something) used crap 6-bit panels without FRC, at least the ones I saw. Remember those old TNs with poor colour reproduction? 4-bit + every trick they could pull. And are you sure TVs commonly use IPS these days? I was under the impression the vast majority were some variant of VA.

    It sounds like TV panels have improved a lot in the last few years based on what you guys are telling me.
     
    Last edited: Aug 17, 2015
  15. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    LG seems to use IPS exclusively, given that it's LG's own technology. Sony mostly uses VA, and so does Samsung. Toshiba uses IPS, at least for some of its TVs, possibly most or all.

    Even a cheap 1366x768 32" LG TV I have at home uses LG IPS. It has weird viewing angles, but up front it's pretty accurate (measured with an i1DP colorimeter).

    TVs stopped using these tricks. 8-bit panels are abundant; they don't need to use 6-bit + FRC anymore.

    As for 10-bit, it's easy to tack FRC onto 8-bit rather than make a 10-bit panel. True 10-bit panels have been very expensive in the last few years. With 10- and 12-bit content becoming common, we should hopefully see native 10-bit panels becoming a whole lot more common.

    Are you sure there is even a 4-bit + FRC TN panel? I couldn't find anything online. I don't think 4-bit + FRC could be managed for any modern display.
     
    Last edited: Aug 17, 2015

  16. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,777
    Likes Received:
    1,388
    GPU:
    黃仁勳 stole my 4090
    I lose track of time, so I don't know exactly when it was, but yes, I'm certain that at one point 4-bit + some form of dithering TN panels were a thing, especially for monitors. I even remember a conversation where a friend was ranting about them.

    I don't know if they were advertised as 6-bit or what. It's pretty strange that there's no direct Google result; it makes me question my own memory. I was hoping to find out how they could get away with that, since even with any form of dithering the colours would still be abysmal. Then again, that's what those old TNs were: unreal bad.

    Personally I'm not bothered by the use of FRC if, in the end, it's all the same as far as we can see.
     
    Last edited: Aug 17, 2015
  17. Terepin

    Terepin Guest

    Messages:
    873
    Likes Received:
    129
    GPU:
    ASUS RTX 4070 Ti
    I have a native 8-bit monitor. Shocking, right?
     
  18. Martigen

    Martigen Master Guru

    Messages:
    534
    Likes Received:
    254
    GPU:
    GTX 1080Ti SLI
  19. edigee

    edigee Active Member

    Messages:
    63
    Likes Received:
    0
    GPU:
    Zotac GTX 960/GT 820M
  20. lmimmfn

    lmimmfn Ancient Guru

    Messages:
    10,491
    Likes Received:
    194
    GPU:
    4070Ti
    My LG34UC97 has 10-bit colour, but I don't get the option in the Nvidia CP, only 8-bit. Grrrr.
     
