After using this fantastic screen for some time, I just found out (yes, I know... "facepalm") that it does support 10-bit color. However, in my Nvidia Control Panel only the 8-bit option exists. I assume it's due to the cable I use? Currently it's this one. Can anyone recommend a good, solid DisplayPort 1.4 cable? Ideally 1 m long. Thanks a lot!
Did you install the appropriate monitor INF? Normally the 10-bit support flag comes from the EDID info. After reading the manual, you may also need to put the monitor into sRGB mode.
Where would I get the INF file / EDID file? Currently it runs in RGB format according to Nvidia Control Panel.
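On Windows you can usually export the raw EDID with a tool like Monitor Asset Manager (or from the registry); on Linux it's readable under /sys/class/drm/*/edid. Once you have the raw bytes, the panel's declared bit depth lives in byte 20 of an EDID 1.4 blob. A minimal sketch of decoding it (the byte layout follows the EDID 1.4 spec; the fake EDID blob here is made up for illustration):

```python
# Sketch: decode the declared per-channel bit depth from an EDID 1.4 dump.
# Byte 20 ("Video Input Parameters"): bit 7 set = digital input,
# bits 6-4 encode the color bit depth (EDID 1.4).

DEPTHS = {0b001: 6, 0b010: 8, 0b011: 10, 0b100: 12, 0b101: 14, 0b110: 16}

def declared_bit_depth(edid: bytes):
    """Return bits per primary color, or None if analog/undefined."""
    vid = edid[20]              # "Video Input Parameters" byte
    if not vid & 0x80:          # bit 7 clear -> analog input, no depth field
        return None
    return DEPTHS.get((vid >> 4) & 0b111)

# Hypothetical 128-byte EDID: real header, byte 20 set to
# "digital, 10-bit, DisplayPort" purely as a demo value.
fake = bytearray(128)
fake[0:8] = b"\x00\xff\xff\xff\xff\xff\xff\x00"
fake[20] = 0b1_011_0101
print(declared_bit_depth(bytes(fake)))  # -> 10
```

Note this is only what the monitor *claims* over EDID; whether the driver actually offers 10-bit output is a separate question (see the posts below about Quadro vs GeForce).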
You need an Nvidia Quadro card for 10-bit support on the Windows desktop; GeForce gaming cards won't do it because Nvidia disables it in the driver. The same may apply to AMD gaming cards. In short, you cannot use 10-bit desktop output with a normal gaming card in Windows. According to Nvidia, 10-bit auto-activates in games that support it in exclusive full screen, and you won't know whether it's actually on, since as far as I know no games have an option to turn it on manually. Besides the Quadro card, you also need to make sure the panel is true 10-bit. If your monitor is 8-bit + FRC ("fake" 10-bit), it is not a real 10-bit monitor, and you won't be able to select 10-bit even with a Quadro card; FRC just helps reduce the banding you'd otherwise see at 8 bits.
https://forums.evga.com/FindPost/2510424 There was some info on it there, but it's pretty much exactly as in the post above: exclusive full screen seems to be required for Nvidia GPUs to output 10-bit (for DX11), and yeah, there's a difference between 8-bit + FRC and true 10-bit. (My own monitor is 8-bit with FRC.)
Funny that Radeon Settings lets me select 10-bit color for my 390; RadeonMod also has a setting for enabling 10-bit color on the desktop.
It's referring to the color depth of the pixels. An 8-bit panel can display 2^24 = 16,777,216 unique colors (8 bits per RGB channel, i.e. 24-bit "True Color"); a 10-bit panel can display 2^30 ≈ 1.07 billion colors (often marketed as "Deep Color"). Note that bit depth is separate from gamut coverage like "% sRGB": it determines how finely the colors are stepped, not how wide the color range is.
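The color counts above are just the bits-per-channel arithmetic, which you can sanity-check in a couple of lines:

```python
# Total displayable colors = 2 ** (3 channels * bits per channel).
for bits in (8, 10):
    total = 2 ** (3 * bits)
    print(f"{bits}-bit per channel -> {total:,} colors")
# 8-bit per channel -> 16,777,216 colors
# 10-bit per channel -> 1,073,741,824 colors (~1.07 billion)
```

An 8-bit + FRC panel flickers between adjacent 8-bit values over time to approximate the in-between 10-bit steps, which is why it can look close to true 10-bit without actually storing 10 bits per channel.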