Intel today released its latest Graphics Driver for Windows, v15.60 WHQL. It is compatible with most processors with embedded GPUs, including 6th-generation Skylake, 7th-generation Kaby Lake,... Intel Releases latest HD Graphics Driver - Enables Netflix HDR
I feel we are long overdue for REAL 32-bit color, not the fake one that Windows claims to use. 24-bit (what most of us actually use) was fine back when 720p was common, because dithering was pretty hard to notice. But now a lot of media is starting to look really ugly when subtle gradients are spread across a large area. Depending on your definition of "consumer", some workstation GPUs support it.
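The banding complaint above is easy to put numbers on. A quick back-of-the-envelope sketch (the 4K width is just an illustrative assumption, not from the post):

```python
# With 8 bits per channel there are only 256 distinct grey levels, so a
# full-width gradient repeats each level over many pixels; on a wide
# screen the eye picks those repeated runs up as visible bands.
width = 3840            # pixels across a 4K display (illustrative assumption)
levels = 2 ** 8         # distinct values per channel at 8 bits per component
pixels_per_band = width / levels
print(pixels_per_band)  # 15.0 -> each grey level spans ~15 pixels
```

At 720p the same gradient gives only ~5 pixels per level, which is why the steps were much harder to spot back then.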
32-bit is not something you'll really see in use, because it doesn't make much sense either way. "24-bit" is often handled as 32-bit, either with the addition of an alpha channel or simply for data alignment (4 bytes/32 bits is a neat power of two; 3 bytes is not). More accurately, one should refer to the bits per component (bpc), which has typically been 8-bit for a long time and for HDR is now moving to 10- and 12-bit (which makes 30 or 36 bits per pixel). As long as PCs are stuck in the sRGB color space, however, more than 8-bit isn't really needed. It would be nice, but not required. It'll still be a long time before PC applications become actually color managed and wide-gamut PC screens become widespread beyond professional use. PS: Note that 10-bit is available on all recent desktop GeForce cards through DirectX, and probably on AMD as well.
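To make the bpc vs. bits-per-pixel distinction above concrete, here's a small sketch (the packing layout shown is one common convention, not a claim about any specific driver):

```python
# Bits per pixel for an RGB pixel is just 3x the bits per component (bpc):
# 8 bpc -> "24-bit", 10 bpc -> "30-bit", 12 bpc -> "36-bit".
def rgb_bpp(bpc):
    return 3 * bpc

print(rgb_bpp(8), rgb_bpp(10), rgb_bpp(12))  # 24 30 36

# "32-bit" color is usually 8-bpc RGB plus an 8-bit alpha channel, packed
# into one 4-byte word for alignment (ARGB layout used here as an example):
def pack_argb8(a, r, g, b):
    return (a << 24) | (r << 16) | (g << 8) | b

print(hex(pack_argb8(255, 255, 0, 128)))  # 0xffff0080
```

Note that the extra 8 bits buy alignment and transparency, not any additional color precision per channel.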
Yeah, I figured, but when landing on the thread from this link it's not obvious, as this page states Win7 driver support: http://www.guru3d.com/files-details/intel-hd-graphics-driver-download-4590.html