GeForce 373.06 WHQL driver download & Discussion

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Memorian, Oct 6, 2016.

  1. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
That's a very common anomaly in games that only support 6-bit or 6-bit + dithering output (not true 8-bit).

    Not many games support 8-bit or 10-bit color output; sad, really.
     
  2. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,930
    Likes Received:
    1,044
    GPU:
    RTX 4090
All modern games output at least 8-bit per color, some 10-bit. 6-bit + dithering is display panel tech, which has nothing to do with what the games are outputting. Such banding can be a result of a game ignoring the color profile of a wide color gamut display.
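    To see why applying a correction to already-quantised values produces banding, here's a minimal numpy sketch; the exponent is just an arbitrary stand-in for a real profile curve, not anything from an actual ICC profile:

```python
import numpy as np

# Ideal 8-bit grey ramp, one entry per possible code value.
ramp = np.arange(256, dtype=np.float64) / 255.0

# Hypothetical profile/gamma correction applied at 8-bit precision.
corrected = np.round(ramp ** 0.9 * 255).astype(np.uint8)

# A non-identity curve merges some input codes and skips some
# output codes, so a smooth gradient turns into visible bands.
print("distinct levels left:", len(np.unique(corrected)))  # < 256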
     
  3. Arabaldo

    Arabaldo Active Member

    Messages:
    90
    Likes Received:
    4
    GPU:
    1070Ti
    Thx for the color profile keeper tip :)
     
  4. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
    If that were true, then forcing a color profile to work inside games would fix it.

    Alas, it does not fix it.
     

  5. HonoredShadow

    HonoredShadow Ancient Guru

    Messages:
    4,326
    Likes Received:
    21
    GPU:
    msi 4090
How do you select 10-bit colour in the driver when it only shows 8-bit or 12-bit?
     
  6. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
It should only show the supported color outputs, so if 12-bit is an option, choose that one.
    But if it supports 12-bit, it will support 10-bit as well.
     
  7. HonoredShadow

    HonoredShadow Ancient Guru

    Messages:
    4,326
    Likes Received:
    21
    GPU:
    msi 4090
OK, thanks for the reply. I know that my panel has 10-bit colour and it shows 12-bit, so I will use that.

    Will that make a difference to Windows in general? Like pictures, etc.?

    Also, if Forza Horizon 3 and other games start supporting HDR10, I've heard you need the display in 10-bit mode to get it to work, so I'm guessing any game like Forza or Shadow Warrior 2 would need to be set to at least 10-bit.

    Would this have an effect on performance in Windows?

Sorry about the number of questions; it's all just a bit new to me.
     
  8. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
It will affect Windows in general; basically, the higher the bit depth, the more possible colours a monitor can produce.

    10- or 12-bit means billions of colours (quick arithmetic below), so either is more than enough.

    As for pictures, the camera has to support 'HDR' colour, i.e. the new iPhone will take 10-12-bit photos.

    An older phone will likely only be 6-8-bit, so in that case those pictures would not look any different.
    High-end cameras have supported 10-bit or better for a while.
    This whole HDR thing is a product of stupid marketing, just like the 4K thing (it's not real 4K).

    As for games supporting HDR, they will still run on older monitors; it's not a requirement by any means.
    And no, it won't affect performance.
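    For what it's worth, the "billions of colours" figure is plain arithmetic: with b bits per channel and three channels you get (2^b)^3 values. A quick sketch:

```python
# Colour counts per bit depth: (2**bits per channel) ** 3 channels.
for bits in (6, 8, 10, 12):
    levels = 2 ** bits
    print(f"{bits:>2}-bit: {levels:>4} levels/channel, "
          f"{levels ** 3:,} colours")
# Output:
#  6-bit:   64 levels/channel, 262,144 colours
#  8-bit:  256 levels/channel, 16,777,216 colours
# 10-bit: 1024 levels/channel, 1,073,741,824 colours
# 12-bit: 4096 levels/channel, 68,719,476,736 colours
```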
     
  9. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
Well, it is a game issue; it happens on AMD too now.
    They fcked it up after the 1.11 update, I guess to speed things up for VR, or I don't know...


But this is unacceptable IMO; I also reported it on the Steam forums. The morning skybox has it the worst.

A few more examples:
    [three screenshots]
     
  10. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    9,671
    Likes Received:
    3,446
    GPU:
    NVIDIA RTX 4070 Ti
    Your car is pretty dented, been looking at the sky too much?
     

  11. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
Nope, it's from 7 races ;)
     
  12. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,500
    Likes Received:
    1,875
    GPU:
    7800 XT Hellhound
There are hardly any games that don't have banding problems. That's almost by definition: gradients at 8-bit always show banding unless they have been calculated at a higher bit depth and then converted down to 8-bit with dithering.
    But in some games, like Witcher 3 or Dying Light, gradients look more like 6-bit. There seems to be an utter lack of understanding among many developers of how to achieve banding-free results.
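    A quick numpy sketch of that point (nothing game-specific; a shallow dark gradient makes the effect obvious):

```python
import numpy as np

rng = np.random.default_rng(0)
grad = np.linspace(0.0, 0.1, 1920)   # shallow dark gradient, float precision

# Straight quantisation to 8-bit: the ramp collapses into wide bands.
plain = np.round(grad * 255)
bands = len(np.unique(plain))
print(f"{bands} bands, each ~{1920 // bands} pixels wide")   # ~27 bands

# Sub-LSB noise before rounding: same 8-bit codes, but neighbouring
# levels alternate pixel by pixel and the eye averages them smooth.
dithered = np.round(grad * 255 + rng.uniform(-0.5, 0.5, grad.size))
```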
     
  13. Prophet

    Prophet Master Guru

    Messages:
    865
    Likes Received:
    34
    GPU:
    Msi 680
    Went back to 372.70, it's just better imho. Less stutter, less input lag.
     
  14. Memorian

    Memorian Ancient Guru

    Messages:
    4,021
    Likes Received:
    890
    GPU:
    RTX 4090
Less input lag? Since when do drivers add input lag?
     
  15. Nastya

    Nastya Member Guru

    Messages:
    185
    Likes Received:
    86
    GPU:
    GB 4090 Gaming OC
    Subjective opinions.
     

  16. Keesberenburg

    Keesberenburg Master Guru

    Messages:
    886
    Likes Received:
    45
    GPU:
    EVGA GTX 980 TI sc
I'm using 12 bpc now, but I can't select the 10-bit option in the Shadow Warrior 2 graphics menu.
     
  17. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
You know why?

    Maxwell doesn't support HDR displays.
     
  18. Bradders684

    Bradders684 Maha Guru

    Messages:
    1,007
    Likes Received:
    3
    GPU:
    MSI GTX 980 Ti GAMING
According to NVIDIA, they do.

    https://developer.nvidia.com/getting-know-new-hdr
     
  19. Keesberenburg

    Keesberenburg Master Guru

    Messages:
    886
    Likes Received:
    45
    GPU:
    EVGA GTX 980 TI sc
Mmm, I think my TV is not HDR-ready, only 12 bpc. Or must I set it to YCbCr 4:4:4?
     
  20. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,930
    Likes Received:
    1,044
    GPU:
    RTX 4090
    Or maybe the tool which forces it doesn't work as intended.

    Not really. My TV input accepts 8 and 12 but not 10.

HDR is not only about bpc; your TV must support the HDR10 standard (or Dolby Vision, but that's very rare and I don't think any games support it), which actually tells the display what HDR parameters to use for the input it's getting.

Then some HDR TVs only accept HDR in YCbCr 4:2:2 or 4:2:0 modes due to their HDMI input limitations (rough numbers below). This is all pretty ****ed up right now, as with any early tech.

Also, why do you even use YCbCr instead of RGB?
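    The subsampling limitation mostly comes down to link bandwidth. A back-of-the-envelope sketch (naive bits-per-pixel accounting; real HDMI carries 10-bit 4:2:2 in a 12-bit container, so treat the numbers as rough):

```python
# HDMI 2.0: 18 Gbit/s raw TMDS, ~14.4 Gbit/s payload after 8b/10b.
LINK_GBPS = 18 * 8 / 10
# 4K60 pixel clock incl. blanking: 4400 x 2250 x 60 Hz = 594 MHz.
PIXEL_CLOCK = 4400 * 2250 * 60

for name, bpp in [("8-bit RGB / 4:4:4", 24), ("10-bit RGB / 4:4:4", 30),
                  ("10-bit YCbCr 4:2:2", 20), ("10-bit YCbCr 4:2:0", 15)]:
    need = PIXEL_CLOCK * bpp / 1e9
    verdict = "fits" if need <= LINK_GBPS else "too much"
    print(f"{name:<20} {need:5.2f} Gbit/s, {verdict}")
# 10-bit 4:4:4 at 4K60 overflows HDMI 2.0, while the subsampled
# modes fit, hence HDR only in 4:2:2/4:2:0 on many early sets.
```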
     
    Last edited: Oct 15, 2016
