Temporal dithering ps4 xbox one on Radeon chips

Discussion in 'Videocards - AMD Radeon' started by Link8295, Oct 20, 2017.

  1. Link8295

    Link8295 New Member

    Messages:
    2
    Likes Received:
    0
    GPU:
    Amd
    Hi, I am wondering if anyone can help. I hear Radeon chips have temporal dithering enabled by default, and that temporal dithering, the rapid alternation between two colours, can create a flicker that is bad for health. Since the Xbox One and PS4 also use Radeon chips, is temporal dithering enabled on them? Is there any way to test for this? Would a true 8-bit monitor mitigate temporal dithering that a source might be forcing?
     
  2. thatguy91

    thatguy91 Ancient Guru

    Messages:
    6,557
    Likes Received:
    55
    GPU:
    XFX RX 480 RS 4 GB
    You can look it up here:
    https://github.com/FastLED/FastLED/wiki/FastLED-Temporal-Dithering

    It's not so much two colours; it's alternating between values that differ by 1/255. This shouldn't be an issue, and it is a pixel-based change, not a backlight one.

    In the old days of CRTs, the screen flickered because the phosphor glow faded between passes of the electron beam. That flicker was known to cause eye strain, but with the advent of 100 and 120 Hz TVs it was greatly reduced and the picture looked a whole lot better.

    When LCDs first came out, they were backlit with Cold Cathode Fluorescent Lamps (CCFL), which had a limited lifespan and could start to flicker as they wore out; at normal screen brightness, though, the lamp was driven continuously. When LED-backlit screens were released, the LEDs were dimmed by rapidly switching them on and off (pulse-width modulation), with the on/off ratio set by the required backlight brightness. This created mostly imperceptible flicker that still caused eye strain. In many ways this was worse than a CRT, since the CRT at least had some residual glow in the phosphor, whereas the LED backlight was a stark on/off. How noticeable it is depends on the luminance being displayed, but overall it is a very significant switch between lit and unlit. These days most decent monitors are 'flicker-free': they use more modern LED backlighting whose brightness can be adjusted directly over a wide range.
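    Going back to the dithering itself, here is a minimal Python sketch of the general idea (illustrative only, not AMD's actual implementation): a shade that falls between two 8-bit codes is approximated by alternating between the nearest codes over successive frames, so the time-average matches the target.

    # Minimal sketch of temporal dithering (illustrative, not AMD's algorithm):
    # approximate a fractional shade by alternating between the two nearest
    # 8-bit codes so the average over several frames matches the target.
    def dither_frames(target, frames=8):
        """target: desired shade as a float in 0..255, e.g. 100.25"""
        low = int(target)            # nearest code below
        high = min(low + 1, 255)     # nearest code above (the 1/255 step)
        frac = target - low          # how often 'high' must appear on average
        error = 0.0
        out = []
        for _ in range(frames):
            error += frac
            if error >= 1.0:         # emit the brighter code just often enough
                out.append(high)
                error -= 1.0
            else:
                out.append(low)
        return out

    print(dither_frames(100.25))     # [100, 100, 100, 101, 100, 100, 100, 101]

    Averaged over those eight frames the sequence works out to exactly 100.25, which is why the per-frame change is only ever a single code value.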

    It would be far more important to have a flicker-free monitor than to worry about temporal dithering, since the dithering step is so small. I can see where they are coming from, but the difference is minute.
     
    Link8295 likes this.
  3. Link8295

    Link8295 New Member

    Messages:
    2
    Likes Received:
    0
    GPU:
    Amd
    Thanks for that info, I understand. The thing is, some people are still bothered by temporal dithering, so I was wondering if there is a way to a) test whether a source is dithering (maybe with a high-speed camera) and b) ensure dithering is not active.
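    For a), the high-speed camera idea could be scripted: photograph a static test image twice in quick succession and count the pixels that change by exactly one code value between the two shots. A rough Python sketch, assuming the captures are already aligned 8-bit images (the file names are just placeholders):

    # Rough sketch: flag pixels that toggle by exactly one code between two
    # captures of a static image, which would hint at temporal dithering.
    import numpy as np
    from PIL import Image

    frame_a = np.asarray(Image.open("capture_frame1.png").convert("L"), dtype=np.int16)
    frame_b = np.asarray(Image.open("capture_frame2.png").convert("L"), dtype=np.int16)

    diff = np.abs(frame_a - frame_b)
    toggling = np.count_nonzero(diff == 1)   # pixels off by exactly 1/255
    print(f"{toggling / diff.size:.1%} of pixels differ by one code between frames")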

    Also, wouldn't a true 8-bit monitor avoid dithering on its own, regardless of the source material or GPU input?
     
