Catalyst 10 bpc stand for?

Discussion in 'Videocards - AMD Radeon Drivers Section' started by Pictus, Apr 15, 2015.

  1. Pictus

    Pictus Master Guru

    Messages:
    234
    Likes Received:
    90
    GPU:
    RX 5600 XT
Catalyst 10 bpc mystery...


    I know that we need a Quadro/FirePro card for the 10 bpc thing to work,
    and it is a very pitiful decision by Nvidia and AMD (not any more?) to
    *artificially* limit this to expensive cards we will never buy...

    But in the new Catalyst Control Center > Properties (Digital Flat-Panel) > Color Depth
    there is this 10 bpc option...
    What is it?
    Because here, with a Radeon R9 280X over DisplayPort on Windows 8.1,
    neither Photoshop CC 2014, Zoner Photo Studio, nor the NEC demo can work in 10 bpc.

    To enable 10 bit in Zoner Photo Studio:
    http://zmatek.cz/tutorials/zps-en/posts/2010/06/30-bit-display-support-in-zoner-photo-studio.html

    To enable 10 bit in Photoshop:
    https://photographylife.com/what-is-30-bit-photography-workflow

    10 bit test ramp (link at the bottom of the page):
    http://www.imagescience.com.au/kb/questions/152/10+Bit+Output+Support#theory

    NEC 10 bit Color Depth demo (not sure if it only works with NEC monitors):
    http://www.necdisplay.com/documents/Software/NEC_10_bit_video_Windows_demo.zip
     
    Last edited: Nov 7, 2016
  2. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,132
    Likes Received:
    974
    GPU:
    Inno3D RTX 3090
    My guess is that the driver supports the output, and then it's up to the software to support the specific depth.
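    To make "the software has to support it" concrete: here is a minimal sketch (my illustration, not from this post) of how a Windows application requests a 10 bpc back buffer through the standard D3D11/DXGI API. Error handling is omitted; note that the driver may accept this format and still compose or scan out at 8 bpc, which is exactly the question in this thread.

        #include <d3d11.h>

        HRESULT CreateTenBitSwapChain(HWND hwnd, ID3D11Device** device,
                                      IDXGISwapChain** swapChain)
        {
            DXGI_SWAP_CHAIN_DESC desc = {};
            desc.BufferCount       = 2;
            desc.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM; // 10 bpc + 2 bit alpha
            desc.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
            desc.OutputWindow      = hwnd;
            desc.SampleDesc.Count  = 1;
            desc.Windowed          = TRUE; // windowed: 10 bpc must survive the DWM

            // Creating the swap chain only proves the API path works,
            // not that 10 bits actually reach the panel.
            return D3D11CreateDeviceAndSwapChain(
                nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                nullptr, 0, D3D11_SDK_VERSION,
                &desc, swapChain, device, nullptr, nullptr);
        }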
     
  3. Pictus

    Pictus Master Guru

    Messages:
    234
    Likes Received:
    90
    GPU:
    RX 5600 XT
    That is true, and it works with Quadro/FirePro cards. But now
    Radeon cards have this option too, and it does not work.
     
  4. sirDaniel

    sirDaniel Guest

    Messages:
    105
    Likes Received:
    2
    GPU:
    940MX
    BPC probably stands for Bits Per Colour. Since RGB is a three-colour format, the output bit depth is then 30 bit.
     

  5. Pictus

    Pictus Master Guru

    Messages:
    234
    Likes Received:
    90
    GPU:
    RX 5600 XT
    Yes, Bits Per Color, but why does it not work for Radeon cards? (It works with FirePro cards.)

    The option is there, but it does not work with the software that can make use of it...
     
    Last edited: Apr 15, 2015
  6. sirDaniel

    sirDaniel Guest

    Messages:
    105
    Likes Received:
    2
    GPU:
    940MX
    Here is a media player that can output 10 bit and more: http://forum.doom9.org/showthread.php?t=171120 You can use it as a test of your setup. How do you know that it "doesn't work"? Does the TV show an 8 bit input? I hope your TV set supports 10 bit.
    I'm not sure about this, but Radeons (not FirePro) may only be able to output video in higher depths, not the entire desktop. But better to verify this.
     
  7. Pictus

    Pictus Master Guru

    Messages:
    234
    Likes Received:
    90
    GPU:
    RX 5600 XT
    I use a capable 10 bpc monitor, not a TV.
    Photoshop displays banding with the 10 bit test ramp file;
    if 10 bpc were working, there would be no banding.

    The NEC demo does not work.


    Zoner Photo Studio does not work either.
     
  8. LtMatt81

    LtMatt81 Master Guru

    Messages:
    475
    Likes Received:
    8
    GPU:
    4x Fury X
    bpc (bits per channel) is different from bpp (bits per pixel). The bpc value represents one aspect of "Color Depth" and has nothing to do with the 10 bit Pixel Format option, which is what Photoshop, for instance, requires, provided that a 10 bit monitor is connected.

    · 10 bit Pixel Format defines the number of bits in each pixel. The higher the number, the smoother the gradations in the overall image/display.
    · Color Depth defines the number of bits per COLOR channel (R = Red, G = Green, B = Blue). For example, 8 bpc has fewer shades of red than 10 bpc.

    I hope this explains it. For more info see:
    http://en.wikipedia.org/wiki/Color_depth
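    A quick back-of-the-envelope check of the bpc/bpp arithmetic described above (my own illustration, not part of the original post):

        #include <cstdio>

        int main()
        {
            const int depths[] = {8, 10};
            for (int bpc : depths) {
                unsigned long long shades = 1ull << bpc; // shades per channel
                int bpp = bpc * 3;                       // R + G + B, no alpha
                unsigned long long colors = shades * shades * shades;
                std::printf("%2d bpc -> %2d bpp, %4llu shades/channel, %llu colors\n",
                            bpc, bpp, shades, colors);
            }
            //  8 bpc -> 24 bpp,  256 shades/channel,   16777216 colors
            // 10 bpc -> 30 bpp, 1024 shades/channel, 1073741824 colors
        }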
     
  9. Pictus

    Pictus Master Guru

    Messages:
    234
    Likes Received:
    90
    GPU:
    RX 5600 XT
    ....
     
    Last edited: Dec 8, 2015
  10. sirDaniel

    sirDaniel Guest

    Messages:
    105
    Likes Received:
    2
    GPU:
    940MX
    Thanks for bringing up the bpc/bpp issue, I have learned something. But digging deeper showed me that the difference between the two is academic https://www.doom9.org/showthread.php?p=1651289#post1651289 and we need to bring RGB vs. YUV video into consideration.

    Anyway, looking back in time, AMD in their own paper http://fireprographics.com/resources/10-Bit.pdf put it as simply as this: "10-bit displays use 10 bits to represent each of the red, green and blue color channels. In other words, each pixel is represented with 30 bits, instead of the 24 bits used in conventional displays. To this end, 10-bit displays can generate more than a billion shades of colors".

    So taking the above into consideration, I might be wrong, but the confusing naming convention around bpc, bpp and bit depth might now be used by AMD to silently differentiate Radeon/FirePro:

    Radeon: folks, you have the bpc option, great, you can send 10 bit channels (Y, U, V channels) to your TV; in other words, just watch a film in 10 bit.
    FirePro: mister, you have the bpp option here, now your 10 bit pixels (R, G, B pixels) will be sent to your monitor, hence you will see your entire workspace/desktop in 10 bit [because that is how a GPU normally sends the video signal to a monitor, as RGB].

    Am I right, or have I made more confusion?
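    To put some numbers on that video-vs-desktop distinction (my own illustration; the subsampling ratios are the standard ones):

        #include <cstdio>

        int main()
        {
            const int bpc = 10;
            // RGB 4:4:4 (desktop): three full-resolution channels per pixel.
            double rgb = 3.0 * bpc;
            // YCbCr 4:2:0 (typical video): full-resolution luma plus two
            // chroma channels shared by each 2x2 pixel block.
            double yuv420 = (1.0 + 2.0 / 4.0) * bpc;
            std::printf("10 bit RGB 4:4:4  : %.0f bits/pixel\n", rgb);    // 30
            std::printf("10 bit YCbCr 4:2:0: %.0f bits/pixel\n", yuv420); // 15
        }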
     
    Last edited: Apr 18, 2015

  11. sambul81

    sambul81 Guest

    Messages:
    24
    Likes Received:
    0
    GPU:
    AMD FirePro 4900
    It's better to ask 2nd-level AMD Support about that directly by filing a ticket, since they refrain from explaining it on the forums, including their own. :) It looks like AMD's Control Center allows selecting the Color Depth property of each display separately, as some displays support downsampling the input signal to several color depth values based on user choice.

    Signal output is also restricted by the video bandwidth available in your setup. For example, if your video card's engine can saturate the 21.6 Gbps DP1.2 max bandwidth by outputting 4K@60Hz at 10 bpp including overhead, but you connect the GPU's DP1.2 port to a 4K monitor's HDMI 2.0 port via a DP-to-HDMI 2.0 adapter, the available video bandwidth will only let an 8-bit signal pass at 4K resolution @ 60Hz refresh rate, and a 10-bit signal at 4K@30Hz, due to HDMI 2.0's somewhat lower 18 Gbps bandwidth limit.

    In addition to setting the display mode to 10-bit input, the FirePro driver can send 8 or 10 bpp signal output based on user choice, if 10-bit is supported by an application like Photoshop and by your monitor, while the original Radeon driver only outputs an 8 bpp signal. In practice, AMD often takes years to add advanced feature support to lower-end FirePro cards too, so you see an option in the driver but it doesn't work, which is confirmed in the Release Notes release after release. AMD has never offered any apology or explanation for this ongoing non-compliance with their advertised GPU specs, even though buyers pay premium prices precisely for these specs and really need 10-bit color support in their professional work.
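    The bandwidth numbers above can be sanity-checked with some rough arithmetic (my own sketch; real links also carry blanking intervals and coding overhead, so the true requirement is higher than this raw-pixel estimate):

        #include <cstdio>

        int main()
        {
            const double pixels = 3840.0 * 2160.0; // 4K UHD
            const double hz     = 60.0;
            const int depths[]  = {8, 10};
            for (int bpc : depths) {
                double gbps = pixels * hz * bpc * 3 / 1e9; // RGB payload only
                std::printf("4K@60 %2d bpc: ~%.1f Gbps raw\n", bpc, gbps);
            }
            // ~11.9 Gbps at 8 bpc, ~14.9 Gbps at 10 bpc. With blanking and
            // 8b/10b coding on top, 10 bpc 4:4:4 4K@60 no longer fits in an
            // HDMI 2.0 (18 Gbps) link, hence the drop to 30 Hz or 8 bpc.
        }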


     
    Last edited: Jan 14, 2016
  12. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    That is a common piece of misinformation. The fact is that 10 bit output has been working for several years with Radeon cards (and as far as I know it works fine with GeForce cards on the Linux drivers, and maybe with the Windows drivers by now as well).

    The special 10-bit mode which has always been exclusive to the FirePro/Quadro drivers is a custom proprietary solution which has nothing to do with the standard APIs.
    That is a "hack" to make 10 bit graphics visible inside the old 8 bit window managers (like X11 or Wayland on Linux and DWM on Windows), which is completely different from setting the display mode to 10-bit.
    The main purpose is to let you work with 10 bit content while continuing to enjoy the benefits of a window manager, rather than running the editing software in fullscreen exclusive mode, which would limit every display to a single application (hence the name "exclusive") and force the application to provide a new fullscreen GUI instead of the usual one. Of course, the window manager itself could support a 10 bit display mode, but neither Linux nor Windows developers had any interest in doing that. As far as I can tell, they still don't, regardless of the "HDR" in HDTV marketing. That is why it was "hacked together", and since it was actual work, somebody had to pay for it: those who wished to use 10 bit conveniently, before the industry slowly catches up in ~10 years or so, had to pay the price by buying professional products even if they were otherwise unnecessary.

    You have been able to set 10 bit in fullscreen exclusive mode since DirectX 9.0b or c (I can't remember which update it was) if the hardware supports it (it's not mandatory, because it's an optional extension to DX9.0), and technically every GPU of the DX10+ era supports it (I am not sure, but I think it is theoretically mandatory for DX11 conformance). That holds regardless of whether the end-user driver is "conveniently malfunctioning" (artificially limited, to make the feature technically Quadro-exclusive in all regards) and does not work properly with this mode -> a conspiracy theory, but it is also possible that nVidia simply didn't care to make this mode work in their drivers, because nobody cared before the "HDR marketing" started, and many still don't care...
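    For the curious, that optional D3D9 capability can be probed with the standard API (a hedged sketch of mine, not janos666's code; link against d3d9.lib):

        #include <d3d9.h>

        bool SupportsTenBitFullscreen()
        {
            IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
            if (!d3d) return false;

            // Windowed = FALSE: A2R10G10B10 is only a valid display/back
            // buffer format in fullscreen exclusive mode, which is exactly
            // the limitation discussed here.
            HRESULT hr = d3d->CheckDeviceType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                              D3DFMT_A2R10G10B10,  // display format
                                              D3DFMT_A2R10G10B10,  // back buffer format
                                              FALSE);              // fullscreen
            d3d->Release();
            return SUCCEEDED(hr);
        }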

    I know I have been using 10-bit for several years with my good old Panasonic G30 (a consumer plasma TV from 2011) and my old HD5850 (which I have since replaced with a 290X). But I know the GeForce drivers were (or still are) more buggy/restrictive in this regard.
     
    Last edited: Jan 14, 2016
  13. sambul81

    sambul81 Guest

    Messages:
    24
    Likes Received:
    0
    GPU:
    AMD FirePro 4900
    Could you give links to hacked AMD and Nvidia Windows or Linux consumer card drivers that allow setting the signal output to 10-bit and enabling HDR mode in the Control Panel when working with applications in this "new fullscreen GUI mode" on HDR-capable 10-bit displays? What popular applications or app hacks (links?) allow enabling such a mode in Windows, and how is it different from the usual "Full Screen" option available in many current packages? Can you post some demo pics illustrating your words, i.e. showing driver versions, "fullscreen mode" for certain apps with the 10-bit option enabled, etc.?

    Another obvious question never answered by video card makers: why do they offer, in the consumer cards' Control Panel, a choice of different display input modes (8, 10, 12 bpc) for capable displays, despite not offering a corresponding choice of GPU output modes? In that case, what difference does it make if a user selects one or another monitor input mode?
     
    Last edited: Jan 14, 2016
  14. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,959
    Likes Received:
    1,246
    GPU:
    .
    I really like to ruin parties ^_^

    Since Windows 10, everything runs in windowed mode, full-screen applications included: the exclusive full-screen mode typical of previous versions of Windows is no more; it is emulated through borderless windowed mode with un-throttled frame-rate support (and if a notification or an OS element pops up, like the volume control or an Action Center notification, your frame rate will throttle to v-sync * #back-buffers).
    So what? On Windows, nothing that runs in windowed mode supports more than 8 bits per colour channel with any current version of DXGI.
    Well, the big deal is that we need an update to DirectX and WDDM.

    Beyond all these annoying software things (see the capability sketch below):
    All modern GPUs since.. SM4?.. support 10-bit (and even larger!) render targets.
    None of the current GPUs support the rec.2020 gamut. Not a big deal after all.
    We do not have any hardware surface compression built for 10-bit or greater colour accuracy. And this is the big deal: nobody cares about 10-bit or greater colour accuracy panels just for post-processing 8-bit colour from compressed textures; it would only help cut some banding for lazy developers.
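    The render-target claim is easy to verify on any given GPU with the standard D3D11 capability query (my sketch, assuming an already-created device):

        #include <d3d11.h>

        bool SupportsTenBitRenderTarget(ID3D11Device* device)
        {
            UINT support = 0;
            if (FAILED(device->CheckFormatSupport(DXGI_FORMAT_R10G10B10A2_UNORM,
                                                  &support)))
                return false;
            // Render-target support for this format is mandatory from
            // feature level 10.0 (SM4) upwards, so this should be true on
            // any modern GPU.
            return (support & D3D11_FORMAT_SUPPORT_RENDER_TARGET) != 0;
        }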

    Now I want to see when the HDR marketing departments will speak about all of this.
     
    Last edited: Jan 15, 2016
  15. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    Hah! I didn't know that. (Win10 is fairly new.) Thank you for this information; I will check it for myself (not that I don't believe you).

    As for the other arguments:

    I think since DX11.x you can push 16 (or even 32?) bits per channel and the card should truncate/round/dither it down automatically (however it sees fit, though it really should use dithering...) to the highest bit depth accepted by the display device, so it might not be that much of a problem for game developers to support practically "anything".
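    A toy illustration (mine, not janos666's) of those three reduction strategies, taking a high-precision channel value down to 8 bits:

        #include <cstdio>
        #include <cmath>
        #include <cstdlib>

        // Reduce a [0,1] float channel value to an n-bit integer code.
        int Truncate(double v, int bits) { return (int)(v * ((1 << bits) - 1)); }
        int Round(double v, int bits)    { return (int)std::lround(v * ((1 << bits) - 1)); }
        int Dither(double v, int bits)   // add noise before rounding, so the
        {                                // error averages out instead of banding
            double noise = (std::rand() / (double)RAND_MAX) - 0.5;
            double scaled = v * ((1 << bits) - 1) + noise;
            return scaled < 0.0 ? 0 : (int)(scaled + 0.5);
        }

        int main()
        {
            double v = 0.5004; // lands between two 8-bit codes
            std::printf("truncate: %d, round: %d, dithered: %d %d %d\n",
                        Truncate(v, 8), Round(v, 8),
                        Dither(v, 8), Dither(v, 8), Dither(v, 8));
            // Truncate gives 127, round gives 128; dither flips between the
            // two with the correct long-run average.
        }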

    And textures may very well be compressed at 8 bit, but they are shaded at high-precision float, so they get "recolored" to some degree; thus it would make sense to push 10 bit (if not more) to the display. Just think of the HDR tone mapping in use for about a decade now. That is calculated at "insane precision" before being mapped back to 8 bit (I think it went as high as 128-bit or so over the years, from the FP16 start).

    I hope the HDTV HDR marketing will make Microsoft implement some new display modes for their window manager.
     
    Last edited: Jan 14, 2016

  16. sambul81

    sambul81 Guest

    Messages:
    24
    Likes Received:
    0
    GPU:
    AMD FirePro 4900
    Thanks for refusing to provide any links to drivers & software, or even screenshots, supporting the above claims about your consumer video cards outputting 10-bit color. Now this forum's readers will definitely be unable to "check it for themselves", which turns your claims into forum flood. :)

    And to that extent, would you care to make this thread a bit more useful for readers and answer the simple question:

    Why do GPU makers offer, in the consumer cards' Control Panel, a choice of different display input modes (8, 10, 12 bpc) for capable displays, despite not offering a corresponding choice of GPU output modes? In that case, what difference does it make if a user selects one or another monitor input mode?
     
    Last edited: Jan 15, 2016
  17. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,959
    Likes Received:
    1,246
    GPU:
    .
    - In DirectX Graphics, fullscreen is actually capable of handling a 10-bit-per-colour front buffer. Windowed mode (currently) is not. The issue is that with WDDM 2.0 everything runs in windowed mode, since exclusive full-screen mode is gone, so we need Microsoft to update the driver model in its latest operating system.
    - If DWM is disabled, third-party ICDs can handle a 10-bit (or greater) colour front buffer in windowed mode.
     
    Last edited: Jan 15, 2016
  18. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    Huh?
    I shared what I knew, and then somebody else commented on my post, indirectly confirming what I had just said while pointing out some details I had missed (new information about the latest changes in the newest Windows release, which I wasn't aware of yet). What better confirmation could I offer than "peer review"? Anything I might show or link comes from me (the same person you are debating), but we got a third party to make things clearer, and so I considered the topic closed.



    As for the Control Center/Panel settings: those are there to limit the output for (in)compatibility (or maybe "benchmarking") purposes.

    There is another thing in the mix: the VGA calibration LUT. It is always 16-bit, so you might want 10+ bit output between your GPU's display engine and your display device even if your source material is 8-bit. It is therefore reasonable to default to the maximum possible bit depth between the GPU and the display, although it is nice to have a manual override (if nothing else, it lets you test whether you actually benefit from 10+ bit by flipping that switch back and forth and seeing for yourself...).
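    A tiny numeric demonstration (my sketch, with a made-up correction curve) of why a calibration LUT makes 10+ bit output useful even for 8 bit sources: after the correction, some neighbouring 8 bit input codes collapse onto the same 8 bit output code, while a 10 bit output keeps every step distinct.

        #include <cstdio>
        #include <cmath>

        int main()
        {
            for (int code = 19; code <= 22; ++code) {
                double corrected = std::pow(code / 255.0, 1.1); // toy LUT curve
                int out8  = (int)std::lround(corrected * 255.0);
                int out10 = (int)std::lround(corrected * 1023.0);
                std::printf("in %2d -> 8 bit %2d, 10 bit %2d\n", code, out8, out10);
            }
            // Inputs 20 and 21 both come out as 16 on the 8 bit wire
            // (banding), but map to 62 and 66 on the 10 bit wire.
        }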

    Hint: if you don't know what a "calibration LUT" is, then you probably don't have one (other than the default 1:1). But some people, like me, do. (It is usually the result of measuring your display with certified instruments in order to bring it closer to the expected behavior before profiling it.)
     
    Last edited: Jan 16, 2016
  19. sambul81

    sambul81 Guest

    Messages:
    24
    Likes Received:
    0
    GPU:
    AMD FirePro 4900
    Thanks for the comment. Did you mean that a driver mod must be used for that? Which one (a link)? And what is "conveniently malfunctioning" supposed to mean in practical terms (click-click) for a simple Joe like me?

    How is "a new fullscreen GUI" different from the "same old" fullscreen GUI provided by many applications? Can you give links to example applications?
     
  20. Prefix

    Prefix Member Guru

    Messages:
    176
    Likes Received:
    18
    GPU:
    Sapphire R7 260X 2GB
