2016 model Samsung TVs to support YouTube HDR

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Dec 20, 2016.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    42,439
    Likes Received:
    10,254
    GPU:
    AMD | NVIDIA
    Samsung Electronics today announced that it will support YouTube’s global HDR playback on television for the first time, through an expanded version of the YouTube application. ...

    2016 model Samsung TVs to support YouTube HDR
     
  2. Mannerheim

    Mannerheim Ancient Guru

    Messages:
    4,834
    Likes Received:
    43
    GPU:
    Asus HD6950 2GB
    HL2 has supported HDR for ages with all monitors...
     
  3. ttnuagmada

    ttnuagmada Member Guru

    Messages:
    161
    Likes Received:
    36
    GPU:
    1080 Ti SLI @2101
    You're confusing HDR with "HDR".
     
  4. ttnuagmada

    ttnuagmada Member Guru

    Messages:
    161
    Likes Received:
    36
    GPU:
    1080 Ti SLI @2101
    Don't be so sure. The HDR videos look good in the YouTube app, but I had to manually install the 1154 firmware before the TV actually kicked into HDR mode with those videos.
     

  5. Mannerheim

    Mannerheim Ancient Guru

    Messages:
    4,834
    Likes Received:
    43
    GPU:
    Asus HD6950 2GB
    Yes... HL2 is real HDR :3eyes:
     
  6. Denial

    Denial Ancient Guru

    Messages:
    13,565
    Likes Received:
    3,117
    GPU:
    EVGA RTX 3080
    HDR in HL2 renders the image in 10-bit, then maps it down to 8-bit. HDR in modern TVs/apps is 10-bit throughout the entire process.
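    For illustration, here's a minimal sketch of that pipeline idea (not Valve's actual shader, just the general "render in high precision, then tone-map and quantize for an 8-bit display" step):

    ```python
    # Rough sketch: scene values are computed in high-precision linear light, then
    # tone-mapped and quantized so an ordinary 8-bit SDR display can show them.
    import numpy as np

    def tonemap_to_8bit(hdr_linear):
        """Map linear HDR values (which can exceed 1.0) into 8-bit SDR codes."""
        ldr = hdr_linear / (1.0 + hdr_linear)        # Reinhard curve: compress highlights
        ldr = np.clip(ldr, 0.0, 1.0) ** (1.0 / 2.2)  # simple gamma encode for display
        return np.round(ldr * 255).astype(np.uint8)  # quantize to 8-bit output

    # A scene with values far above "paper white" (1.0), e.g. a bright sky or sun
    scene = np.array([0.05, 0.5, 1.0, 4.0, 16.0])
    print(tonemap_to_8bit(scene))                    # everything gets squeezed into 0..255
    ```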
     
  7. Mannerheim

    Mannerheim Ancient Guru

    Messages:
    4,834
    Likes Received:
    43
    GPU:
    Asus HD6950 2GB
    So... we don't need an HDR TV... It can be done with 8-bit also.
    I call this marketing hype =)
     
  8. Denial

    Denial Ancient Guru

    Messages:
    13,565
    Likes Received:
    3,117
    GPU:
    EVGA RTX 3080
    It's not the same. It's like comparing a 4K downsample on a 1080p monitor to a real 4K screen. It offers some advantages, but it isn't a replacement for a 4K screen.

    For starters, color banding is significantly reduced on a 10-bit display:

    http://i.imgur.com/lWqWHnd.png

    http://abload.de/img/8-bite8xve.jpg - 8-bit
    http://abload.de/img/10-bit6uybq.jpg - 10-bit

    You also get access to a significantly wider color spectrum, especially reds, which are noticeably different in HDR.

    https://i.imgur.com/OdQFWJC.png

    Some more information about how HL2 and other games use it:

    In the end it's not just marketing. I have a Samsung 8500 with HDR support, and while Marco Polo (a Netflix show that supports HDR) isn't that great, Obduction in HDR is beautiful. I'll be buying the first QHD/4K 144Hz HDR monitor that's available, regardless of the price.
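    As a rough illustration of where the banding difference comes from (illustrative numbers only, not tied to any particular panel): the same gradient simply gets four times as many code values per channel at 10-bit, so each visible step is a quarter as wide.

    ```python
    # Count how many distinct steps a smooth 0..1 ramp collapses into at each bit depth.
    import numpy as np

    gradient = np.linspace(0.0, 1.0, 3840)            # one horizontal line of a 4K-wide ramp

    for bits in (8, 10):
        levels = 2 ** bits                            # 256 codes at 8-bit, 1024 at 10-bit
        quantized = np.round(gradient * (levels - 1))
        steps = len(np.unique(quantized))             # distinct bands visible across the ramp
        print(f"{bits}-bit: {levels} codes per channel, {steps} distinct steps")
    ```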

    Edit: Additional reading for those interested

    https://developer.nvidia.com/implementing-hdr-rise-tomb-raider
    https://developer.nvidia.com/preparing-real-hdr
     
    Last edited: Dec 20, 2016
  9. ivymike10mt

    ivymike10mt Master Guru

    Messages:
    226
    Likes Received:
    12
    GPU:
    GTX 1080Ti SLI
    Great post.
    I'd just like to add:
    Wide color gamut, i.e. Rec. 2020 (as in the pic), is one part of the UHD standard; HDR itself is a separate UHD element/component.
    The WCG comes more from improved backlighting or filtering systems, denser pixels, etc.
    Technically, if you use a full RGB signal you can get UHD color without HDR itself.
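    To make that point concrete, here's a small sketch of why gamut and bit depth are separate axes: the same linear RGB color re-expressed in Rec. 2020 primaries, using the commonly published approximate BT.709-to-BT.2020 matrix (treat the coefficients as illustrative):

    ```python
    # The same linear-light color expressed in two sets of primaries. Bit depth never
    # enters into it; the gamut is defined purely by the primaries.
    import numpy as np

    BT709_TO_BT2020 = np.array([                      # approximate, commonly published coefficients
        [0.6274, 0.3293, 0.0433],
        [0.0691, 0.9195, 0.0114],
        [0.0164, 0.0880, 0.8956],
    ])

    pure_709_red = np.array([1.0, 0.0, 0.0])          # fully saturated red in Rec. 709
    print(BT709_TO_BT2020 @ pure_709_red)             # ~[0.63, 0.07, 0.02]: well inside Rec. 2020,
                                                      # i.e. Rec. 2020 can encode far redder reds
    ```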
     
  10. Xionor

    Xionor Member

    Messages:
    36
    Likes Received:
    4
    GPU:
    MSI Geforce 560Ti 1GB

    LOL @ bull**** 8/10-bit screenshots.
    All those 8/10-bit comparisons are complete marketing scams and fakes.

    They're actually showing 5-bit vs 8-bit colors.

    8-bit and 10-bit mode should show exactly the same photo on a normal 8-bit monitor, because the monitor shouldn't be able to display the extra 2 bits of color (which only show up as extra-smooth gradients between two shades of the same color).

    So why are the 8-bit and 10-bit modes showing different photos on an 8-bit monitor?

    You obviously don't need a 10-bit monitor, since you can see 10-bit color on your 8-bit monitor just fine, right? Christ...
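    To illustrate the point being made (a hypothetical sketch, not a claim about any specific comparison image): an 8-bit screen can't display a genuine 10-bit image at all, so side-by-side demos typically fake the difference by crushing one half to far fewer levels and leaving the other at a plain 8 bits.

    ```python
    # How a typical "8-bit vs 10-bit" demo shot would be built for an 8-bit display:
    # the "8-bit" half is really crushed to ~5 bits so the banding becomes obvious.
    import numpy as np

    def crush_to_bits(img_8bit, bits):
        """Re-quantize an 8-bit image to a lower bit depth, keeping 8-bit storage."""
        step = 255 / (2 ** bits - 1)
        return (np.round(img_8bit / step) * step).astype(np.uint8)

    ramp = np.tile(np.linspace(0, 255, 1024).astype(np.uint8), (100, 1))  # smooth gray ramp
    banded_half = crush_to_bits(ramp, 5)          # heavy banding; labelled "8-bit" in the demo
    smooth_half = ramp                            # an actual 8-bit ramp; labelled "10-bit"
    demo = np.hstack([banded_half, smooth_half])  # the kind of side-by-side shot criticized above
    ```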
     
    Last edited: Dec 22, 2016

  11. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,518
    Likes Received:
    2,912
    GPU:
    MSI 6800 "Vanilla"
    The image is just showcasing how the difference would look. An actual 6-bit, 8-bit, 10-bit or whatever check (usually a grayscale-type image with clearly defined borders between each color grade) would be a single image, and how that looks would depend on the display. :)

    It's also further complicated by the whole 6+2 = 8 and 8+2 = 10 bit (FRC/dithering) stuff that a lot of monitors have. (True 10-bit you'd probably notice from the price point alone, I'm guessing, heh.) My own display is 8+2, for example, but of course it's marketed as 10-bit; the actual info is something you'd have to look up in a review where someone actually took the display apart.
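    As a toy sketch of what that "+2" typically means (illustrative only, not how any specific panel implements it): the panel only has 8-bit codes, but by flickering between the two nearest codes over successive frames, the time-averaged output approximates a 10-bit value.

    ```python
    # Toy FRC (frame rate control) simulation: approximate a 10-bit code using only
    # 8-bit codes by showing the higher neighbour for the right fraction of frames.
    import numpy as np

    def frc_frames(target_10bit, n_frames=60):
        low = target_10bit // 4                       # nearest lower 8-bit code
        frac = (target_10bit % 4) / 4                 # fraction of frames to show the next code
        flicker = np.random.rand(n_frames) < frac
        return low + flicker.astype(int)

    frames = frc_frames(514)                          # 10-bit 514 sits between 8-bit 128 and 129
    print(frames[:10], "average ≈", frames.mean())    # hovers around 128.5 == 514 / 4
    ```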

    EDIT: This seems to explain it pretty well, though it too uses a more visible comparison between two images.

    http://www.ronmartblog.com/2011/07/guest-blog-understanding-10-bit-color.html

    As far as I understand, back with CRTs there was early experimentation with up to, I think it was called, 48-bit or some such.
    (Not exactly sure what that would correspond to; 8-bit would be 24 and 10-bit 30 in total, so probably the equivalent of the recent 16-bit LCD displays?)

    Not really sure how it is with gaming and 10-bit either. Alien: Isolation is among the earliest I believe to offer support for deep color (30-bit instead of 24-bit, i.e. 10-bit per channel), and I'm guessing DirectX 12 and Vulkan also help a bit with more standards and whatnot. We're starting to see a few games supporting actual HDR now, such as Obduction via UE4 or Shadow Warrior 2 via their own RoadHog engine tech.

    And as far as AMD and Nvidia driver tech goes, I think AMD supports 10-bit on most GCN GPU models, and Nvidia on Maxwell and Pascal, with the little side note that OpenGL requires exclusive full-screen mode for it.
    (Probably about the same with 12-bit support too, although I don't think there are many actual monitors using 12-bit or higher, so at the moment that's mainly in more recent TVs.)
     
    Last edited: Dec 22, 2016
  12. Denial

    Denial Ancient Guru

    Messages:
    13,565
    Likes Received:
    3,117
    GPU:
    EVGA RTX 3080
    Uh, I'm pretty sure everyone on Guru3D knows it's an example.
     
  13. Xendance

    Xendance Ancient Guru

    Messages:
    5,554
    Likes Received:
    12
    GPU:
    Nvidia Geforce 570
    With 8-bit colours, colour banding is pretty obvious in things like skies in games.
     
