Will future NVIDIA drivers support HDMI 2.0a on the GTX 9xx series?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by maur0, Apr 9, 2015.

  1. maur0

    maur0 Master Guru

    Messages:
    943
    Likes Received:
    97
    GPU:
    Point of View GTX 570 1GB
    Last edited: Apr 9, 2015
  2. lantian

    lantian Guest

    If it supports 2.0, it should just be a matter of patching it in; at least that's my takeaway from this:
    "Version 2.0a of the HDMI® Specification. It is available to current HDMI 2.0 Adopters via the HDMI Adopter Extranet."
     
  3. sapo_joe

    sapo_joe Master Guru

    Messages:
    669
    Likes Received:
    81
    GPU:
    ASUS TUF RTX4090 OC
    Only the GeForce 9xx series fully supports HDMI 2.0, since it's hardware based. No special drivers are needed. Right now I'm on a 4K @ 60 Hz display using HDMI 2.0.

    I believe it's been this way since the 343.xx drivers.
     
  4. Merlena

    Merlena Active Member

    Messages:
    78
    Likes Received:
    16
    GPU:
    RTX 3070 8G
    I'm also running at 4K at 60 Hz, with an HDMI 2.0 cable, and I'm using a 780 Ti. Only when using NVIDIA HDTV Play for 3D Vision does it break and limit me to 24 Hz.

    1024x720 > 60 Hz
    1920x1080 > 24 Hz
    3840x2160 > 24 Hz
    (With Stereoscopic 3D enabled; with transmitter.)
     

  5. sapo_joe

    sapo_joe Master Guru

    Messages:
    669
    Likes Received:
    81
    GPU:
    ASUS TUF RTX4090 OC
    It's limited to 24 Hz because your card is HDMI 1.4a only. 1.4a can run 4K@30Hz with full color. The 4K@60Hz you're getting uses 4:2:0 chroma subsampling, which loses color information.
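    Rough numbers, if anyone wants to check the math. This is just a back-of-the-envelope sketch: the 340/600 MHz TMDS limits and the 297/594 MHz pixel clocks are the published HDMI/CTA-861 figures, everything else is simplified.
    Code:
    # Why 4K@60 needs HDMI 2.0 for full chroma, while HDMI 1.4 manages
    # 4K@30 (4:4:4) or 4K@60 only with 4:2:0 subsampling.
    HDMI_1_4_MAX_TMDS_MHZ = 340   # max TMDS character rate, HDMI 1.4
    HDMI_2_0_MAX_TMDS_MHZ = 600   # max TMDS character rate, HDMI 2.0

    def needed_tmds_clock_mhz(pixel_clock_mhz, chroma="4:4:4", bpc=8):
        """Approximate TMDS clock a mode needs (8 bpc baseline)."""
        clock = pixel_clock_mhz * (bpc / 8)   # deep colour raises the clock
        if chroma == "4:2:0":
            clock /= 2                        # 4:2:0 halves the TMDS clock
        return clock

    MODES = {"3840x2160 @ 30 Hz": 297.0,      # CTA-861 pixel clocks in MHz
             "3840x2160 @ 60 Hz": 594.0}

    for name, pclk in MODES.items():
        for chroma in ("4:4:4", "4:2:0"):
            need = needed_tmds_clock_mhz(pclk, chroma)
            print(f"{name} {chroma}: ~{need:.0f} MHz "
                  f"-> HDMI 1.4 {'OK' if need <= HDMI_1_4_MAX_TMDS_MHZ else 'no'}, "
                  f"HDMI 2.0 {'OK' if need <= HDMI_2_0_MAX_TMDS_MHZ else 'no'}")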
     
    Last edited: Apr 11, 2015
  6. lantian

    lantian Guest

    Guys, neither of you even remotely answered the OP's question. He didn't ask about HDMI 2.0, what it supports, or what you are running; he asked whether the GeForce 9xx cards will support HDMI 2.0a with a driver update. The answer to that question is yes, most likely, since there is no reason for NVIDIA not to support it; it's a firmware/driver thing.
     
  7. maur0

    maur0 Master Guru

    Messages:
    943
    Likes Received:
    97
    GPU:
    Point of View GTX 570 1GB

    It depends on the use case; for playing Blu-ray you would not be losing information.

    Blu-ray movies are encoded as 4:2:0 NV12,
    the same chroma format that HDMI 1.4a uses for 4K at 60 Hz.

    So for movies there should be no loss, in theory at least, right?

    Is the next-generation 4K UHD Blu-ray encoded natively in 4:4:4, or does it keep the old 4:2:0 pattern?
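    A quick sanity check on the chroma math above, as a rough sketch (plain arithmetic on sample counts, nothing specific to any player or driver):
    Code:
    # Raw 8-bit frame sizes at 4K, just to show that a 4:2:0 (NV12) source
    # already carries only a quarter of the chroma samples, so sending it
    # over a 4:2:0 HDMI mode discards nothing extra, in theory.
    W, H = 3840, 2160

    def frame_bytes(chroma):
        luma = W * H                          # one Y sample per pixel
        if chroma == "4:4:4":
            return luma + 2 * W * H           # full-resolution Cb and Cr
        if chroma == "4:2:0":                 # NV12: Cb/Cr at half res both ways
            return luma + 2 * (W // 2) * (H // 2)
        raise ValueError(chroma)

    for c in ("4:4:4", "4:2:0"):
        print(f"{c}: {frame_bytes(c) / 2**20:.1f} MiB per frame")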
     
    Last edited: Apr 11, 2015
  8. zais

    zais Guest

    Messages:
    100
    Likes Received:
    0
    GPU:
    1070
    Displays should be able to update their digital-to-analog handling (via driver or firmware) in a PC mode, so that all tweaks could be set up via the GPU if needed. Or how about a display anti-aliasing mode, with an OS setting that applies it via the CPU?
     
    Last edited: Apr 11, 2015
  9. CDJay

    CDJay Guest

    Messages:
    136
    Likes Received:
    3
    GPU:
    NVIDIA GTX 1080 FE SLI
    The addition in 2.0a is HDR support. It would be great if this were exploited, but surely it's not an imminent concern?
     
  10. JulianBr

    JulianBr Member

    Messages:
    44
    Likes Received:
    5
    GPU:
    RX 5700 XT
    AMD has announced HDMI 2.0a support along with 10-bit + HDR support on their 2016 GPUs. It would be nice if NVIDIA could say something about 9xx series support for this too.

    My 960 is used in my HTPC, so it would be nice to know. Hopefully I don't have to buy a new one next year.
     

  11. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,665
    Likes Received:
    597
    GPU:
    RTX3090 GB GamingOC
    What's the difference between HDMI 2.0a and DisplayPort 1.3?
     
  12. nevcairiel

    nevcairiel Master Guru

    Messages:
    875
    Likes Received:
    369
    GPU:
    4090
    NVIDIA GPUs already support 10-bit output over HDMI, and HDR is just some fancy metadata transmitted with the image, so applications could potentially make use of that today, if screens existed, that is.
    AMD might try to make a big deal out of it because their current generation does not support HDMI 2.0 at all.

    We'll have to see how HDR pans out next year for gaming PCs and HTPCs alike, in general.
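    To put "fancy metadata" in concrete terms, this is roughly the static HDR10 payload that HDMI 2.0a carries in an InfoFrame (SMPTE ST 2086 mastering display values plus MaxCLL/MaxFALL from CTA-861.3). The field names below are descriptive placeholders, not the exact identifiers from any SDK, and the example values are purely illustrative.
    Code:
    from dataclasses import dataclass

    @dataclass
    class HDR10StaticMetadata:
        # Mastering display colour volume (SMPTE ST 2086)
        red_primary: tuple               # CIE xy chromaticity
        green_primary: tuple
        blue_primary: tuple
        white_point: tuple
        max_mastering_luminance: float   # cd/m^2 (nits)
        min_mastering_luminance: float   # cd/m^2
        # Content light levels (CTA-861.3)
        max_cll: int                     # brightest pixel in the stream, nits
        max_fall: int                    # brightest average frame, nits

    # Illustrative values for a BT.2020, 1000-nit HDR10 grade:
    example = HDR10StaticMetadata(
        red_primary=(0.708, 0.292), green_primary=(0.170, 0.797),
        blue_primary=(0.131, 0.046), white_point=(0.3127, 0.3290),
        max_mastering_luminance=1000.0, min_mastering_luminance=0.0001,
        max_cll=1000, max_fall=400)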
     
    Last edited: Dec 8, 2015
  13. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,930
    Likes Received:
    1,044
    GPU:
    RTX 4090
    Did you already buy an HDR-capable display? Because that's the only thing 2.0a has over 2.0.

    As for 10-bit: NVIDIA already supports up to 12 bits per channel on the HDMI output of Maxwell right now.
     
  14. OrdinaryOregano

    OrdinaryOregano Guest

    Messages:
    433
    Likes Received:
    6
    GPU:
    MSI 1080 Gaming X
    Why are you worried about that right now?

    HDR-capable displays are a rarity currently and will most likely take many years to become commonplace. Currently there isn't even a consensus on the brightness (nits) threshold for these displays, with proposals ranging from 1,000/1,500 nits all the way to 10,000 nits; everyone has their own idea, and the EU has its own idea of what should be allowed.

    It's going to be a good long time before HDR and Rec.2020 are commonplace.
     
  15. nullack

    nullack Guest

    Messages:
    20
    Likes Received:
    0
    GPU:
    Titan
    I don't agree. BT.2020 is already being done on existing high-end TVs, and it won't be long before it reaches mid-range TVs in 2016. 4K HEVC is already well established, and so is P010 GPU acceleration for 10-bit colour. The 960, released all the way back in January, does this. I run 4K 60 Hz at 12 bpc colour depth through HDMI 2.0 on my 960 with 4:2:0 chroma subsampling, no problems, and the 960 chews through even high-bitrate Main 10 L5.2 content at 127 Mbps.

    What we need is for NVIDIA to do an update for HDMI 2.0a on the 960 and 950. Then you would already have a full hardware decoder solution for Main 10 4K HEVC UHD video footage, even at 60 fps and UHD BD bitrates of 127 Mbps.
     

  16. OrdinaryOregano

    OrdinaryOregano Guest

    Messages:
    433
    Likes Received:
    6
    GPU:
    MSI 1080 Gaming X
    I'm not arguing about it 'existing' today; I'm arguing that it's going to be an enthusiast niche for a long time before it's everyday business where you don't have to 'worry' about specs. That goes for any display "tech" really: 4K, HDR, wide-gamut colour spaces, non-interpolated high framerates, higher bit depth.

    Right now you have to look specifically for most of these; that's not commonplace.
     
    Last edited: Dec 9, 2015
  17. nullack

    nullack Guest

    Messages:
    20
    Likes Received:
    0
    GPU:
    Titan
    Yeah, I see your point, mate. Most of the sheeple out there will be happy with crappy-bitrate 4K from Netflix or pirated stuff, and disc-based delivery is increasingly seen as old school. The reality of course is very different: the plain realities of 4K with HEVC mean that 127 Mbps wide-gamut colour content is "enthusiast", even though that's what is actually needed for a decent picture in HDR 4K.
     
  18. Bzzzz57

    Bzzzz57 Guest

    Messages:
    1
    Likes Received:
    0
    GPU:
    EVGA GeForce GTX 980
    GTX 980 and 4K HDR TV

    Hello,

    I am a poor beginner and I appeal to your expertise.
    I read your posts above, and I confess I do not understand everything.
    Could you please help me?

    I recently purchased a 4K OLED TV: LG 65EF950V.
    This TV supports HDR via USB and via HDMI 2.0a (HDCP 2.2), has a 10-bit panel, and is compatible with HEVC and VP9.

    At the same time I bought a new PC with an EVGA GeForce GTX 980 card to play games and watch movies.

    I chose the GTX 980 because I thought it was HDMI 2.0a compatible, so I could fully enjoy the 4K and HDR capabilities of my new TV.

    As soon as I got my TV, I downloaded some HDR videos to test it, but I was very disappointed by the pale colors my TV rendered.
    I then tested the same HDR videos by playing them directly from a USB 3.0 stick on a USB 3.0 port of my TV... and there, my TV automatically switched into HDR mode and the colors were superb.

    I should mention that I have activated HDR on the three HDMI ports of my TV.
    I tried changing (somewhat randomly) the settings of the GTX 980 card, but I don't see any difference, and my TV never goes into HDR mode when I watch the HDR videos (I use MPC-HC as my video player).

    Also, when I watch movies on my PC from Blu-ray ISOs with CyberLink PowerDVD 15, I don't find the result fantastic. :bang:

    My questions are:
    1- For HDR video, do you think the problem is due to the GTX 980 card itself or to a configuration problem (card or video player)? In either case, what do you recommend?

    2- What configuration of my GTX 980 card do you recommend to get the best result for watching movies on my TV and for playing video games?

    For info, my GTX 980 card supports:
    - RGB (8 or 12 bpc at 60 Hz)
    - YCbCr422 (8 or 12 bpc)
    - YCbCr444 (8 bpc at 60 Hz, or 8/10/12 bpc at 24 Hz)
    - YCbCr420 (8 or 12 bpc at 60 Hz)

    I can also select the output dynamic range:
    - Limited (16-235)
    - Full (0-255) but only in RGB

    Thank you beforehand. :)
    Best regards.
     
  19. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,750
    Likes Received:
    1,868
    GPU:
    EVGA 1070Ti Black
    If you're using HDMI, you have to set the output dynamic range to Full in the Change Resolution tab of the NVCP, or the colors are washed out, as the NVIDIA drivers don't set this correctly when connected over HDMI.
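    For anyone wondering what the washed-out look actually is, here is a quick sketch of the range mismatch (just the standard 16-235 vs 0-255 arithmetic, nothing driver-specific):
    Code:
    # If the GPU sends limited-range RGB (16-235) but the display treats it
    # as full range (0-255), black arrives as 16 and white as 235, so the
    # picture loses contrast. Fixing it means making both ends agree.
    def full_to_limited(v):
        """Compress full-range 0-255 into limited-range 16-235."""
        return round(16 + v * (235 - 16) / 255)

    def limited_to_full(v):
        """Expand limited-range 16-235 back to 0-255 (what the display should do)."""
        return round(max(0, min(255, (v - 16) * 255 / (235 - 16))))

    print(full_to_limited(0), full_to_limited(255))    # 16 235
    print(limited_to_full(16), limited_to_full(235))   # 0 255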
     
  20. VeeVee

    VeeVee Guest

    Messages:
    1
    Likes Received:
    0
    GPU:
    Geforce 770
    For HDR you will need HDMI 2.0a, which none of the current cards from NVIDIA or AMD supports. The GTX 980 only has HDMI 2.0.
     
    Last edited: Jan 8, 2016
