Are there any solutions to get 4K 60p 10-bit 4:4:4 (RGB) on an HDMI 2.1 TV (LG C9) from a PC right now? HDMI 2.0 does not have enough bandwidth to support 10-bit with 4:4:4 chroma at 4K 60p, as far as I know. Is there any adapter/cable which would convert the DisplayPort output of the GPU (1080 Ti in my case) to HDMI? I read somewhere that they are working on a DP 1.4 to HDMI 2.1 adapter, but no info since then.
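To see why HDMI 2.0 falls short, here's a rough bandwidth check. It assumes the standard CTA-861 2160p60 timing (4400x2250 total pixels including blanking) and HDMI's 8b/10b TMDS encoding, which leaves 80% of the raw 18 Gbps for pixel data:

```python
# Back-of-envelope check: does 4K60 10-bit RGB fit on HDMI 2.0?
# Assumed CTA-861 2160p60 timing: 4400 x 2250 total pixels -> 594 MHz pixel clock.
pixel_clock_hz = 4400 * 2250 * 60
bits_per_pixel = 3 * 10                 # RGB (4:4:4), 10 bits per channel
required_gbps = pixel_clock_hz * bits_per_pixel / 1e9

# HDMI 2.0: 18 Gbps raw TMDS; 8b/10b encoding leaves 80% for payload.
hdmi20_payload_gbps = 18.0 * 0.8        # 14.4 Gbps

print(f"required: {required_gbps:.2f} Gbps")            # 17.82 Gbps
print(f"HDMI 2.0 payload: {hdmi20_payload_gbps:.1f} Gbps")
print("fits on HDMI 2.0:", required_gbps <= hdmi20_payload_gbps)
```

17.82 Gbps needed vs. 14.4 Gbps available, so something has to give: drop to 8-bit, subsample chroma to 4:2:2/4:2:0, or move to HDMI 2.1.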
Realtek Demonstrates RTD2173 DisplayPort 1.4 to HDMI 2.1 Converter. Is this a quality solution? https://www.anandtech.com/show/1453...s-rtd2173-displayport-14-to-hdmi-21-converter
Fun fact: G-Sync now works through a passive DisplayPort->HDMI adapter (RTX 2060 with driver 450.12; the adapter is a $1 no-name Chinese gadget), although it's limited to 1080p60 RGB (the driver imposes a very low pixel clock limit for this kind of adapter). I am not aware of similar adapters with a higher bandwidth limit (HDMI 2.0-capable versions exist but can only be found integrated onto PCBs, not standalone as similar external adapters, and I don't know whether HDMI 2.1-capable versions exist at all). It would be interesting to check whether this works with Pascal cards (the original argument about the HDMI port limitations of Pascal doesn't hold for the DisplayPort). Edit: Huh! I managed to add 2160p60 as a custom resolution and it works (at least with "reduced blanking"; the image was fuzzy with "automatic" timings)!
It's an LG C9 (HDMI 2.1 VRR-compatible OLED TV with G-Sync Compatible certification). I had to use the "pixel clock patcher" to get above 1080p60. This is supposed to be a DP -> HDMI 1.x adapter, but it can handle 2160p60 RGB just fine. However, 2160p100 or 120 still results in a "No Input" error (maybe the TV's current firmware isn't ready for that, HDMI 2.1 hardware or not). I sold my GTX 1070 and bought an RTX 2060 because this didn't work the last time I tried: I remember checking it with the Turing card for fun while I still had the GTX 1070, and it didn't seem to work. That's why I think this is something new in recent (beta) drivers. I will be really mad about selling the GTX 1070 Sea Hawk EK X for a stupid air-cooled RTX 2060 if G-Sync now works on this TV with this adapter (and the 1070's DP output). ALLM works as well, but HDR10 doesn't.
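The reason "reduced blanking" helps through a bandwidth-limited adapter is that the pixel clock counts blanking pixels too, so shrinking the blanking interval lowers the clock for the same active resolution. A quick sketch (the CVT-RBv2 totals of 3920x2222 are approximate; the CTA-861 totals are standard):

```python
def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock = total (active + blanking) pixels per frame x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# Standard CTA-861 2160p60 timing carries a lot of blanking.
std = pixel_clock_mhz(4400, 2250, 60)   # 594.0 MHz
# CVT-RBv2 reduced blanking trims it (approximate totals).
rb = pixel_clock_mhz(3920, 2222, 60)    # ~522.6 MHz

print(f"standard: {std:.1f} MHz, reduced blanking: {rb:.1f} MHz")
```

That ~70 MHz difference can be exactly what decides whether a marginal adapter (or a patched driver limit) accepts the mode.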
No graphics card to date supports the full feature set of HDMI 2.1. When one does, I will be upgrading my HDTV and audio receiver to take advantage of 4K@120Hz.
Nvidia Turing cards have DisplayPort 1.4, which already supports 8K 60Hz and 4K 120Hz (with chroma subsampling). But there is no DisplayPort 1.4 to HDMI 2.1 adapter yet to connect it to an 8K TV. So you only need to wait for the adapter; you don't need an HDMI 2.1 card. An adapter should come very soon.
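The "with chroma subsampling" caveat follows from the numbers. Assuming DP 1.4 HBR3 at 32.4 Gbps raw with 8b/10b encoding (so 25.92 Gbps payload) and a CTA-style 2160p120 timing of 4400x2250 totals, full 10-bit RGB doesn't fit but 4:2:2 does:

```python
# DP 1.4 HBR3: 32.4 Gbps raw, 8b/10b leaves 80% for payload.
hbr3_payload_gbps = 32.4 * 0.8          # 25.92 Gbps

# Assumed CTA-style 2160p120 timing: 4400 x 2250 totals -> 1188 MHz.
pixel_clock_hz = 4400 * 2250 * 120

for name, bpp in [("RGB 10-bit", 30), ("4:2:2 10-bit", 20)]:
    need = pixel_clock_hz * bpp / 1e9
    print(f"{name}: {need:.2f} Gbps -> fits: {need <= hbr3_payload_gbps}")
```

RGB needs 35.64 Gbps, 4:2:2 only 23.76 Gbps, hence subsampling (or DSC compression) is required for 4K120 over plain DP 1.4.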
There is a DisplayPort 1.4 to HDMI 2.0b HDR active adapter from Club 3D that should work. It supports HBR2 transmission and 4:4:4 10-bit @ 60Hz: https://www.club-3d.com/en/detail/2442/displayportt_1.4_to_hdmit_2.0b_hdr_active_adapter/
I agree it's coming soon. I read somewhere between March 20, 2020 and some date in April 2020 (I forget the actual April date). Could be BS, but it's not just Club 3D putting out DisplayPort 1.4 to HDMI 2.1 adapters for 4K 120Hz over HDMI with G-Sync for VRR.
Well, whichever comes sooner I will likely pick up. The first piece I'm waiting for is a 5.1 audio receiver that passes the full 48 Gbps of the HDMI 2.1 spec for under $700. This GTX 1660 isn't going to drive 4K@120Hz very well anyway, which is why I'm waiting to see how the next graphics cards from Nvidia or AMD perform and whether they include HDMI 2.1 hardware support; then I won't need to worry about an adapter.
I haven't looked into it yet (I am used to running my HDMI 1.3 AVR on a separate output; I did this for years), but I think the VRR feature is optional, so make sure you pick an AVR with VRR if you insist on chaining them. With eARC it should also be possible to connect the AVR to the TV directly (rather than the PC, with a separate cable, not in a chain), but the LG firmware is not ready for that right now (it can't pass through multi-channel PCM from an HDMI input to the eARC port yet). So I would be looking for AVRs which are known to be eARC-compatible with the LG TVs (they seem to be picky, at least with AVRs from 2018).

Even just being able to run the desktop in 4K120 (with 4:2:2 at least, since the 4:4:4 PC mode has visible quality issues anyway: gradient banding, mainly on near-black) would be worth some money to me. This would also have the benefit of not having to switch the refresh rate between 60 and 24 Hz for movie playback (24*5=120, so no judder).

But any active adapter (or AVR, for that matter) could be problematic for G-Sync (I guess the adapter itself would need to be certified in order for the driver to automatically activate G-Sync, so it will need to be forced on manually, like it had to be for old pre-certification LG firmwares). It would be easier if the DisplayPort 1.4b outputs on the Turing cards could work in a nonstandard HDMI 2.1 alternate mode with passive adapters (with the bandwidth limited to DP 1.4 speeds but with HDMI 2.1 signalling). But I think that must wait until DisplayPort 2.x standardizes a new HDMI 2.1 alt-mode (if that ever happens). Even the current HDMI 2.0 alt-mode lacks HDR support for some reason (besides being limited to HDMI 1.x speeds without the unofficial limiter patch).
I used the Yamaha RX-V385 for over a year and apparently it has HDMI 2.1. Cheap and good for its price: https://www.avnirvana.com/threads/hdmi-2-1-officially-arrives-with-yamaha’s-rx-v385-av-receiver.2632/page-2
I agree, though the only reason I don't run my AVR on a separate output is that in Windows the AVR needed to be either cloned or extended alongside the display's output from the GPU in order to get audio. If the desktop was only set to show on the HDTV, and not on the AVR at the same time, no audio would be passed to the AVR. And with it cloned or extended, it would create framerate/image-composition issues in some programs without a display connected to the AVR. On top of that, trying to use 4K@60Hz 4:4:4 RGB with audio on just the HDTV I was testing with would cause image flickering unless I disabled audio to the TV. So that too made me just wait for something like HDMI 2.1 while the wife got to enjoy the new 4K TV.
I find a secondary (extended) display useful in general anyway, so I have an old little LCD monitor hooked up to the AVR (sitting on the floor). Sure, it gets annoying at times (and it even trips up some stupid DRM solutions, like Netflix UHD playback, which needs all displays to be HDCP 2.x compliant), but it works. I never knowingly experienced any issues with this setup other than having to rearrange the windows after turning either device on/off or changing their resolutions, etc. I also never experienced image flicker with audio enabled for the TV. Sometimes I watch YouTube talk shows with the TV's built-in speakers (although I never did that with G-Sync enabled; I don't care about the "wasted" idle electricity during winter, since it heats the room anyway, so everything is always on or in active standby...).
Hey guys, the Club3D adapters WILL NOT support G-Sync, FreeSync, or VRR. They said that because of some technical issues, they will not be supporting those technologies in the adapters! The updated Club3D catalog now says "Support for G-Sync, FreeSync and VRR bypass", which is quite misleading, as it mentions all the technologies but then says "bypass"... Also, the HDMI 2.1 name has been removed, as it is not HDMI 2.1 if there is no VRR...
HDMI 2.1 VRR works through passive DP->HDMI adapters on Turing cards, so the Nvidia driver doesn't restrict this kind of G-Sync Compatible functionality on the DP ports. Thus, it's up to the active adapter to make use of it if possible (and if the manufacturer intends to do so). I think it's possible, because the HDFury devs mentioned (a long time ago) that they managed to extract audio from HDMI 2.1 VRR links with their experimental processors. I guess we will see once this adapter is in the hands of some users. As for the naming: that's BS. VRR is an optional HDMI 2.1 feature; devices don't have to support it in order to claim HDMI 2.1 compatibility.