Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Dener de Paula Pereira, May 9, 2019.
Is an HDMI 2.1 cable worth buying? Why?
If the equipment you have can take advantage of the extra bandwidth an HDMI 2.1 cable gives you, then yes.
If the TV and card both support 2.1 then by all means yes. Also, getting 2.1 means the cable will be useful down the road.
If you plan on running anything above 4K60 8-bit (i.e. no native HDR), then an Ultra High Speed HDMI cable is necessary. Existing cables may work, but most likely won't: HDMI 2.1 introduces a new data channel that current cables were never tested against and may not carry such a data stream reliably.
They're saying there's no difference between HDMI cables, but that's not true at all. HDMI 1.4 does not support HDR, HDMI 2.0 can do 4K @ 60Hz with HDR, while HDMI 2.1 can do 4K @ 120Hz with HDR and also supports eARC. However... if you compare a cheap HDMI 2.0 cable to an expensive HDMI 2.0 cable (apples to apples), there's no difference in bandwidth or features. The more expensive cable may have extra layers of shielding or a braided jacket making it more durable, but that is all.
The article was right, past tense. HDMI 2.0 didn't need special cables. HDR didn't need special cables. ARC didn't need special cables. There were only two types of cable, HDMI "Standard" and HDMI "High Speed", and these didn't change from HDMI 1.3 to HDMI 2.0, or anywhere in between. (There is one exception: there are also HDMI cables with Ethernet, but that's not relevant for PCs either way.)
HDMI 2.1 however needs new cables, since they drastically increased the bandwidth. HDMI 2.1 cables are officially branded as "Ultra High Speed" or 48G.
Still not true.
These speed designations (Standard, High, Premium, Ultra) tell you what bandwidth the cable was tested for. Theoretically, any cable could later pass new tests which didn't even exist at the time of its initial manufacturing, certification and retail sale. Not every old cable will, but some certainly can.
ARC/Ethernet wasn't guaranteed to work in the 1.x era because not all cables used good enough wires for those lanes (those pins had negligible use, if any, before those features existed), hence the "with Ethernet" designation after they were introduced. But that doesn't mean there were no old cables that used good enough wires on those pins as well.
Some of the cables sold in the 1.3 era (perhaps never properly tested for High Speed certification, just sold "as is") will be fine for HDMI 2.1, while some cables sold in the 2.0 era (properly certified as Premium High Speed) won't. It comes down to sheer manufacturing quality and luck.
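To make the tier argument concrete, here's a minimal sketch. The lookup table and the helper name are mine, not from any HDMI tooling; the per-tier figures are the commonly cited certification bandwidths, and the Standard-tier number is approximate:

```python
# Hypothetical lookup of HDMI cable certification tiers and the
# bandwidth each tier is tested for (commonly cited figures; the
# Standard-tier number is approximate).
CABLE_TIERS_GBPS = {
    "Standard": 2.22,            # 720p/1080i-class signals
    "High Speed": 10.2,          # HDMI 1.3/1.4-era certification
    "Premium High Speed": 18.0,  # HDMI 2.0-era certification
    "Ultra High Speed": 48.0,    # HDMI 2.1, branded "48G"
}

def tiers_that_cover(required_gbps):
    """Return the tiers tested at or above the required bandwidth."""
    return [name for name, gbps in CABLE_TIERS_GBPS.items()
            if gbps >= required_gbps]
```

So a signal needing, say, 22.28 Gbps is only covered by the Ultra High Speed tier - an uncertified old cable might still carry it, but nothing was ever tested to promise that.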
All I said is true.
It's certainly possible that someone over-engineered a cable in the past and it may work, but if you're out buying a new HDMI cable, that's really not relevant. Either the manufacturer got their cable validated against the new specification (in which case, great), or you should not buy it, because buying HDMI cables does not have to be a gamble - that's what these classifications are all about. Get one validated for the highest spec you care about (i.e. Ultra High Speed for HDMI 2.1), and any manufacturer that carries this validation badge should be fine.
If you wish to keep using your existing cable, well, you can always gamble, but it's important to know that cable limitations can show up at any time and rather unpredictably. So I would urge anyone not to do that, and just get a properly certified cable if they are upgrading to HDMI 2.1.
So, I'm still thinking that all my games will be smoother after I've installed the cable.
Are the features of HDMI 2.1 (like VRR) enabled by default?
No, HDMI 2.1 is not available on any graphics card yet.
As a hardware engineer theorized to me recently, Turing has Pascal's SerDes, so it is not 2.1 capable. Reusing existing SerDes is done for economy reasons, and the 7nm shrink is the more logical point at which to prepare for and implement HDMI 2.1.
An active DP -> HDMI 2.1 adapter could be a good compromise for these cards (especially if it came out soon at a fair price - though neither of those seems probable).
In the end, these are just wires, insulation and shielding. The configuration of the four identical TMDS pairs has been basically the same from the beginning.
But I just looked it up: the new Club3D HDMI 2.1 cable claims to use tin-plated 30 AWG copper. The old ones I have at home (manufactured in the 1.3 era) claim to use silver-plated 27 AWG OFC (the only difference I care about here is the AWG, not the silver moniker or even the "oxygen free" part, although I didn't cut open either of them to check). And these happen to have pretty much the same retail price. So, I guess I would buy from the old ones yet again with this in mind. I am looking forward to trying the old ones with HDMI 2.1 out of curiosity.
Not really: HDMI 2.0 cannot do 4K 60Hz with HDR while maintaining 4:4:4 chroma subsampling. It has to drop down to either 30Hz, or 8-bit (no HDR), or 4:2:2 chroma subsampling.
HDMI 2.0 = 18 Gbps bandwidth
4K 60Hz 4:4:4 8-bit (no HDR) = 17.82 Gbps
4K 30Hz 4:4:4 10-bit HDR (no 60Hz) = 11.14 Gbps
4K 60Hz 4:2:2 10-bit HDR (no 4:4:4) = 17.82 Gbps
4K 60Hz 4:4:4 10-bit HDR = 22.28 Gbps
Will have to wait for HDMI 2.1 for the last one.
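Those figures fall out of the TMDS arithmetic. A quick sketch, assuming the standard CTA-861 4K timing of 4400x2250 total pixels per frame (blanking included) and 8b/10b coding on three data lanes:

```python
# Reproduce the bandwidth figures above. Over HDMI TMDS, every pixel
# clock tick sends one 10-bit character (8b/10b coding) on each of the
# 3 data lanes; deep color scales the clock by bits-per-component / 8.
def tmds_gbps(h_total, v_total, refresh_hz, bits_per_component):
    pixel_clock = h_total * v_total * refresh_hz       # Hz, blanking included
    char_clock = pixel_clock * bits_per_component / 8  # deep-color scaling
    return char_clock * 3 * 10 / 1e9                   # 3 lanes x 10 bits each

print(tmds_gbps(4400, 2250, 60, 8))    # 4K60 8-bit  -> ~17.82 Gbps
print(tmds_gbps(4400, 2250, 30, 10))   # 4K30 10-bit -> ~11.14 Gbps
print(tmds_gbps(4400, 2250, 60, 10))   # 4K60 10-bit -> ~22.28 Gbps
```

The 4:2:2 case lands at 17.82 Gbps because TMDS carries 10/12-bit 4:2:2 at the same character rate as 8-bit 4:4:4, with no deep-color clock increase.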
Incorrect. HDR10 works fine with 8-bit RGB at 2160p60. The GPU dithers its output well enough. Actually, my TV handles this dithered 8-bit better than native 10-bit in its "PC mode" (which is required to preserve the chroma resolution inside the TV's internal processor but comes at the price of lowered processing precision, aka "banding" - PC mode is fine for SDR, not so much for HDR).
I should have said 'proper HDR', not 8-bit dithered. That looks like crap on my TV/display; the banding kills it.
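For what it's worth, the trade-off both posts are describing can be shown with a toy example. This is only an illustration of the principle (random noise added before rounding to an 8-bit code), not what any GPU actually implements:

```python
import random

random.seed(0)

def quantize8(x, dither=False):
    """Quantize x in [0, 1] to an 8-bit code, optionally adding
    +/- 0.5 LSB of noise before rounding (a crude dither)."""
    noise = random.random() - 0.5 if dither else 0.0
    return max(0, min(255, round(x * 255 + noise)))

x = 0.3                       # a level that falls between two 8-bit codes
plain = quantize8(x) / 255    # always the same code -> a visible band
avg_dithered = sum(quantize8(x, dither=True) for _ in range(10_000)) / 10_000 / 255

# Averaged over area (or frames), the dithered signal tracks the true
# level far better than any single 8-bit code can, at the cost of noise
# - which is exactly the banding-vs-grain trade-off argued about here.
```

Whether the result looks fine or "like crap" then depends on how well the display preserves that fine-grained noise instead of smearing it back into bands.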
The problem is most folks here do not know you can screw up a 100Base-T connection just by untwisting the pairs too far from the termination point. You are correct that, among cables whose pin-out is correct for the interface, construction quality is what matters most for signal integrity. That is the baseline metric: signal integrity over distance.