  1. Dener de Paula Pereira

    Dener de Paula Pereira Member

    Messages:
    49
    Likes Received:
    4
    GPU:
    Vega 56 Pulse
    Is an HDMI 2.1 cable worth buying? Why?
     
  2. patteSatan

    patteSatan Active Member

    Messages:
    99
    Likes Received:
    21
    GPU:
    Sapphire R9 390
  3. anxious_f0x

    anxious_f0x Ancient Guru

    Messages:
    1,520
    Likes Received:
    268
    GPU:
    RTX 2080Ti SLi
    If the equipment you have can take advantage of the extra bandwidth an HDMI 2.1 cable gives you, then yes.
     
  4. Chastity

    Chastity Ancient Guru

    Messages:
    1,944
    Likes Received:
    479
    GPU:
    Nitro 390/GTX1070M
    If the TV and the card both support 2.1, then by all means yes. Also, getting a 2.1 cable means it will stay useful down the road.
     

  5. Valerys

    Valerys Master Guru

    Messages:
    367
    Likes Received:
    11
    GPU:
    MSI Aero GTX 1070Ti
    If you plan on using anything higher than 4K60 8-bit (non-native HDR), then an Ultra High Speed HDMI cable is necessary. Current cables may work, but most likely won't, since 2.1 introduces a new data channel (FRL signalling repurposes the former clock pair as a fourth data lane) which current cables haven't been tested with and may not carry reliably.
     
  6. RavenMaster

    RavenMaster Maha Guru

    Messages:
    1,060
    Likes Received:
    87
    GPU:
    1x RTX 2080Ti FE
    They're saying there's no difference between HDMI cables, but that's not true at all. HDMI 1.4 does not support HDR, HDMI 2.0 can do 4K @ 60 Hz with HDR, while HDMI 2.1 can do 4K 120 Hz with HDR and also supports eARC. However... if you compare a cheap HDMI 2.0 cable to an expensive HDMI 2.0 cable (apples to apples), then there's no difference in bandwidth or features. The more expensive cable may have more layers of shielding or a braided jacket making it more durable; that is all.
     
    Maddness and Blackfyre like this.
  7. nevcairiel

    nevcairiel Master Guru

    Messages:
    597
    Likes Received:
    187
    GPU:
    MSI 1080 Gaming X
    The article was right, past tense. HDMI 2.0 didn't need special cables. HDR didn't need special cables. ARC didn't need special cables. There were only two types of cable, HDMI "Standard" and HDMI "High Speed", and these didn't change from HDMI 1.3 to HDMI 2.0, or anywhere in between. (There is one exception: there are also HDMI cables with Ethernet, but that's not relevant for PCs either way.)

    HDMI 2.1, however, needs new cables, since the bandwidth was drastically increased. HDMI 2.1 cables are officially branded as "Ultra High Speed" or 48G.
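    To make those categories concrete, here is a minimal sketch (Python; the tier bandwidths are the nominal certification figures of roughly 2.2, 10.2, 18 and 48 Gbps, and the helper function is purely illustrative, not from any library) that picks the lowest certification tier covering a given signal:

    CABLE_TIERS_GBPS = {
        "Standard": 2.2,             # Category 1, ~1080i-class signals
        "High Speed": 10.2,          # Category 2, up to a 340 MHz TMDS clock
        "Premium High Speed": 18.0,  # the HDMI 2.0 certification programme
        "Ultra High Speed": 48.0,    # HDMI 2.1 "48G" cables
    }

    def minimum_tier(required_gbps: float) -> str:
        """Return the lowest certification tier that covers a signal."""
        for tier, capacity in CABLE_TIERS_GBPS.items():  # ascending order
            if capacity >= required_gbps:
                return tier
        raise ValueError("no certified cable tier carries this bandwidth")

    print(minimum_tier(17.82))  # 4K60 8-bit RGB  -> Premium High Speed
    print(minimum_tier(22.28))  # 4K60 10-bit RGB -> Ultra High Speed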
     
    alanm and Blackfyre like this.
  8. janos666

    janos666 Master Guru

    Messages:
    659
    Likes Received:
    43
    GPU:
    MSI GTX1070 SH EK X 8Gb
    Still not true.
    These Speed designations (Standard, High, Premium, Ultra) tell you what bandwidth the cable was tested for. Theoretically, any cable could later pass new tests which didn't even exist at the time of its manufacture, certification and retail sale. They won't necessarily pass, but they certainly can.
    ARC/Ethernet wasn't guaranteed to work in the 1.x era because not all cables used good enough wires for those lanes (which had negligible use, if any, before those features), hence the "with Ethernet" designation after they were introduced. But that doesn't mean there were no old cables with good enough wires between those pins as well.
    Some cables sold in the 1.3 era (perhaps never properly tested for the High Speed certification, just sold "as is") will be fine for HDMI 2.1, while some cables sold in the 2.0 era (properly certified as Premium High Speed) won't. It comes down to sheer manufacturing quality and luck.
     
  9. nevcairiel

    nevcairiel Master Guru

    Messages:
    597
    Likes Received:
    187
    GPU:
    MSI 1080 Gaming X
    All I said is true.
    It's certainly possible that someone over-engineered a cable in the past and that it may work; however, if you're out buying a new HDMI cable, that's really not relevant. Either the manufacturer got the cable validated against the new specification (in which case, great), or you should not buy it, because buying HDMI cables does not have to be a gamble; that's what these classifications are all about. Get one validated for the highest spec you care about (i.e. Ultra High Speed for HDMI 2.1), and any manufacturer that carries this validation badge should be fine.

    If you wish to keep using your existing cable, you can always gamble, but it's important to know that cable limitations can show up at any time, and rather unpredictably. So I would urge anyone not to do that, and to just get a properly certified cable when upgrading to HDMI 2.1.
     
    Last edited: May 13, 2019
  10. Dener de Paula Pereira

    Dener de Paula Pereira Member

    Messages:
    49
    Likes Received:
    4
    GPU:
    Vega 56 Pulse
    So, I still think all my games are smoother after I installed the cable.
    Were the features of HDMI 2.1 (like VRR) enabled by default?
     

  11. nevcairiel

    nevcairiel Master Guru

    Messages:
    597
    Likes Received:
    187
    GPU:
    MSI 1080 Gaming X
    No, HDMI 2.1 is not available on any graphics card yet.
     
  12. Astyanax

    Astyanax Ancient Guru

    Messages:
    3,283
    Likes Received:
    853
    GPU:
    GTX 1080ti
    As a hardware engineer theorised to me recently, Turing has Pascal's SerDes, so it is not 2.1 capable. Reusing existing SerDes is done for economy reasons, with the 7 nm shrink being the more logical point to prepare for and implement HDMI 2.1.
     
  13. janos666

    janos666 Master Guru

    Messages:
    659
    Likes Received:
    43
    GPU:
    MSI GTX1070 SH EK X 8Gb
    An active DP -> HDMI 2.1 adapter could be a good compromise for these cards (especially if it came out soon at a fair price, though neither of those seems probable).

    In the end, these are just wires, insulation and shielding. The configuration of the four identical TMDS pairs has been basically the same from the beginning.
    But I just looked it up: the new Club3D HDMI 2.1 cable claims to use tin-plated 30 AWG copper. The old ones I have at home (manufactured in the 1.3 era) claim to use silver-plated 27 AWG OFC (the only difference I care about here is the AWG, not the silver moniker or even the "oxygen free" part, although I didn't cut either of them open to check). And these happen to have pretty much the same retail price. So I guess I would buy from the old ones yet again with this in mind. I am looking forward to trying the old ones with HDMI 2.1 out of curiosity.
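    For a rough sense of what that AWG difference means, here is a minimal sketch (Python; it uses the standard AWG diameter formula and copper's room-temperature resistivity; DC resistance is only a proxy, since at HDMI's gigahertz rates skin effect and dielectric loss dominate, though the thicker-is-better trend is the same):

    import math

    COPPER_RESISTIVITY = 1.68e-8  # ohm-metres, at room temperature

    def awg_resistance_per_m(awg: int) -> float:
        """DC resistance per metre of solid copper wire of a given AWG."""
        diameter_m = 0.127e-3 * 92 ** ((36 - awg) / 39)  # standard AWG formula
        area_m2 = math.pi * (diameter_m / 2) ** 2
        return COPPER_RESISTIVITY / area_m2

    print(awg_resistance_per_m(27))  # ~0.16 ohm/m (the old 1.3-era cables)
    print(awg_resistance_per_m(30))  # ~0.33 ohm/m (the new Club3D cable)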
     
    Last edited: May 15, 2019
  14. alanm

    alanm Ancient Guru

    Messages:
    8,889
    Likes Received:
    1,268
    GPU:
    Asus 2080 Dual OC
    Not really. HDMI 2.0 cannot do 4K 60 Hz with HDR while maintaining 4:4:4 chroma subsampling. It has to drop down to either 30 Hz, 8-bit (no HDR), or 4:2:2 chroma subsampling.

    HDMI 2.0 = 18 Gbps bandwidth

    4K 60 Hz 4:4:4 8-bit (no HDR) = 17.82 Gbps

    4K 30 Hz 4:4:4 10-bit HDR (no 60 Hz) = 11.14 Gbps

    4K 60 Hz 4:2:2 10-bit HDR (no 4:4:4) = 17.82 Gbps

    4K 60 Hz 4:4:4 10-bit HDR = 22.28 Gbps

    Will have to wait for HDMI 2.1 for the last one.
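    Those figures all fall out of the TMDS arithmetic. A minimal sketch (Python; it assumes the standard CTA-861 timing for 3840x2160, i.e. a 4400x2250 total raster including blanking, TMDS 8b/10b coding with 10 wire bits per 8-bit symbol on each of 3 data channels, and the fact that HDMI carries 4:2:2 at the 8-bit pixel clock; the function name is just illustrative):

    H_TOTAL, V_TOTAL = 4400, 2250  # 3840x2160 active plus blanking

    def tmds_gbps(refresh_hz: int, bits_per_component: int, chroma: str) -> float:
        """Approximate TMDS bandwidth for a 4K mode, in Gbps."""
        pixel_clock_hz = H_TOTAL * V_TOTAL * refresh_hz  # 594 MHz at 60 Hz
        if chroma == "4:2:2":
            clock_ratio = 1.0  # 4:2:2 (up to 12-bit) packs at the 8-bit clock
        else:  # 4:4:4 / RGB: deeper colour raises the TMDS clock
            clock_ratio = bits_per_component / 8
        return pixel_clock_hz * clock_ratio * 3 * 10 / 1e9

    print(tmds_gbps(60,  8, "4:4:4"))  # 17.82 -> fits in HDMI 2.0's 18 Gbps
    print(tmds_gbps(30, 10, "4:4:4"))  # 11.14 (rounded)
    print(tmds_gbps(60, 10, "4:2:2"))  # 17.82
    print(tmds_gbps(60, 10, "4:4:4"))  # 22.28 (rounded) -> needs HDMI 2.1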
     
  15. janos666

    janos666 Master Guru

    Messages:
    659
    Likes Received:
    43
    GPU:
    MSI GTX1070 SH EK X 8Gb
    Incorrect. HDR10 works fine with 8-bit RGB at 2160p60. The GPU dithers its output well enough. Actually, my TV handles this dithered 8-bit better than native 10-bit in its "PC mode" (which is required to preserve the chroma resolution inside the TV's internal processor, but comes at the price of lowered processing precision, aka "banding"; PC mode is fine for SDR, not so much for HDR).
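    To see why dithering works, here is a minimal sketch (Python with NumPy; the ramp values and names are made up for illustration) where a shallow 10-bit gradient is cut down to 8-bit, once by plain truncation, which leaves visible bands, and once with random dither, which averages back to the original levels:

    import numpy as np

    rng = np.random.default_rng(0)
    gradient_10bit = np.linspace(512, 520, 1920)  # a shallow 10-bit ramp

    truncated = np.floor(gradient_10bit / 4)  # plain 10->8-bit cut: banding
    dithered = np.floor(gradient_10bit / 4 + rng.random(1920))  # noisy rounding

    print(np.unique(truncated).size)  # 3 distinct levels, i.e. visible bands
    # dithering is unbiased: the local average recovers the 10-bit values
    print(abs(dithered.mean() - (gradient_10bit / 4).mean()) < 0.05)  # True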
     

  16. alanm

    alanm Ancient Guru

    Messages:
    8,889
    Likes Received:
    1,268
    GPU:
    Asus 2080 Dual OC
    I should have said 'proper HDR', not 8-bit dithered. It looks like crap on my TV/display; the banding kills it.
     
  17. HeavyHemi

    HeavyHemi Ancient Guru

    Messages:
    6,263
    Likes Received:
    592
    GPU:
    GTX1080Ti
    The problem is most folks here do not know you can screw up a 100Base-T connection just by untwisting the pairs too far from the termination point. You are correct that, among cables with the correct pin-out for the interface, construction quality is what matters most for signal integrity. That is the baseline metric: signal integrity over distance.
     
