AMD Radeon Fury X doesn't have HDMI 2.0 support

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 18, 2015.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    41,043
    Likes Received:
    9,324
    GPU:
    AMD | NVIDIA
    So if you had a peek at everything presented over the past few days, one thing you will have noticed: the new architecture doesn't support HDMI 2.0, but instead uses HDMI 1.4a. This means with F...

    AMD Radeon Fury X doesn't have HDMI 2.0 support
     
  2. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
    AMD promotes Adaptive-Sync; once they get it into the HDMI standard, or hack it in, they'll bring their cards up to that standard.
    I personally preferred DL-DVI over HDMI, since image quality is the same and I use separate sound.
    (And I was not happy that DP did not manage to kill HDMI, but DP is the interface that matters to me now.)
    DP all the way; if it can do 1440p @ 144Hz, it is good enough.
     
  3. pimp_gimp

    pimp_gimp Ancient Guru

    Messages:
    6,633
    Likes Received:
    33
    GPU:
    RTX 2080 Super SLI
    You do realize that even if DP had managed to kill HDMI, there would be many upset users. Many people who use an HDTV as a monitor (or second screen) probably still connect to an older 1080p set that is HDMI only (or VGA/DVI). DisplayPort is only supported on some UHD 4K screens. Just look at the AMD Fury thread and see how many are upset that the card lacks DVI, which is even older than HDMI. Just because you don't use it doesn't mean the need or want for it is gone.
     
    Last edited: Jun 18, 2015
  4. WhiteLightning

    WhiteLightning Don Illuminati Staff Member

    Messages:
    28,892
    Likes Received:
    1,734
    GPU:
    GTX1070 iChillx4
    30Hz? lol, that's crap
     

  5. raptor15sc

    raptor15sc Member

    Messages:
    15
    Likes Received:
    0
    GPU:
    G.SKILL Ripjaws X 64GB
    I think this is a really bad move on AMD's part.

    Look at the sales charts. There are A LOT of 4K TVs being sold right now, and most of the 4K content is on PC, not on 4K Blu-Ray or Smart TV networks yet.

    Watching 60Hz content at 4K 30Hz on HDMI 1.4 is crappy, I know from experience.

    I bought a GTX 980 for the PC attached to my Vizio P702ui-B3 and it's great. Controller based games that aren't very demanding look great. Video looks great. If the Fury X was out when I got it, I probably would have bought it instead if it had HDMI 2.0 (Dust is a problem in media center PCs and AMD's radiator paired with an easily accessible filter would have been a great solution).

    I wonder how much cost is saved by not using HDMI 2.0 -- I bet lost revenue is more.
     
  6. rl66

    rl66 Ancient Guru

    Messages:
    2,901
    Likes Received:
    366
    GPU:
    Sapphire RX 580X SE
    DP is very versatile and does everything DVI does (DL-DVI and DVI-I included), but I agree with those who are upset that there is no HDMI 2.0. This card is the flagship of the brand, not the middle segment; those who will buy it are very precise about the specs they want.

    Lots of people play on their TVs, which don't have DP (and are less expensive than computer monitors at these resolutions, only the refresh rates are much lower), so they can only use HDMI.

    Btw, I was expecting the same from NVIDIA.
     
  7. nohdmi2

    nohdmi2 Banned

    Messages:
    2
    Likes Received:
    0
    GPU:
    16
    The right question is: which video card has true HDMI 2.0?

    Only one problem: there are no graphics cards with full HDMI 2.0 in 2015.

    AMD Radeon Fury X doesn't have HDMI 2.0 support

    Everything NVIDIA shows is cheating, with false and fake tests:
    no true HDMI 2.0 support
    no 18 Gbps support
    to process 18 Gbps you need 1.8 times the power

    NVIDIA has sold tens of thousands of video cards to customers with the promise of HDMI 2.0.
    It's a lie!

    They do not have HDMI 2.0.

    NVIDIA
    does not support DCI-P3, because they do not have enough bandwidth, only 10.2 Gbps like HDMI 1.4
    does not support HDR, because they do not have enough bandwidth, only 10.2 Gbps like HDMI 1.4

    Some people brought up tests claiming NVIDIA supports HDMI 2.0; those are the wrong tests,
    because they do not test bandwidth.
     
  8. (.)(.)

    (.)(.) Banned

    Messages:
    9,094
    Likes Received:
    0
    GPU:
    GTX 970
    Even the GTX 960 has HDMI 2.0:
    http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-960/specifications
     
  9. Spets

    Spets Ancient Guru

    Messages:
    3,077
    Likes Received:
    176
    GPU:
    RTX 3090
    AMD just keeps on making bad decisions.
     
  10. Ourasi

    Ourasi Master Guru

    Messages:
    294
    Likes Received:
    7
    GPU:
    MSI RX 480 Gaming X 8GB
    A BizLink DisplayPort-to-HDMI 2.0 4K@60Hz adapter was shown at CES, so if you absolutely need HDMI 2.0 because you already have a 4K TV set with HDMI 2.0, there is a way, or at least there will be very soon.

    The total mess-up of the HDMI 2.0 standard is horrific: complaints everywhere, uncompressed color formats failing with HDCP 2.2, etc. I'll never buy a UHD TV without DisplayPort, ever.
     

  11. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
    I think that is what he is trying to say. His statement, shortened:
    NVIDIA states that its cards have HDMI 2.0, but in reality they do not have all the HDMI 2.0 capabilities needed to drive every HDMI 2.0 feature.
    (Not that it would be the first time NVIDIA declared something that is not quite true. Even the DX12_1 feature set has to be tested first to see whether it is real hardware support or just a driver-level software implementation.)

    But I think that by the time DX12_1 goes live in games, even the Fury X / Titan X will be kind of weak.
     
    Last edited: Jun 18, 2015
  12. nevcairiel

    nevcairiel Master Guru

    Messages:
    751
    Likes Received:
    290
    GPU:
    3090
    No GPU ever supported all HDMI features. CEC anyone?

    Yet NVIDIA does 4K @ 60Hz (at 4:4:4, of course) on the 900 series, which is the important part, and contrary to what that guy says, it requires about 12 Gbps of bandwidth, which wouldn't work over an HDMI 1.4 link (which only does 10.2 Gbps).
    Many professional review sites have confirmed this working, so don't take the word of some anonymous online troll for it.
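    [Editor's note: the bandwidth figures being argued over here can be sanity-checked with a back-of-envelope calculation. The sketch below counts only active pixel data, ignoring blanking intervals and TMDS 8b/10b encoding overhead, so the real link requirement is somewhat higher; the 10.2 and 18 Gbps limits are the nominal total TMDS throughputs of HDMI 1.4 and 2.0.]

    ```python
    def video_data_rate_gbps(width, height, hz, bits_per_pixel):
        """Raw active-picture data rate in Gbps (no blanking, no encoding overhead)."""
        return width * height * hz * bits_per_pixel / 1e9

    # 4K @ 60Hz with 8 bits per channel
    rate_444 = video_data_rate_gbps(3840, 2160, 60, 24)  # 4:4:4 -> 24 bits/pixel
    rate_420 = video_data_rate_gbps(3840, 2160, 60, 12)  # 4:2:0 -> 12 bits/pixel

    HDMI_14_GBPS = 10.2  # HDMI 1.4 nominal TMDS throughput
    HDMI_20_GBPS = 18.0  # HDMI 2.0 nominal TMDS throughput

    print(f"4K60 4:4:4 needs ~{rate_444:.1f} Gbps; fits HDMI 1.4: {rate_444 < HDMI_14_GBPS}")
    print(f"4K60 4:2:0 needs ~{rate_420:.1f} Gbps; fits HDMI 1.4: {rate_420 < HDMI_14_GBPS}")
    ```

    This is why 4K@60 at full 4:4:4 (~12 Gbps of pixel data) cannot fit in HDMI 1.4's 10.2 Gbps, while the 4:2:0 trick (~6 Gbps) can.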
     
    Last edited: Jun 18, 2015
  13. CDJay

    CDJay Member Guru

    Messages:
    133
    Likes Received:
    3
    GPU:
    NVIDIA GTX 1080 FE SLI
    That is completely and utterly *insane*.

    Good luck getting that into any Steam Boxes or HTPCs.
     
  14. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
    There are things and things. Those DVI/HDMI links can be overclocked a bit, and NVIDIA apparently used this in the past:
    Tom's Hardware
    AMD has 4:2:0 available too if someone wants to use it; I never did.
     
  15. rl66

    rl66 Ancient Guru

    Messages:
    2,901
    Likes Received:
    366
    GPU:
    Sapphire RX 580X SE
    ... of course, if you read ONLY the AMD tests...

    On the professional side it has supported 4K since shortly after release; on the normal consumer side, as already stated, even the 960 is capable (even though, as a middle/low-segment card, it probably isn't intended for it, and I guess it would be horrible for anything except video playback)...

    About power and bandwidth, you haven't got the right info...
     

  16. AlmondMan

    AlmondMan Master Guru

    Messages:
    728
    Likes Received:
    148
    GPU:
    5700 XT Red Devil
    Buy a dongle and win?
     
  17. leszy

    leszy Master Guru

    Messages:
    325
    Likes Received:
    17
    GPU:
    Sapphire V64 LC
    "In any case, with 4:2:0 4K TVs already on the market, NVIDIA has confirmed that they are enabling 4:2:0 4K output on Kepler cards with their R340 drivers. What this means is that Kepler cards can drive 4:2:0 4K TVs at 60Hz today, but they are doing so in a manner that’s only useful for video. For HTPCs this ends up being a good compromise and as far as we can gather this is a clever move on NVIDIA’s part. But for anyone who is seeing the news of NVIDIA supporting 4K@60Hz over HDMI and hoping to use a TV as a desktop monitor, this will still come up short. Until the next generation of video cards and TVs hit the market with full HDMI 2.0 support (4:4:4 and/or RGB), DisplayPort 1.2 will remain the only way to transmit a full resolution 4K image."
    http://www.anandtech.com/show/8191/nvidia-kepler-cards-get-hdmi-4k60hz-support-kind-of
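    [Editor's note: the Anandtech quote turns on what 4:2:0 chroma subsampling actually drops. A hedged sketch of the arithmetic: in Y'CbCr, 4:2:2 halves the two chroma planes horizontally and 4:2:0 halves them both ways, which is why 4:2:0 carries exactly half the data of full 4:4:4 — acceptable for video, but it blurs fine colored detail like desktop text.]

    ```python
    def frame_bytes(width, height, scheme):
        """Bytes per Y'CbCr frame at 8 bits per sample under a subsampling scheme."""
        luma = width * height                    # one Y sample per pixel
        chroma_per_plane = {
            "4:4:4": width * height,             # full-resolution Cb and Cr
            "4:2:2": width * height // 2,        # chroma halved horizontally
            "4:2:0": width * height // 4,        # chroma halved both ways
        }[scheme]
        return luma + 2 * chroma_per_plane       # Y plane + Cb plane + Cr plane

    for scheme in ("4:4:4", "4:2:2", "4:2:0"):
        mib = frame_bytes(3840, 2160, scheme) / 2**20
        print(f"{scheme}: {mib:.1f} MiB per 4K frame")
    ```

    The halved frame size is what lets 4K@60 4:2:0 squeeze into an HDMI 1.4 link at all.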
     
  18. Oversemper

    Oversemper Member

    Messages:
    36
    Likes Received:
    20
    GPU:
    Radeon 7
    Then what about the AMD Quantum, where "we’re talking 60 to 90 fps in all games when played at 4K (3840*2160) resolution", and which has the Fury X2? They promote it as a living-room set, right?

    Also, maybe Fury's DisplayPort can send (via an adapter) a 4K@60 4:4:4 signal that can be correctly recognized by a TV?
     
  19. KissSh0t

    KissSh0t Ancient Guru

    Messages:
    9,077
    Likes Received:
    3,037
    GPU:
    ASUS RX 470 Strix
    The Fury X is an AMD-manufactured card, right? What about MSI / Gigabyte / XFX and the others I can't recall off the top of my head? When they do their own versions, maybe they will put HDMI 2.0 on their cards?

    Or won't there be other manufacturers making this one?
     
  20. NamelesONEMail

    NamelesONEMail Member

    Messages:
    35
    Likes Received:
    1
    GPU:
    Zotac 1080Ti Extreme
    Lots of you people don't even come close to having the GPU power to drive 4K; some of you can't even get 30 fps in games 1-2 years old at any decent quality settings, yet all I see is people complaining about not getting 60Hz over HDMI on these cards.
    Where's all this elitist c**p coming from? Why should they bother implementing something that less than 0.1% of people use (and I'm sure it's much less than 0.1%)? If a product is good for 99.9% of people, that's all that matters; those 0.1% don't matter.
    The majority of people who buy a 4K TV buy it to watch TV (Blu-ray upscaled looks great), not to connect it to a PC, so I doubt AMD cares much about the few who actually DO connect them to PCs.
    I seriously doubt these cards will be weak by any stretch of the imagination. I have a GTX 780 that drives most games maxed out at 1440p above 45-50 fps, and it's a fairly old card at this point.
    If 700+ euro cards were considered weak 1-2 years down the road, there would be little reason to buy one unless you were swimming in cash; it would be much better to get a cheaper card and upgrade more often.
     