HDTV used as monitor

Discussion in 'The HTPC, HDTV & Ultra High Definition section' started by jodokast, Jun 21, 2011.

  1. jodokast

    jodokast Guest

    Messages:
    32
    Likes Received:
    0
    GPU:
    ASUS DirectCU DC2OC-R9290
    I'm thinking about using my Samsung LCD as a monitor; the model number is LN46A650. I'm curious about which cable is best to use. The manual says I must use a DVI-to-HDMI cable in the HDMI2 terminal, but then I would have to use a separate cable for audio. Does DVI-to-HDMI have any advantage, or should I just use HDMI?
     
  2. JaylumX

    JaylumX Master Guru

    Messages:
    614
    Likes Received:
    41
    GPU:
    MSI 3080 TI 12G
    My reply to a similar thread: I use my TV as a monitor and connect my Panasonic TX-L32E30B to my Nvidia 470 via HDMI.

     
    Last edited: Jun 21, 2011
  3. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,730
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    Your post is good.
    It needs to be clear that DVI also supports all the enhanced pixel formats etc. As you said, it is identical to HDMI in that respect, but the maximum resolution is limited when using a DVI-to-HDMI cable.
    Dual-link DVI to dual-link DVI gets around this if HDMI 1.4 isn't available.
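    A rough way to see why that single-link limit bites (a sketch only; the 165/330/340 MHz clock ceilings and the 2200x1125 CEA 1080p raster are the commonly quoted figures, not something measured in this thread):

        # Which link types can carry 1080p at 60 Hz vs 120 Hz, going by pixel clock.
        LINK_LIMITS_MHZ = {
            "single-link DVI / DVI-to-HDMI cable": 165.0,
            "dual-link DVI": 330.0,
            "HDMI 1.3/1.4 (340 MHz TMDS)": 340.0,
        }

        def pixel_clock_mhz(h_total, v_total, refresh_hz):
            """Pixel clock in MHz for a given total raster and refresh rate."""
            return h_total * v_total * refresh_hz / 1e6

        for refresh in (60, 120):
            clock = pixel_clock_mhz(2200, 1125, refresh)  # CEA 1080p raster incl. blanking
            print(f"1080p @ {refresh} Hz needs ~{clock:.1f} MHz pixel clock")
            for link, limit in LINK_LIMITS_MHZ.items():
                print(f"  {link}: {'OK' if clock <= limit else 'too fast'}")

    (1080p60 comes out around 148.5 MHz, which fits single-link; 1080p120 needs roughly 297 MHz, which is why dual-link DVI or HDMI 1.4 comes into it.)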

    OP:
    If you had a 120 Hz TV you would need HDMI 1.4, or dual-link DVI to dual-link DVI (normally only present on monitors), to use all the features.
    As you have a 60 Hz TV, just about any old HDMI or DVI-to-HDMI cable will work fine.
    (For longer runs, 5 m and up, get a known-decent cable; no need to pay silly money, though.)
    If you use DVI from your video card, you need to feed separate audio to the TV/amp.
    If you use HDMI from your video card, you can tell your PC to use HDMI audio and it will be fed directly to the TV through the HDMI cable.


    Note:
    The 120 Hz part of your TV is frame interpolation, i.e. it inserts an extra frame between each pair of 60 Hz frames, a hybrid of the previous and next frames.
    The TV won't accept a 120 Hz input.
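    To illustrate the "hybrid frame" idea only (a toy numpy sketch; real TV interpolators use motion estimation rather than a plain 50/50 blend):

        import numpy as np

        def interpolate(frames):
            """Insert a 50/50 blend between each pair of consecutive frames,
            roughly turning a 60 Hz sequence into a 120 Hz one."""
            out = []
            for prev, nxt in zip(frames[:-1], frames[1:]):
                out.append(prev)
                hybrid = ((prev.astype(np.float32) + nxt.astype(np.float32)) / 2).astype(np.uint8)
                out.append(hybrid)  # the extra in-between frame
            out.append(frames[-1])
            return out

        # Two dummy greyscale 1080p frames stand in for the 60 Hz input.
        frames_60 = [np.zeros((1080, 1920), np.uint8), np.full((1080, 1920), 255, np.uint8)]
        print(len(frames_60), "->", len(interpolate(frames_60)), "frames")  # 2 -> 3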
     
    Last edited: Jun 21, 2011
  4. Sever

    Sever Ancient Guru

    Messages:
    4,825
    Likes Received:
    0
    GPU:
    Galaxy 3GB 660TI
    I think you kinda contradicted yourself a little there.

    For 120 Hz frame interpolation, he doesn't need HDMI 1.4 support. As you said, the TV only accepts 60 Hz input and the interpolation is done by the processor inside the TV. As a result, HDMI 1.3 is fine.

    In my case, I'm running HDMI 1.3b from my 560 Ti SOC to my Panasonic 37-inch IPS panel TV. I can still enable the 120 Hz frame interpolation, but I generally choose not to, as it seems to introduce more artifacts than having my computer run the interpolation. My TV also has no issues with RGB and YCbCr, but my receiver only supports RGB input, so I've stuck with RGB.
     
    Last edited: Jun 22, 2011

  5. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,730
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    I guess you only read part of my post.
    You quoted the bit you missed ;)
    I also said
    I didn't say he needs anything other than a standard HDMI connection.
     
  6. Sever

    Sever Ancient Guru

    Messages:
    4,825
    Likes Received:
    0
    GPU:
    Galaxy 3GB 660TI
    I didn't miss that part; I guess it's just the phrasing that sounded odd to me after working a 20-hour day.

    When you said he would need HDMI 1.4 or dual-link DVI for a 120 Hz TV, that's not necessarily true, as 120 Hz TVs still only accept 60 Hz input, so HDMI 1.3 is fine.
     
  7. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,730
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    Touché :)
    He won't get some of the main benefits of the display with HDMI 1.3, but yes, you are right.
     
  8. TruMutton_200Hz

    TruMutton_200Hz Guest

    Messages:
    2,760
    Likes Received:
    1
    GPU:
    Iris Xe
    This is slightly incorrect.

    Using a special type of DVI adapter (which usually comes bundled with graphics cards), it's possible for a DVI output to carry an audio signal (there are also DVI cables to be found that eliminate the need for the adapter).

    The DVI spec does provide for colour formats higher than 8-bit, albeit only over Dual-Link DVI. Some high-end visualization products (from Barco, for example) are capable of using 12-bit twin DVI. The HDMI spec also provides for dual link, albeit only with the HDMI "Type B" connector. Therefore, in theory it's even possible to use "HDMI deep colour" via Dual-Link DVI (but AFAIK the Type B connector has never been implemented).

    Quite obviously, Dual-Link DVI can support stereoscopic 3D as well. As a matter of fact, HDMI 1.4a supports 3D with 60 Hz input signals (60 Hz per eye, that is) only at up to 720p resolution (at 1080p it's actually limited to 24 Hz), whereas Dual-Link DVI does not suffer from this limitation.
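    For a rough sense of how those limits line up (a back-of-envelope sketch treating 3D as two full frames per stereo pair and reusing the standard CEA rasters; these are not the exact HDMI frame-packing timings):

        # Approximate pixel rates for the stereo modes mentioned above.
        RASTERS = {"720p": (1650, 750), "1080p": (2200, 1125)}  # totals incl. blanking

        def stereo_clock_mhz(mode, hz_per_eye):
            h, v = RASTERS[mode]
            return h * v * hz_per_eye * 2 / 1e6  # x2: one full frame per eye

        for mode, hz in [("720p", 60), ("1080p", 24), ("1080p", 60)]:
            print(f"3D {mode} @ {hz} Hz per eye: ~{stereo_clock_mhz(mode, hz):.0f} MHz")

        # ~149 MHz and ~119 MHz for the first two (well within HDMI 1.4a's reach),
        # ~297 MHz for 1080p at 60 Hz per eye, which is dual-link DVI territory.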
     
    Last edited: Jun 23, 2011
  9. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,730
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    The maddest thing is that HDMI 1.4 has the bandwidth to handle 1080p at 120 Hz with 24-bit colour.
    It is specced to do 1080p at 60 Hz with 48-bit colour, which is the same bandwidth.
    Such an unnecessary limit.
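    A quick check of that arithmetic (a sketch using the 2200x1125 CEA raster and ignoring TMDS coding overhead):

        def video_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
            """Raw video bandwidth in Gbit/s for a given raster, refresh and colour depth."""
            return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

        print(f"1080p @ 120 Hz, 24-bit: {video_gbps(2200, 1125, 120, 24):.2f} Gbit/s")
        print(f"1080p @  60 Hz, 48-bit: {video_gbps(2200, 1125, 60, 48):.2f} Gbit/s")
        # both come out at ~7.13 Gbit/s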
     
  10. Sever

    Sever Ancient Guru

    Messages:
    4,825
    Likes Received:
    0
    GPU:
    Galaxy 3GB 660TI
    My guess is they won't bump up to HDMI 1.5 until something new comes into play. I don't think the Wii U console will introduce anything worthy of an HDMI revision update, but here's hoping.

    It sucks that the HDMI 1.4 standard doesn't allow for 1080p60 3D. If it did, I would jump onto the Nvidia 3DTV Play bandwagon.
     

  11. TruMutton_200Hz

    TruMutton_200Hz Guest

    Messages:
    2,760
    Likes Received:
    1
    GPU:
    Iris Xe
    I guess if it did support that, nobody would still want to buy a 3D-capable Sony PS3. lol
     
  12. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,730
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    It defies logic that they would do a worldwide re-release of 3D and not support upcoming standards that will really show it off, when it has the ability.
    It's practically as simple as adding a new table at both Tx and Rx to map the signals.

    Perhaps it will come in a new revision of 1.4 when 120 Hz input TVs become mainstream.
    Fingers crossed.
     
    Last edited: Jun 23, 2011
  13. TruMutton_200Hz

    TruMutton_200Hz Guest

    Messages:
    2,760
    Likes Received:
    1
    GPU:
    Iris Xe
    Fixed! xD
     
  14. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,730
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    :)
    I just thought of another issue.
    Newer 120 Hz HDMI 1.4 TVs will support the new standard if it emerges; older 120 Hz HDMI 1.4 TVs will need a BIOS flash.
    Older TVs probably support dual-link DVI to get at the 120 Hz, so it won't matter for PC use unless dual-link DVI isn't available.
     
    Last edited: Jun 23, 2011
  15. TruMutton_200Hz

    TruMutton_200Hz Guest

    Messages:
    2,760
    Likes Received:
    1
    GPU:
    Iris Xe
    You mean an EDID flash? Flashing it yourself is generally not recommended: even if you're sure you've downloaded the correct EDID update (for the correct model, from the manufacturer's official website), the EEPROM chip itself still has a slight risk of dying during the flash procedure. There might also be a power failure (or a failure of the storage medium needed for the procedure, such as a USB stick) causing the flash to fail and corrupt the EDID data stored on the chip. If that happens, the TV has to be sent in for repair. Flashing it yourself also may or may not void the TV's warranty, by the way.

    To avoid having to send in the TV for an EDID flash, as an alternative fix you could override the EDID in Windows (if the problem only manifests itself on a PC, of course).
    http://msdn.microsoft.com/en-us/windows/hardware/gg487330
    On Nvidia cards, however, AFAIK only the base EDID can be overridden and not the CEA extension block(s), whereas ATI Radeon cards do not suffer from this limitation. I think the limitation is in the Nvidia driver software (yeah, it's not always only ATI drivers that suck; the ones from Nvidia can equally suck IMHO).

    To override the EDID with modded EDID data, you obviously need to capture the EDID you want to mod before you can start modding it. Nvidia GTX 4xx cards lack the ability to let you capture EDID data (using the moninfo freeware from EnTech Taiwan, for example), unlike all other cards AFAIK (including the GTX 5xx). That is, they do allow you to capture EDID data if they have an HDMI connector (because DVI does not support capturing EDID data either, IIRC). Moreover, long HDMI cables have often been reported to cause errors when capturing EDID data (even though the same long cables have been reported not to suffer from any other problems whatsoever).
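    If you do go the capture-and-mod route, the 128-byte base block is easy to sanity-check before and after editing. A minimal sketch, assuming the capture has been saved as a binary file (the filename here is just an example):

        EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

        def check_edid(block: bytes) -> None:
            """Basic structural checks on a 128-byte EDID base block."""
            assert len(block) == 128, "base block must be exactly 128 bytes"
            if block[:8] != EDID_HEADER:
                print("warning: missing the standard EDID header signature")
            if sum(block) % 256 != 0:
                print("warning: bad checksum (byte 127 must make the sum a multiple of 256)")
            else:
                print("block looks structurally OK")

        def fix_checksum(block: bytearray) -> bytearray:
            """Recompute byte 127 after modding any of bytes 0..126."""
            block[127] = (-sum(block[:127])) % 256
            return block

        with open("captured_edid.bin", "rb") as f:   # hypothetical capture file
            edid = bytearray(f.read()[:128])         # base block only, not the CEA extensions
        check_edid(bytes(fix_checksum(edid)))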

    There are still various reasons why people would want to override an EDID, even though, for the most part, those reasons have been eliminated by now (thankfully, at last, lol).

    I haven't seen any 3D TVs capable of using dual DVI for stereoscopic 3D.
     

  16. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,730
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    Firmware flash with any new data that is needed.

    I believe there are some projectors and rear-projection TVs that can.
     
  17. TruMutton_200Hz

    TruMutton_200Hz Guest

    Messages:
    2,760
    Likes Received:
    1
    GPU:
    Iris Xe
    ^ +1, I didn't know that yet. :)
     
  18. mcantu

    mcantu Active Member

    Messages:
    84
    Likes Received:
    0
    GPU:
    GTX 560Ti 1 GB
    The manual says that because, when the A650 was released, very few PCs had HDMI ports. Either HDMI or DVI-to-HDMI will work. What is your sound setup? If you're not outputting to a receiver, there really is no need for audio over HDMI.
     
  19. TruMutton_200Hz

    TruMutton_200Hz Guest

    Messages:
    2,760
    Likes Received:
    1
    GPU:
    Iris Xe
    Not unless you want to use the TV's built-in speakers, that is. I use them occasionally for software that plays sound notifications (multimedia converters and instant messaging / e-mail clients, mostly), because my TV has an option to turn off its picture without also turning off its speakers. That way I can also turn off the power-hungry amp of my separate 5.1 surround speaker set, for additional power saving whenever I don't need the higher sound quality for a while.
     
  20. Sever

    Sever Ancient Guru

    Messages:
    4,825
    Likes Received:
    0
    GPU:
    Galaxy 3GB 660TI
    Some receivers can still pass audio through while they're switched off, so you can use one of those to pass audio through to your TV if you want to simplify the cabling.
     
