Run 1080i on old TV

Discussion in 'The HTPC, HDTV & Ultra High Definition section' started by Stevethegreat, Feb 18, 2016.

  1. Stevethegreat

    Stevethegreat Guest

    Messages:
    42
    Likes Received:
    2
    GPU:
    eVGA Geforce 8800 GTX (630/1000)
    Hello, I've found an old TV ("HD-ready") that only accepts component input (no HDMI).

    Anyway, using old parts I've managed to build an underpowered HTPC to drive that TV. The only problem is that the resolution is locked at 720p. Through the Nvidia Control Panel I've managed to enable 1080i, but the image quality gets much worse and most of the image falls outside the TV's frame.

    Now, I know the TV supports 1080i (it says so in the manual), and the HDMI-to-component adapter supports 1080i too, so why is the result so atrocious?

    I'm going to use the TV for light desktop computing and (mostly) videos/movies. Nothing too demanding (no gaming). 720p makes it useless for desktop use though; everything's too big.

    Any idea why 1080i misbehaves? I've read about this issue quite a lot on the net, but most people didn't know why it happens. Others were able to output 1080i with no problem.

    My graphics card is a GTX 750 Ti; it's the only relatively good part of that HTPC, so I'd imagine it shouldn't be the culprit.

    Any ideas?

    Thanks.
     
  2. Extraordinary

    Extraordinary Guest

    Messages:
    19,558
    Likes Received:
    1,638
    GPU:
    ROG Strix 1080 OC
    Have you set the scaling option in NVCP?

     
  3. Stevethegreat

    Stevethegreat Guest

    Messages:
    42
    Likes Received:
    2
    GPU:
    eVGA Geforce 8800 GTX (630/1000)
    It can fix the image being outside the frame, but the quality is very bad too. 1080i for static images should be as good as 1080p, yet I'm getting something worse than 720p...
     
  4. Anarion

    Anarion Ancient Guru

    Messages:
    13,599
    Likes Received:
    387
    GPU:
    GeForce RTX 3060 Ti
    Of course it's worse, since the display's native resolution is likely 1366x768. It's almost always better to use 720p instead. On those old TVs, downscaling interlaced content gives rather bad results.
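A quick back-of-the-envelope sketch of what the TV's internal scaler has to do (1366x768 is an assumption here; it's just a very common "HD-ready" native resolution, so check your set's actual panel spec):

```python
# Scaling factors a 1366x768 panel must apply to each input mode.
# (1366x768 native is an ASSUMPTION -- verify against your TV's manual.)
NATIVE_W, NATIVE_H = 1366, 768

def scale_factors(src_w, src_h):
    """Per-axis factor needed to map a source mode onto the panel."""
    return NATIVE_W / src_w, NATIVE_H / src_h

# 720p: a mild ~1.07x upscale in both axes -- relatively benign.
print(scale_factors(1280, 720))    # roughly (1.067, 1.067)

# 1080i: the set must first deinterlace, then downscale 1080 lines to
# 768 (a non-integer ~0.71x factor), discarding detail -- one reason it
# can end up looking worse than plain 720p.
print(scale_factors(1920, 1080))   # roughly (0.711, 0.711)
```

The non-integer down-ratio is the key: every output line blends fractions of neighboring source lines, on top of whatever softness the deinterlacer already introduced.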
     

  5. EspHack

    EspHack Ancient Guru

    Messages:
    2,799
    Likes Received:
    188
    GPU:
    ATI/HD5770/1GB
    Manufacturers and their marketing BS... the TV most likely "supports" 1080i only in the sense that it accepts the signal and displays it downscaled to its native resolution, which is likely 1366x768 as Anarion said.
     
  6. Jw_Leonhart

    Jw_Leonhart Ancient Guru

    Messages:
    4,178
    Likes Received:
    0
    GPU:
    EVGA GeforceGTX 770 SC
    I can confirm that your problem is the TV's native resolution of 1366x768. I had the same issue when trying to play movies on my TV: I kept setting a higher resolution, only to find out after a lot of searching that even though the TV accepted 1080i, its native resolution was 1366x768.

    It sucks, but that's the way it is; those older HDTVs are almost worthless for getting a clear desktop picture. If you use a monitor and just play videos on the TV, it should look fine, but web browsing and such probably won't be so great.
     
  7. TimmyP

    TimmyP Guest

    Messages:
    1,398
    Likes Received:
    250
    GPU:
    RTX 3070
    i = interlaced, 30 Hz.
    1080i sends half the lines at a time: each field is 540 lines, and two alternating fields interleave into one 1080-line frame.

    The display is a 720p set (p = progressive, non-interlaced: all 720 lines every refresh), though it's probably really a "768p" panel (1280x768 native; make sure you find this out. I've seen Windows report 720 as native and set it as such, but the physical resolution is 1280x768, so create a custom resolution.)
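The field arithmetic above can be sketched as a toy model (pure illustration, not tied to any real video API):

```python
# Toy model of interlacing: each 1080i "field" carries alternate scan
# lines, and two fields woven together make one full 1080-line frame.
FRAME_LINES = 1080
FIELD_LINES = FRAME_LINES // 2   # 540 lines per field

def split_fields(frame):
    # Even-numbered lines form the top field, odd-numbered the bottom.
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    # Interleave the two fields back into a progressive frame.
    frame = [None] * (len(top) + len(bottom))
    frame[0::2] = top
    frame[1::2] = bottom
    return frame

frame = list(range(FRAME_LINES))   # stand-in for 1080 scan lines
top, bottom = split_fields(frame)
assert len(top) == len(bottom) == FIELD_LINES
assert weave(top, bottom) == frame  # a static image weaves back perfectly
# With motion, the two fields are captured at different instants, so the
# TV's deinterlacer must interpolate between them -- hence the combing
# and softness people see on 1080i material.
```

This is why a static 1080i desktop can in principle look as sharp as 1080p, but only on a true 1080-line panel; on a 768-line panel the woven frame still has to be downscaled afterwards.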
     
    Last edited: Feb 26, 2016
