A few questions about playing HD material from my PC on my TV.

Discussion in 'The HTPC, HDTV & Ultra High Definition section' started by tonvanrijn, Oct 8, 2007.

  1. tonvanrijn

    tonvanrijn Active Member

    Messages:
    74
    Likes Received:
    0
    GPU:
    MSI GeForce GTX 460 OC
    I have built a PC for use with my Sony Bravia TV, to play HD material and DVD files directly from the hard disk. The video card is an XFX 7600GS and I installed the 163.71 drivers. I use VLC media player as the default player.
    Now I have a few questions, particularly about which resolution to use and about deinterlacing, because everything is rather new for me and a little confusing. I hope you can help me and/or give me some advice.
    1) The TV is HD Ready and can display 720p and 1080i. The PC's video card is connected to the HDMI input of the TV through a DVI to HDMI adapter. In the NVIDIA drivers (163.71) I can choose between 720p and 1080i, but I noticed that when I choose 1080i the screen flickers slightly. Is this normal, or do I have to change other settings? When I choose 720p there is no problem and the screen is perfect.
    2) Apart from the "problem" under 1), I assume that when I want to play HD material in 720p the resolution has to be set to 720p in the drivers. Am I right?
    But what should I do with 1080i material? Is it best to leave the settings on 720p and use deinterlacing?
    Thanks in advance.
    Ton
     
  2. tonvanrijn

    tonvanrijn Active Member

    Messages:
    74
    Likes Received:
    0
    GPU:
    MSI GeForce GTX 460 OC
    Nobody ....?
     
  3. Passion Fruit

    Passion Fruit Guest

    Messages:
    6,017
    Likes Received:
    6
    GPU:
    Gigabyte RTX 3080
    1080i flickers because it's not a progressive format; that's why standard TVs flicker. It's drawing the two halves of the image separately, so to speak. Progressive draws the image all in one go, which is why it doesn't flicker (in layman's terms).

    If you set the drivers to 720p you'll have no problems playing 1080i material.
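    For anyone curious what "two halves drawn separately" means in practice, here is a toy sketch (not any player's actual code; the function names are made up for illustration) of how an interlaced frame splits into two fields, and how simple "weave" deinterlacing puts them back together:

```python
# Toy illustration: an interlaced signal delivers each frame as two fields --
# one holding the even-numbered scanlines, one the odd-numbered lines --
# while a progressive signal delivers every line in a single pass.

def split_into_fields(frame):
    """Split a progressive frame (a list of scanlines) into its two fields."""
    top_field = frame[0::2]      # even-numbered lines
    bottom_field = frame[1::2]   # odd-numbered lines
    return top_field, bottom_field

def weave(top_field, bottom_field):
    """'Weave' deinterlacing: interleave the two fields back into one frame."""
    frame = []
    for top, bottom in zip(top_field, bottom_field):
        frame.append(top)
        frame.append(bottom)
    return frame

frame = [f"line {i}" for i in range(8)]   # stand-in for 8 scanlines
top, bottom = split_into_fields(frame)
assert weave(top, bottom) == frame        # a static image survives perfectly
```

    Weaving is only perfect for static content: when the two fields were captured at different moments in time, motion produces "combing" artifacts, which is why players offer deinterlacing options at all.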
     
  4. tonvanrijn

    tonvanrijn Active Member

    Messages:
    74
    Likes Received:
    0
    GPU:
    MSI GeForce GTX 460 OC
    Thanks Passion Fruit.
    But will I then have to enable "deinterlacing" in VLC media player when playing 1080i?
     

  5. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,509
    Likes Received:
    18,809
    GPU:
    AMD | NVIDIA
    Use 720p; it doubles the frame rate (preventing flicker, as opposed to 30 fps interlaced at 1080i) and is closest to your television's native resolution.

    You could even set your TV to its proper native resolution (1366 x 768), which quite frankly would be optimal.
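    As a rough back-of-the-envelope check (assuming a typical 1366x768 HD Ready panel; the numbers are illustrative, not taken from the drivers), you can see why 720p is the friendlier mode for such a panel:

```python
# Illustrative comparison, assuming a 1366x768 HD Ready panel.
NATIVE_H = 768

for mode, lines, scan in [("720p", 720, "progressive"),
                          ("1080i", 1080, "interlaced")]:
    scale = NATIVE_H / lines   # vertical scaling needed to fit the panel
    print(f"{mode} ({scan}): vertical scale factor {scale:.3f}")

# 720p only needs a slight upscale (~1.067x); 1080i must be deinterlaced
# first and then downscaled (~0.711x), giving the TV's scaler more chances
# to introduce artifacts.
```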
     
  6. tonvanrijn

    tonvanrijn Active Member

    Messages:
    74
    Likes Received:
    0
    GPU:
    MSI GeForce GTX 460 OC
    Thanks Hilbert, but how do I do that?
    When I choose custom resolution in the (163.71) drivers, "create" 1366 x 768 and run the test, it says "error, test failed".
     
  7. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,509
    Likes Received:
    18,809
    GPU:
    AMD | NVIDIA
    Then revert to 720p. Chances are, though, that you'll have some under- or overscan. With the 163.71 and likely higher drivers you can fix that, by the way.
     
  8. Belz

    Belz Master Guru

    Messages:
    552
    Likes Received:
    0
    GPU:
    Sapphire HD5870 1GB
    Uhm... Does VLC support Nvidia PureVideo?

    I would use a player that supports PureVideo, like WinDVD or PowerDVD, since that makes deinterlacing and other functions work much better and with less CPU load.
     
  9. N0sferatU

    N0sferatU Ancient Guru

    Messages:
    1,772
    Likes Received:
    153
    GPU:
    EVGA RTX 3080 Ultra
    Sounds like it's your TV. I had a Sony HDTV Ready RPTV (no ATSC tuner; it did 720p/1080i). At 1080i (1920x1080) it would flicker, but not at 720p (1280x720).

    My current LCD supports 1080p and either resolution works fine. Even if I play 1080i content I don't get any problems with high-motion video (e.g. the streaking you get if it's not deinterlaced). Media Player Classic does the job for me just fine. :)
     
  10. tonvanrijn

    tonvanrijn Active Member

    Messages:
    74
    Likes Received:
    0
    GPU:
    MSI GeForce GTX 460 OC
    Hilbert, just for my better understanding: could it be that 1366x768 isn't possible because I use the HDMI input of my TV, and this input only allows specific resolutions?
     

  11. ogheeren

    ogheeren Master Guru

    Messages:
    244
    Likes Received:
    0
    GPU:
    SLI 8800 GTS 512 760C/1900S/2100R
    No, it is an NVIDIA driver issue. HD playback has flaws with the NVIDIA drivers anyway. Use programs like PowerStrip to override the driver settings. 1080i flickers on NVIDIA drivers because their frame timing is bad. If you use 1080i at 50 or 60 Hz you won't see much flickering as long as frame timing and DDC are working, but with the NVIDIA drivers they aren't. To make it easy, if you don't want to use any complex tools:
    Try setting the resolution to 1280x768; this resolution is supported by most NVIDIA drivers and reported via DDC. If not, manually enter the resolution 1360x768, which should survive the awesome NVIDIA test bull****. If your TV only supports the smaller HD resolution, this should be the easiest way to get acceptable quality. Your computer downscales the 1080i to your resolution. Only activate deinterlacing when you experience tearing.

    @Hagedoorn: "1080i 25/30" is artificial marketing speak. There is no such thing as 1080i with 25 or 30 frames. The stream consists of 50 or 60 fields of alternating lines; real 25/30 frames per second only exist with progressive scan. By the way, if you use an LCD panel you will never see interlacing; it is always progressive. Only the input is interlaced, and it is deinterlaced by the scaler unit.

    Sorry for my English, it is not my native language.
     
  12. tonvanrijn

    tonvanrijn Active Member

    Messages:
    74
    Likes Received:
    0
    GPU:
    MSI GeForce GTX 460 OC
    Thanks, Ogheeren, for your reply.
    Unfortunately 1360x768 also gave me the error message.
    Mind you, I am a complete noob and don't want to sound stubborn, but I still wonder if it isn't the HDMI input. I looked in the Sony manual, and under Specifications it says:
    HDMI In 5: 1080i, 720p, 576p, 576i, 480p, 480i.

    Btw, I don't understand what you mean by "...if you use an LCD panel you will never see interlacing...". With some material I do see interlacing, although my Sony TV is an LCD. Or did I misinterpret what you wrote?
     
  13. Joey

    Joey Guest

    Messages:
    4,144
    Likes Received:
    0
    GPU:
    2600XT + Panasonic S10
    I had HDMI flickering on my TV with a DVI -> HDMI converter. There is an option in the ATI driver menu under "DTV - attributes" for "alternate DVI operation mode"; this stopped the flickering entirely.
    But there is an option right beside it to "reduce DVI frequency on high resolution displays". I have found this messes something up and the TV says "out of range", so maybe don't try that one. It's annoying of ATI to have this on by default when you install a new driver... bastards.
    I'm not sure about the specifics of your flickering, but mine were, for instance, the top half of the screen going black for a split second, and thick horizontal lines as well; not constant. If you can find the equivalent settings in the NVIDIA driver, that might help.
     
  14. N0sferatU

    N0sferatU Ancient Guru

    Messages:
    1,772
    Likes Received:
    153
    GPU:
    EVGA RTX 3080 Ultra
    I don't want to jinx myself but I've had zero issues with HD resolutions and playback on my 42" HD display using a simple $7 DVI-->HDMI cable :p
     
  15. ogheeren

    ogheeren Master Guru

    Messages:
    244
    Likes Received:
    0
    GPU:
    SLI 8800 GTS 512 760C/1900S/2100R
    Those are the specs of the TV, yes, but not the specs of HDMI. Maybe your TV only allows the resolutions you listed, but HDMI itself can carry any resolution within its parameters. It wouldn't be normal for an HDTV to display only those specific HD resolutions, though. Normally at least the standard resolutions are supported, like 800x600, 1024x768, 1280x1024 and 1600x1200, plus widescreen resolutions. I am a development engineer for HDTVs, and I would never ship a TV that isn't capable of these resolutions. Sony apparently doesn't do the same, as far as I know; but then, they also released TVs that only accept 1080i input despite having a 1080p panel, so anything is possible...
    On your second point: no, you never saw interlacing on your TV; you only saw the effects of deinterlacing, or a single interlaced frame stuck in the frame buffer. LCD panels only do progressive scan; that is a technical and physical property of TFT displays. The flickering you see has other causes, some of which I described in my last post.
    I know it can be very frustrating to get per-pixel mapping running on an HDTV. Manufacturers of HDTVs and graphics cards don't take the standards very seriously, which results in poor compatibility. This problem is so bad that 50% of my job consists of testing, writing new firmware, and swearing about this topic. In the style of Bob Marley: "Don't give up the fight."

    @N0sferatU: Yeah, with the right card and the right TV, nothing is easier than connecting a computer to an HDTV over a DVI/HDMI cable. Simple as USB. Same for me at home.
     

  16. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,509
    Likes Received:
    18,809
    GPU:
    AMD | NVIDIA
    That's very possible. I have tested a good number of HTPCs and LCDs/plasmas, and most of them have a problem with exactly that resolution. Usually only a VGA (analog) connection picks up that resolution fine (and it is still bloody brilliant quality).

    I know, it's very confusing. And ogheeren is absolutely right about LCDs being progressive only. The problem, however, is simple. You select 1920x1080... yet your screen is 1366x768. So people are under the illusion that their screen can actually display 1920x1080, which is not true; you are looking at 1366x768 active pixels on your screen, not one more.

    So in the end, pick a resolution as close to your native resolution as possible, and use the drivers to adjust under/overscan until the image fits your screen properly, which works perfectly with the 163.71 ForceWare drivers.

    I so wish that LCDs had full per-pixel support and detection, as that would mean perfect, well, maximized image quality.

    Ogheeren... why do monitors handle this just fine while HDTVs mess it up?
     
  17. Vampiro2000

    Vampiro2000 Member

    Messages:
    27
    Likes Received:
    0
    GPU:
    XFX 8800GTS 320MB@580/1350/850
    I have a similar problem... I use an HTPC with an LG HD Ready LX2R 32" LCD TV. On the desktop the image is full, with no bars whatsoever. But when I watch a movie, say with a ratio of 1.78:1 or 1.85:1, I get black bars on top and bottom! My standalone DVD player only shows those bars when playing movies with a ratio of 2.35:1 or 2.40:1. The other ratios (except 4:3, i.e. 1.33:1) always fill the image completely... My graphics card is an ATI HD2400 Pro. But there is another thing: using Media Player 11 to play an HD IMAX movie encoded with WMV HD, it shows the image completely without those bars... So I don't know what to do....
    Using a DVI-DVI connection; I also tried DVI-HDMI...
    Thanks
     
    Last edited: Oct 11, 2007
  18. ogheeren

    ogheeren Master Guru

    Messages:
    244
    Likes Received:
    0
    GPU:
    SLI 8800 GTS 512 760C/1900S/2100R
    I have the slight impression that there is no problem, but that you are looking at 21:9 material and your DVD player is just cropping the left and right borders of the picture. Or does your content have a wrong aspect ratio? In that case it would be too wide.

    @hilbert hagedoorn:
    That's almost a philosophical question ;-) Well, the main reason many TVs do overscan (or, even worse, cropped underscan) is to avoid pixel errors at the borders of the picture. The next problem is that many source devices deliver bad DDC values, or the values are misinterpreted by the TVs. Another problem is that, as a TV manufacturer, you have to produce a TV that is compatible especially with "dumb" devices that cannot change their offset and other values, like DVD players, receivers, etc. It is difficult to write software that can recognize whether a device is smart and exact or dumb and imprecise, and react in the correct manner. That is why many manufacturers provide a "PC" or "Game mode" setting for aspect ratio, to give users the opportunity to adjust the setting the way they want.
    Lastly, graphics cards often only support clean aspect ratios. 1366x768 is not a clean 16:9 aspect ratio (you can calculate it easily). That is why many cards (or, to be more exact, drivers) forbid this setting. Try to force the resolution with other tools, PowerStrip for example; many people get their per-pixel mapping to work that way. Or buy a good Full HD TV (real Full HD: 1080p!) with a real native 16:9 1920x1080 resolution... just kidding, I know that is not a solution, but it is the reason people with those TVs have fewer problems with per-pixel mapping. Fewer, not none...
    It is a complex topic, and going deeper into it would be boring for all of you, I guess. I'll just try to give some tips; I hope I can help some people.
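    The "you can calculate easily" remark about 1366x768 can indeed be checked in a few lines of Python (the list of modes is just an illustration):

```python
from fractions import Fraction

# Check which common HD mode resolutions are an exact 16:9 ratio.
TARGET = Fraction(16, 9)

for width, height in [(1280, 720), (1920, 1080), (1366, 768), (1360, 768)]:
    ratio = Fraction(width, height)
    verdict = "exact 16:9" if ratio == TARGET else f"off by {float(ratio - TARGET):+.5f}"
    print(f"{width}x{height}: {verdict}")

# An exact 16:9 width at 768 lines would be 768 * 16 / 9 = 1365.33... pixels,
# which is not a whole number -- so no 768-line mode can be exactly 16:9.
```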
     
  19. Vampiro2000

    Vampiro2000 Member

    Messages:
    27
    Likes Received:
    0
    GPU:
    XFX 8800GTS 320MB@580/1350/850
    I don't understand what you are saying... Please be more specific. So, is it an LCD, graphics card, or driver issue?
     
  20. ogheeren

    ogheeren Master Guru

    Messages:
    244
    Likes Received:
    0
    GPU:
    SLI 8800 GTS 512 760C/1900S/2100R
    None of them. Let's take a different approach: are you sure you have 16:9 material, or is it 21:9? Some players crop the left and right borders of the content to get full-screen 16:9. This is similar to cropping a 16:9 video down to 4:3: part of the left and right borders of the picture is cut off so it fits the screen without black bars (or with smaller ones) on the top and bottom. Most players on the computer show you the full picture, so there you would see the black bars on top and bottom. IMAX movies as WMV HD are often encoded as 16:9, so you get a full screen on a 16:9 TV.
    To make it short: is the content that shows black bars on top and bottom really 16:9 content?
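    To make the cropping-versus-letterboxing point concrete, here is a small worked example (the 1366x768 panel size is an assumption, purely for illustration) computing how much letterbox to expect when a player shows the full picture on a 16:9 screen:

```python
SCREEN_W, SCREEN_H = 1366, 768  # assumed HD Ready panel, for illustration

def letterbox_bars(source_aspect, screen_w=SCREEN_W, screen_h=SCREEN_H):
    """Total height in pixels of the black bars when the full picture is shown."""
    picture_h = screen_w / source_aspect   # height the picture needs at full width
    if picture_h >= screen_h:
        return 0                           # picture fills (or exceeds) screen height
    return round(screen_h - picture_h)

for name, aspect in [("16:9", 16 / 9), ("1.85:1", 1.85), ("2.35:1", 2.35)]:
    print(f"{name}: {letterbox_bars(aspect)} px of black bars")
```

    Even 1.85:1 leaves a thin bar unless the player crops slightly, which matches the standalone DVD player behaviour described above.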
     
