DSR only showing 1080i

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by b0uncyfr0, Feb 9, 2017.

  1. b0uncyfr0

    b0uncyfr0 Master Guru

    Messages:
    241
    Likes Received:
    14
    GPU:
    Gaming XXX 1070
    I just acquired a new 1070 and tried to play some juicy downsampled 4K games on my TV, only to find out I'm receiving 1080i for all downsampled resolutions. Why is this?

    It's working fine on my second monitor connected to the PC. My TV is natively 720p, but I don't remember ever reading that this was a problem.

    Am I missing something?
     
  2. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    Your TV and the HDMI EDID standards are the cause.
    Because they maintain compatibility with video timings (film, television, etc.), they **** up any real PC signal.

    Use Custom Resolution Utility (CRU) and edit the extension block to remove the HDTV resolutions. Remove the 50Hz resolutions if you want, and remove the 29.97Hz resolutions too. You can probably leave the 23Hz resolutions.

    Also edit the HDMI support data block and untick the YCbCr checkboxes; everything is done internally in RGB anyway. So if your intention was to, say, watch film, it wouldn't do you any good to begin with unless your TV doesn't support RGB.


    Use the included restart64.exe to reset the driver, or restart your computer, and then try it.

    My 720p TV does not work with DSR correctly unless all the HDMI stuff is stripped out of the EDID so it is seen as a DVI monitor.
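    If you want to see exactly what you'd be removing before you touch anything, here's a rough Python sketch that lists the short video descriptors (the HDTV modes) in a CEA-861 extension block. It assumes you've exported the raw EDID to a file called edid.bin (CRU can save one); the VIC names below are just a few common entries from the CEA-861 table, not the full list.

    [code]
    # Rough sketch: list the CEA-861 short video descriptors (SVDs) in an EDID dump.
    VIC_NAMES = {
        4: "1280x720p @ 60Hz",
        5: "1920x1080i @ 60Hz",
        16: "1920x1080p @ 60Hz",
        19: "1280x720p @ 50Hz",
        20: "1920x1080i @ 50Hz",
        31: "1920x1080p @ 50Hz",
        32: "1920x1080p @ 24Hz",
        34: "1920x1080p @ 30Hz",
    }

    with open("edid.bin", "rb") as f:
        edid = f.read()

    ext_count = edid[126]                      # extension blocks after the 128-byte base
    for i in range(ext_count):
        block = edid[128 * (i + 1): 128 * (i + 2)]
        if block[0] != 0x02:                   # 0x02 = CEA-861 extension tag
            continue
        dtd_offset = block[2]                  # data block collection runs from byte 4 to here
        pos = 4
        while pos < dtd_offset:
            tag = block[pos] >> 5              # top 3 bits = data block tag code
            length = block[pos] & 0x1F         # bottom 5 bits = payload length
            if tag == 2:                       # tag code 2 = Video Data Block
                for svd in block[pos + 1: pos + 1 + length]:
                    vic = svd & 0x7F           # bit 7 marks a "native" mode for VICs 1-64
                    native = " (native)" if svd & 0x80 else ""
                    print(f"VIC {vic}: {VIC_NAMES.get(vic, 'see CEA-861 table')}{native}")
            pos += 1 + length
    [/code]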
     
    Last edited: Feb 10, 2017
  3. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,730
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    I have 3 display devices.
    When all 3 are enabled and 2 or more are mirrored, DSR fails to work properly and prevents the NVIDIA control panel from opening once DSR is set up.
    i.e. click the NVIDIA control panel icon, nothing happens. Reboot, same.
    DSR has bugs for sure.
    It's quite a pita for me.
     
    Last edited: Feb 10, 2017
  4. Keesberenburg

    Keesberenburg Master Guru

    Messages:
    886
    Likes Received:
    45
    GPU:
    EVGA GTX 980 TI sc
    I have a TV too, but it shows 1920x1080i and 1920x1080p, and the same for the DSR resolutions.
     

  5. b0uncyfr0

    b0uncyfr0 Master Guru

    Messages:
    241
    Likes Received:
    14
    GPU:
    Gaming XXX 1070
    Awesome, this worked; I now have DSR at least showing some resolutions. But the maximum is still only 1440p. I thought I could use 4K downsampled even on a 720p TV?
     
    Last edited: Feb 10, 2017
  6. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    At 4x scaling, if your TV is actually 768p like it should be, your maximum will be 2732x1536.

    4K is advertised for use on 1080p TVs because 4x DSR is 2x2 the native resolution: 1920x2 = 3840, and on a 768p panel, 1366x2 = 2732.

    You can, however, get close to 4K by manually editing the registry, or by using DSRTool.
    http://forums.guru3d.com/showpost.php?p=5121671&postcount=3088
    With a 768p TV, a multiplier close to 7.84 will get you near 3840x2160, but not exactly, because the aspect ratios are different.
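    The math spelled out, as a quick sketch (DSR factors scale the total pixel count, so each axis scales by the square root of the factor; the even-pixel rounding is my assumption about how the driver snaps the values, and dsr_resolution is just an illustrative name):

    [code]
    import math

    def dsr_resolution(native_w, native_h, factor):
        scale = math.sqrt(factor)             # per-axis scale from the pixel-count factor
        w = round(native_w * scale / 2) * 2   # snap to even values (assumption)
        h = round(native_h * scale / 2) * 2
        return w, h

    for factor in (4.0, 7.84):
        print(factor, dsr_resolution(1366, 768, factor))
    # 4.0  -> (2732, 1536), the 4x maximum mentioned above
    # 7.84 -> (3824, 2150), close to but not exactly 3840x2160
    [/code]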
     
  7. Mda400

    Mda400 Maha Guru

    Messages:
    1,090
    Likes Received:
    201
    GPU:
    4070Ti 3GHz/24GHz
    That isn't universal, though, since I don't experience this with DSR on my TV.

    Yes, let's freely remove things without understanding the repercussions, should you happen to play content that relies on them.

    GPUs with HDMI/DisplayPort connectors are able to transmit the YCbCr color space when playing content that uses that compressed color space. So no, not everything is done internally in RGB. Again, don't remove things without understanding how they benefit the image quality of certain content (where the fewest conversion steps are preferred).


    Seems the going trend with display setting issues is to wipe everything out with CRU, end of discussion.


    Under Adjust Desktop Size and Position, do you have GPU or Display scaling selected, and Aspect Ratio, Full-screen, or No scaling as the method? GPU scaling will scale all resolutions to what it thinks your TV's native resolution is.

    Display scaling/No scaling is recommended, as it leaves the image adjustment to the display itself (what it's designed to do; otherwise you pass the image through two scalers, the GPU's and your TV's) and can possibly enable DSR for other resolutions.

    Also make sure Desktop resize mode in the Size tab (under Adjust Desktop Size and Position) is set to Do Not Report.
     
    Last edited: Feb 14, 2017
  8. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    I totally understand things.

    HDTV EDID standards completely mess up PC signals. You remove them and it breaks nothing; it only gives you the proper signalling a PC expects. Video timings serve no purpose on a PC. You can completely remove them and still use a custom 23.976Hz resolution when playing back that content, and the TV will still process it properly.
    And then you have none of the issues that come from having HDTV EDID information present, which only serves to break things.

    A lot of games get messed up trying to use the correct resolutions when HDTV timings are present.

    You are lying if you've never seen a game try to boot up in 1080p60 RGB only to have it switch to a 1080p TV 16-235 23-29Hz mode instead.
    Or, if you try to use a resolution LOWER than 1080p, the same happens with 60Hz resolutions: it tries to scale it automatically to some wacky 1080p framebuffer in 16-235.
    I've seen this happen on both HDMI monitors and several TVs.
    Even monitor review sites have this issue with a lot of monitors.

    Just look at this list of bull**** scaled resolutions, even on DP:
    [image: list of scaled resolutions]
    CRU has been a godsend and I've never had a single issue using it.

    50Hz is useless unless you are planning on viewing 25Hz video content. If you do, then keep it.

    I've had several TVs mess up DSR. I had one 768p TV that, when using DSR, thought it was a native 1080p set, went into 4:2:2 16-235 mode, and used 4K as its DSR resolution. That looked incorrect compared to properly running that high a ratio with DSR from the native 768p using DSRTool.


    And YES, ALL processing done by the video card internally is DONE IN RGB.
    Everything is converted to RGB before output. There's ZERO point in outputting YCbCr unless your TV doesn't like RGB.

    http://forum.doom9.org/showthread.php?p=1271418#post1271418
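    For context, the conversion in question is the standard limited-range BT.709 YCbCr to RGB math; a generic sketch of it (nothing Nvidia-specific, just the usual broadcast constants):

    [code]
    # Standard limited-range BT.709 YCbCr -> RGB (8-bit), generic broadcast math.
    def ycbcr709_to_rgb(y, cb, cr):
        c, d, e = y - 16, cb - 128, cr - 128
        r = 1.1644 * c + 1.7927 * e
        g = 1.1644 * c - 0.2133 * d - 0.5329 * e
        b = 1.1644 * c + 2.1124 * d
        return tuple(max(0, min(255, round(v))) for v in (r, g, b))  # clamp to 0-255

    print(ycbcr709_to_rgb(16, 128, 128))   # video black (16)  -> (0, 0, 0)
    print(ycbcr709_to_rgb(235, 128, 128))  # video white (235) -> (255, 255, 255)
    [/code]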



    The connectors are plenty capable, but they aren't the issue. It's the EDID video timing standards, video cards, and drivers in combination.
     
    Last edited: Feb 15, 2017
  9. Supertribble

    Supertribble Master Guru

    Messages:
    978
    Likes Received:
    174
    GPU:
    Noctua 3070/3080 FE
    I'm using a 1360x768 display at the moment (my 1080p set broke) and I can get 2880x1620 using a custom resolution. Strangely, 1440p doesn't work and neither does 4K, but for some odd reason 2880x1620 works perfectly.
     
  10. Mda400

    Mda400 Maha Guru

    Messages:
    1,090
    Likes Received:
    201
    GPU:
    4070Ti 3GHz/24GHz
    It breaks audio over HDMI (effectively turning it into a DVI connection) and you get stutter with content that expects those timings to be present.

    Because now you are having the GPU do the scaling work to the display's native resolution. The display cannot process what it doesn't have in its EDID. CRU (or using Moninfo to extract the EDID and Phoenix EDID Designer to install that file over the Generic PnP Monitor) acts as an override. By doing so, you are disregarding what the display actually supports, but you cannot actually remove anything from the display itself.


    Like Crysis (which can be fixed by alt-tabbing until it picks up the desktop's display settings)? That is the fault of the application and its developers not coding it properly to detect these features of the display. This behavior has been going on since I had my GTX 480, and across a few HDTVs since (LG 32LD450, LG 32LN5300, Samsung UN40J5200, and now LG 55UH6150).

    The display is providing information about how it scales the image fastest and cleanest by telling the graphics card what dimensional framebuffer to put the image into. If the display has a 16:9 aspect ratio and you send it a 16:10 resolution such as 1680x1050, the fastest way for the display to show it is to put the 16:10 image into a 16:9 framebuffer that the display can show with little to no scaling involved (see the sketch below).
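    A worked example of that fit, as a sketch (the min-scale pillarbox rule is the usual aspect-preserving fit; fit_into is just an illustrative name):

    [code]
    # Fit a source image into a fixed framebuffer while preserving aspect ratio.
    def fit_into(src_w, src_h, dst_w, dst_h):
        scale = min(dst_w / src_w, dst_h / src_h)
        out_w, out_h = round(src_w * scale), round(src_h * scale)
        bars_x = (dst_w - out_w) // 2   # pillarbox bar width per side
        bars_y = (dst_h - out_h) // 2   # letterbox bar height per side
        return out_w, out_h, bars_x, bars_y

    # 16:10 source into a 16:9 framebuffer:
    print(fit_into(1680, 1050, 1920, 1080))  # -> (1728, 1080, 96, 0)
    [/code]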

    Since DP supports video material options like YCbCr and ATSC resolutions like 720p and 1080p, it has to conform to these features.


    Probably because you view nothing that relies on it (or you think you haven't). That doesn't mean it should be the same for everyone else.

    In other words, it's not useless.

    I would believe that DSR is at fault here, not the display, as these types of displays have been around for a while and DSR is only two years old, built on what should have been decades' worth of image adjustment experience.

    This next bit is my opinion (remember, it all is unless you choose to agree), but downscaling is messier than simply supersampling. Of course they provide the same spatial resolution in the end, but the steps they take to get there are different. Downscaling messes with the UI of an application (4K on a 1080p display means a 4x smaller UI on a 4x smaller resolution). Supersampling just multiplies the pixels 2x, 4x, etc. around the original image, which leaves the UI untouched.
    But I know, they say downsampling is for when it's not possible to supersample, so that's your choice.


    That link is from before Nvidia had produced an option to set the color space in its control panel. Even though they have had HDMI video cards since long before, they neglected support for its capabilities until they offered contrast level and color space toggles (which were possible before that, but only through the registry).

    GPUs do output YCbCr. Otherwise, how would my 4K LG 55UH6150 be able to display 4K 60fps with the 'UHD Deep Color' option disabled? (That option is essentially what lets the display run its processing engine to use all 18Gbps of its HDMI 2.0 interface solely for 4K, 60fps, 4:4:4 RGB; if you use audio along with it, the display will flicker from the lack of bandwidth.) Seventh-generation consoles even output YCbCr. The rough bandwidth math is sketched below.
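    As a back-of-envelope sketch (the 4400x2250 total timing for 3840x2160 and the 10-bits-per-8-bit-symbol TMDS overhead are the standard CTA-861/TMDS figures as I understand them):

    [code]
    # TMDS bandwidth for 4K60: total timing 4400x2250, 8b/10b encoding, 3 data channels.
    H_TOTAL, V_TOTAL, REFRESH = 4400, 2250, 60

    pixel_clock = H_TOTAL * V_TOTAL * REFRESH             # 594 MHz for RGB / 4:4:4
    tmds_rate = pixel_clock * 10 * 3                      # bits per second on the wire

    print(f"4:4:4 8bpc: {tmds_rate / 1e9:.2f} Gbps")      # ~17.82 Gbps, just under 18 Gbps
    print(f"4:2:0 8bpc: {tmds_rate / 2 / 1e9:.2f} Gbps")  # 4:2:0 halves the pixel clock
    [/code]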

    I agree that the cables/connection aren't the issue and never said they were. Of the three you stated, it's the drivers. Standards are standards for a reason; they should be followed. Hardware is made from a standard (you know, hard meaning firm). There is no standard for the software; you can do pretty much whatever you want to the software to make the hardware behave in a certain way, but will it always do that?

    Simply put, it's mostly the software (or, more commonly, the user). Think of the display as the piece your device should revolve around; it's what causes you to choose a certain graphics card, after all.
     
    Last edited: Feb 20, 2017
