Forcing refresh rate does not seem to work.

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by PrMinisterGR, May 22, 2020.

  1. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Hello,

    I have a GTX 1070, and an LG C9. The TV is directly connected to the GPU via a good quality HDMI cable.

    I get all the resolutions and refresh rates that the display exposes; there are no issues except one:

    Games that I run at 1080p always default to 60Hz, even when set to run at the highest available refresh rate. Selecting the refresh rate manually, from either Windows itself or the Nvidia Control Panel, works just fine.

    Any way to make this work? And any idea why it defaults to 60Hz no matter what?

    Windows: Windows 10 1909 18363.836
    Nvidia Driver: 445.87, clean installed with DDU.
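    (Forcing the mode programmatically behaves the same as the manual switch, for what it's worth. A minimal Win32 sketch, in C, assuming the 1080p120 mode is actually exposed, which it is here:)

    Code:
    /* Force 1920x1080 @ 120Hz on the primary display (sketch).
       Assumes the driver/EDID actually exposes that mode.
       Build with: cl force120.c user32.lib */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DEVMODE dm;
        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize             = sizeof(dm);
        dm.dmPelsWidth        = 1920;
        dm.dmPelsHeight       = 1080;
        dm.dmDisplayFrequency = 120;
        dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

        /* CDS_FULLSCREEN = temporary change, the same kind of mode
           switch a fullscreen game performs. */
        if (ChangeDisplaySettings(&dm, CDS_FULLSCREEN) != DISP_CHANGE_SUCCESSFUL) {
            fprintf(stderr, "mode switch refused\n");
            return 1;
        }
        printf("1080p120 set; press Enter to restore.\n");
        getchar();
        ChangeDisplaySettings(NULL, 0); /* back to the registry mode */
        return 0;
    }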
     
  2. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    If you're only trying to hit 120Hz, your best bet is probably to use CRU to create an EDID override that disables all display modes except 120Hz.
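    If you go the CRU route, you can verify which modes actually survived the override with the stock Win32 enumeration (sketch below, in C; restart the driver or reboot first so the new EDID is read):

    Code:
    /* List every display mode Windows sees on the primary display,
       e.g. to verify a CRU EDID override really removed the 60Hz
       modes. Build with: cl listmodes.c user32.lib */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DEVMODE dm;
        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize = sizeof(dm);

        for (DWORD i = 0; EnumDisplaySettings(NULL, i, &dm); ++i)
            printf("%lux%lu @ %lu Hz (%lu bpp)\n",
                   dm.dmPelsWidth, dm.dmPelsHeight,
                   dm.dmDisplayFrequency, dm.dmBitsPerPel);
        return 0;
    }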
     
  3. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Any reason why the force doesn't work?
     
  4. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    With Nvidia drivers, who knows, honestly.
     

  5. anxious_f0x

    anxious_f0x Ancient Guru

    Messages:
    1,907
    Likes Received:
    616
    GPU:
    ASUS TUF RTX 4090
    Not sure if it works the same on newer models, but on my LG B7 I had to rename the input label on the TV to “PC” for games to recognise the higher refresh rate. Some games still needed a custom resolution/refresh rate to work correctly; it was usually games without in-game refresh rate settings that caused me problems.
     
    Maddness likes this.
  6. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    You won't like this, but: replace the GTX 1070 with an RTX 2060. This "highest refresh rate" setting never seemed to work for me either (GTX 1070 + LG C7, C8, C9) until I gained G-Sync (RTX 2060 + C9).
    Of course, the only magic of Turing here is probably synthetic market segmentation. I tried using a passive DP->HDMI adapter (about $4 from AliExpress) between the C9 and both VGA cards (using DP ports on both, obviously) and the result was the same: G-Sync works with Turing only (so much for the capabilities of the "native" HDMI port, which is most probably a DP port in standard HDMI Alt-Mode with an integrated voltage level shifter anyway).
    G-Sync is worth it anyway. I barely paid ~20% extra (going from one used card to another on the second-hand market) and even gained a little speed in some games (maybe 5 or 10 at most, and only in recent games/benchmarks) for significantly less power consumption. And I get to experiment with DLSS 2.0 today, to see how it might perform once I can get an Ampere card with HDMI 2.1 to do some upsampling for 2160p120.

    Hmm. The C7, C8, and C9 don't need PC mode for 1080p120, and the B7 shares the same SoC as the C7. But it could have changed with firmware updates. Or it was a coincidence, and you will still have 1080p120 after removing the PC label.

    PC mode has its uses on certain models, but do mind that it causes decreased precision (visible banding on near-black shades, especially in HDR10; you might even catch some crazy magenta tinting in PC-mode HDR10). I am not sure if the CX fixed this yet (it's still present on the C9, same as it was on the C7).

    But on the B7 you can't set the SDR Game mode up for proper colors, so yes, PC is your best bet for SDR games. For both SDR and HDR movies, and even HDR games (HDR Game mode can be configured for correct colors!), normal TV mode is better (well, HDR gaming is debatable: some will prefer the 4:4:4 sharpness of PC mode, others the higher shading precision of TV mode).
     
    Last edited: May 25, 2020
  7. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    That's a bit weird, honestly. I cannot see why something like that would be restricted to a single GPU family only. Also, you're basically indicating it doesn't work with the Turing card either, as G-Sync would make that setting meaningless.


    A lot of people forget that PC mode on these TVs is terrible, especially with NVIDIA GPUs. The banding is horrible. The ideal setup for them is YCbCr 4:2:2 with 12-bit color, and certainly NO PC mode. You lose nothing (except a tiny bit of precision on really small text, which is irrelevant more than 20cm away from the TV), and you gain exceptional color and gradient handling. It's really no contest.
     
  8. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    nVidia is (most probably) artificially limiting G-Sync Compatible support for LG TVs to the Turing generation (a GTX 1660 Ti or Super should work as well, but I think the RTX 2060 is the better choice: somewhat faster, and you can experiment with DLSS and DX12 Ultimate features).
    Not at all. You are still limited to 1080p120, 1440p120, or 2160p60. I prefer the Windows desktop at 2160p60 and games at 1440p120. Effectively, both benefit from G-Sync in one way or another: ALLM is tied to G-Sync, so you automatically get low latency mode for the Windows desktop as well (unlike with FreeSync on AMD, where it's limited to fullscreen apps for some reason), but it disengages when madVR switches to 2160p23 (so you get smoother gradation for movies, although ALLM is not nearly as bad as PC mode). You also get to decide whether ALLM switches to Game mode or not (it does by default, but you can override it by selecting, say, Cinema while playing a game; it won't auto-switch again once you have switched manually).
    Some games can't (or at least won't let you) switch refresh rates when going fullscreen, only resolutions. The NVCP option is for those games.
    G-Sync with ALLM is very convenient (better than FreeSync, due to how they handle windowed games, and thus the plain Windows desktop, in regards to ALLM). I pretty much never have to touch the TV remote again (except for the power switch, but even that could be automated with third-party software if I really wanted).

    The only missing component is HDMI 2.1 on the VGA card (or a DP->HDMI 2.1 adapter).

    PC mode is bad enough, but it's still the best for SDR games on some older models (which have no properly configurable Game mode), and it's more convenient for gaming in general (even HDR included) than manually switching back and forth between modes with the remote (the C9 + G-Sync + ALLM solves all of this). I often used PC mode on pre-2019 models for the desktop and some games (but kept switching back manually for movies, and I often debated what's better for HDR games).

    By the way, the TV actually prefers (or rather, would prefer) to be fed Full RGB >=10bit (in normal mode, not in PC mode). It's said to convert all kinds of input (YCC, Limited RGB) to Full RGB as a first step, even though it then converts everything, Full RGB input included, to some kind of YCC 4:2:2 internally for certain processing steps. That's a huge legacy issue which they simply aren't willing to part with, not even in this decade: they should have dropped 4:2:2 internal processing while transitioning to the HDR era, let alone after stepping into the 3DLUT era, and especially the HDMI 2.1 and VRR era. 4:4:4 should be a given in any and all modes, at least for low-latency modes, without visible drawbacks.


    This is how I set NVCP for the C9 (sorry for the non-English annotations; they are not relevant here). The refresh rate always switches to 120Hz when the game switches to 1440p (with no option to select the refresh rate). Otherwise it would remain at 60Hz.

    [Image: NVCP resolution settings for the C9]
     
    Last edited: May 26, 2020
    PrMinisterGR likes this.
  9. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Yeah, that was my point about YCbCr 4:2:2: that's what it really uses internally. It's kind of terrible that Nvidia didn't support variable refresh via HDMI on older cards; I can't really believe they couldn't have, either. Also, unlike my ancient 7970, the Nvidia card seems to send the monitor signal to a random port on boot, which is hilarious.
     
  10. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    Same as pretty much every digital HDTV since their inception around 2006 or so. Except a lot has changed over the years, especially with the introduction of the various HDR formats. The most demanding processing steps which can benefit from it still use chroma subsampling internally (though many of these are completely optional, or even forcefully disabled, in operation modes like PC and/or Game and/or VRR, etc.), while a lot of processing (including some quasi-mandatory stuff like color management, especially for HDR content) is now done in Full RGB.
    Many HDTVs and media player boxes of the FullHD SDR era used to have a "true" YCC pipeline (up to the display driver, which was obviously always RGB for panels with RGB sub-pixels), so it made a lot of sense to keep this multi-generation legacy thing around. Nowadays, however, it's a relic that clings on for small cost savings. Most media player boxes use RGB internally, and most TVs use RGB for many steps, so both devices keep converting back and forth between RGB and various YCC-like formats...
    I don't have an official source for this, but as far as I know, the LG Alpha processors (α7 and α9, all generations to date) always convert any and all kinds of input (from HDMI, and possibly from other sources, maybe even the internal video player apps) to Full RGB as the very first step, before anything else is done to the picture. And that means Full RGB is the optimal input format for them, even if various interface bandwidth limits don't allow it, or other formats sometimes look visually better on the screen (like in PC mode, where internal dithering is turned off or tuned to lower quality, so dithered input shows less banding).
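    For reference, that "Full RGB as the very first step" is just the standard matrixing. A minimal sketch of the usual BT.709 limited-range 8bit YCbCr -> full-range RGB conversion (the coefficients follow from Kr=0.2126, Kb=0.0722 and the 16-235 / 16-240 quantization ranges):

    Code:
    /* Standard BT.709 limited-range 8bit YCbCr -> full-range RGB,
       the kind of matrixing a TV SoC applies as its first step. */
    #include <stdio.h>

    static unsigned char clamp8(double v)
    {
        return v < 0 ? 0 : v > 255 ? 255 : (unsigned char)(v + 0.5);
    }

    static void ycc709_to_rgb(int y, int cb, int cr, unsigned char rgb[3])
    {
        double yf = 1.1644 * (y - 16);            /* 255/219 luma scale */
        rgb[0] = clamp8(yf + 1.7927 * (cr - 128));
        rgb[1] = clamp8(yf - 0.2132 * (cb - 128) - 0.5329 * (cr - 128));
        rgb[2] = clamp8(yf + 2.1124 * (cb - 128));
    }

    int main(void)
    {
        unsigned char rgb[3];
        ycc709_to_rgb(180, 128, 128, rgb);        /* a neutral gray */
        printf("R=%u G=%u B=%u\n", rgb[0], rgb[1], rgb[2]);
        return 0;
    }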

    The UEFI Setup screen used to appear on the small DVI LCD connected to my AVR (sitting on a DP port with an HDMI adapter), but after some change I am not sure about, it now appears on the TV (connected to the HDMI port on the VGA card). Maybe it was a VBIOS GOP update, or maybe I use a different DP port for the AVR now (after I disassembled everything for a big room cleaning). I think it's the latter: I guess I used to plug the DP adapter in to the right of the HDMI port, not the left.
     
    PrMinisterGR likes this.

  11. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    The amount of misinformation around this is kind of staggering. After spending a lot of time experimenting with settings, and reading literally thousands of pages of forums and tests, I just ended up using YCbCr 4:2:2 12-bit from the GPU and setting the TV input to Game Console.
    I literally lose nothing, I don't need the useless RGB HDR that Windows defaults to, and the TV doesn't lose any of its processing goodies.
    It also looks better to the naked eye.
     
  12. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    I also use YCC 4:2:2 12bit for the C9 and RTX 2060 with HDMI 2.0 ... but ...
    - if I had a VGA card with HDMI 2.1, then I would always feed the TV Full RGB 12bit (based on the available info about the inner workings of the TV SoC, plus the fact that RGB is the native format of the GPU: YCC output is converted and dithered); see the quick bandwidth math below
    - if the TV's PC mode didn't have any quality drawbacks, then I would definitely keep the TV in PC mode at all times (to benefit from the extra sharpness of 4:4:4 chroma)
    Maybe one day we will have both (HDMI 2.1 full bandwidth and a non-gimped PC mode, hopefully in the form of standardized 4:4:4 processing for any and all operation modes).
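    A quick sanity check on the HDMI 2.0 numbers (assumed figures from the public specs: ~14.4 Gbit/s effective payload after 8b/10b encoding, the 594 MHz CTA pixel clock for 2160p60, and HDMI's fixed 24bit container for 4:2:2 at any bit depth up to 12):

    Code:
    /* Back-of-the-envelope HDMI 2.0 bandwidth check for 2160p60.
       Figures are assumptions taken from the public specs. */
    #include <stdio.h>

    int main(void)
    {
        const double limit = 14.4;   /* HDMI 2.0 payload, Gbit/s      */
        const double pclk  = 0.594;  /* 2160p60 pixel clock, Gpixel/s */

        /* bits per pixel on the wire */
        const int rgb8   = 24;  /* RGB/4:4:4, 8bit per channel         */
        const int rgb12  = 36;  /* RGB/4:4:4, 12bit per channel        */
        const int ycc422 = 24;  /* 4:2:2 at 8/10/12bit: same container */

        printf("RGB 8bit    : %.1f Gbit/s\n", pclk * rgb8);   /* 14.3, fits  */
        printf("RGB 12bit   : %.1f Gbit/s\n", pclk * rgb12);  /* 21.4, no go */
        printf("4:2:2 12bit : %.1f Gbit/s\n", pclk * ycc422); /* 14.3, fits  */
        printf("2.0 limit   : %.1f Gbit/s\n", limit);
        return 0;
    }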
     
  13. anxious_f0x

    anxious_f0x Ancient Guru

    Messages:
    1,907
    Likes Received:
    616
    GPU:
    ASUS TUF RTX 4090
    Yeah, I don’t exclusively use PC mode. At the time I was just testing out the 120Hz mode at 1080p, and for some reason PC mode was all that worked; it may have been fixed in later firmware versions.

    Usually I just run 4K 60Hz with SDR. HDR on PC is a bit of a mess; I wish there were just one universal implementation that devs could use. Consoles have nailed it, Windows not so much.
     
  14. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,730
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    My guess is because you quit Jedi Academy before your basic training ended.
    Yoda doesn't know yet; you might still be able to save face if you sneak back in.
     
    anxious_f0x and PrMinisterGR like this.
  15. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,730
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    Forgot to add...

    I have the very same problem, with 1440p only working at 60Hz on my Samsung Q9 with my 1080 Ti, no matter whether I set the desktop refresh to 120Hz.
    I even tried an EDID override, which didn't help.
    But my reason could be different to yours: I am passing through a Denon AVR. (It supports ALLM, Auto Low Latency Mode, which is actually pretty good, apart from the 60Hz issue.)
    When I connect directly to the TV, max refresh works properly.
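    One thing worth checking is whether the Denon substitutes its own EDID instead of forwarding the TV's. Windows caches the last EDID it saw per monitor in the registry; below is a sketch, in C, to dump them (the path used is the usual location, but treat it as an assumption, and note the key also keeps monitors that are no longer connected):

    Code:
    /* Dump the EDIDs Windows has cached for every monitor entry,
       e.g. to see whether an AVR in the chain forwards the TV's
       EDID or substitutes its own. Assumed registry layout:
       HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY\<mon>\<inst>\Device Parameters\EDID
       Build with: cl dumpedid.c advapi32.lib */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        HKEY hDisplay;
        if (RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                "SYSTEM\\CurrentControlSet\\Enum\\DISPLAY",
                0, KEY_READ, &hDisplay) != ERROR_SUCCESS)
            return 1;

        char mon[256];
        for (DWORD i = 0;; ++i) {
            DWORD monLen = sizeof(mon);
            if (RegEnumKeyExA(hDisplay, i, mon, &monLen,
                              NULL, NULL, NULL, NULL) != ERROR_SUCCESS)
                break;

            HKEY hMon;
            if (RegOpenKeyExA(hDisplay, mon, 0, KEY_READ, &hMon) != ERROR_SUCCESS)
                continue;

            char inst[256];
            for (DWORD j = 0;; ++j) {
                DWORD instLen = sizeof(inst);
                if (RegEnumKeyExA(hMon, j, inst, &instLen,
                                  NULL, NULL, NULL, NULL) != ERROR_SUCCESS)
                    break;

                char sub[512];
                snprintf(sub, sizeof(sub), "%s\\Device Parameters", inst);

                HKEY hParams;
                if (RegOpenKeyExA(hMon, sub, 0, KEY_READ, &hParams) != ERROR_SUCCESS)
                    continue;

                BYTE edid[512];
                DWORD len = sizeof(edid);
                if (RegQueryValueExA(hParams, "EDID", NULL, NULL,
                                     edid, &len) == ERROR_SUCCESS) {
                    printf("%s\\%s: %lu-byte EDID\n", mon, inst, len);
                    for (DWORD k = 0; k < len; ++k)
                        printf("%02X%s", edid[k], (k % 16 == 15) ? "\n" : " ");
                    printf("\n");
                }
                RegCloseKey(hParams);
            }
            RegCloseKey(hMon);
        }
        RegCloseKey(hDisplay);
        return 0;
    }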
     

  16. Shadowdane

    Shadowdane Maha Guru

    Messages:
    1,464
    Likes Received:
    91
    GPU:
    Nvidia RTX 4080 FE
    VRR over HDMI is only supported on HDMI 2.1, which isn't on any video cards yet! VESA Adaptive-Sync is a DisplayPort 1.2a standard; it's not technically supported on HDMI connections.

    AMD was the one who took the Adaptive-Sync standard and got it working on HDMI, but that implementation is proprietary to AMD. So there really isn't any way for Nvidia to just support it; the same goes for actual G-Sync screens.
     
  17. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Turing does.
     
  18. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,036
    Likes Received:
    7,378
    GPU:
    GTX 1080ti
    Turing is HDMI 2.0, and VRR is optional on 2.0, as it is on 2.1.
     
  19. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    There are a thousand layers of smoke around that topic, colored with intended and unintended, semi-official and unofficial mis/information.

    1: someone claimed they started reverse-engineering AMD's proprietary FreeSync over HDMI<2.1 and found it's not a copy-paste of VESA Adaptive-Sync from DisplayPort<1.4a (it uses the CEC channel for some additional two-way communication beyond the initial handshake, etc.)

    2: someone claimed that they looked into the actual standards (the non-free documents) and HDMI 2.1 VRR is significantly different from VESA Adaptive Sync (a very good question why the hell they felt the need to reinvent the f'n wheel instead of aiming for easy work and wide compatibility but...)

    3: LG says HDMI 2.1 VRR is the only adaptive sync tech their 2019 televisions support. However they also mention G-Sync Compatible (2019 and '20) and FreeSync (2020 only) certifications.

    4: nVidia claims they only support HDMI 2.1 VRR over HDMI (at least for displays which do not have a G-Sync module, especially not the latest revision of the module with the latest firmware)

    5: some claim that the latest versions of the G-Sync module support some kind of G-Sync over HDMI<2.1, and that it also works with AMD cards (over HDMI), effectively making those screens compatible with AMD FreeSync (the original) but not with HDMI 2.1 VRR (at least that's not what they say).

    6: AMD doesn't indicate they support HDMI 2.1 VRR with their current cards and drivers

    7: The new G-Sync modules are said to use an exact (not customized) VESA Adaptive-Sync implementation over DisplayPort; nothing fancy, nothing proprietary. (The original modules used a completely unique and proprietary 1ms-polling-based mode over dual-link DVI-D, but that's almost archaic now: most monitors and GeForce cards don't even have real DVI ports anymore, G-Sync won't work over active adapters, and DP ports have no dual-link DVI-D Alt-Mode, only single-link HDMI Alt-Mode for passive adapters, so let's not complicate the current situation with that old case.)


    Now, if you start putting all of those together, and you have also tried running your LG C9 with FreeSync after a simple one-liner EDID override trick, you will probably wonder how all of the above could possibly be true in the same universe at the same time, without phase-shifting aliens morphing into our realm to spread some well-intended lies in order to generate a profit. And I probably left out some of those (potentially even more contradictory) "facts".
    But most people are sheep who actually enjoy blindly accepting whatever they are told by their rulers, so these things are rarely discussed this way. (And otherwise it's indeed useless: even if you proved that any and all VRR is factually the same, all manufacturers would still be completely free to limit support in arbitrary ways, so it doesn't really matter.)

    Although, all in all, I am obviously happy that, at the end of the decade, my C9 can do both G-Sync and FreeSync.
    The only thing I wasn't glad about is the forced GTX 1070 -> RTX 2060 upgrade, which is 99.9% a purely arbitrary decision from nVidia. But even that was fairly cheap, and I don't regret it since I learned about the upcoming DX12 Ultimate (released around now with Win10 20H1), just in case I get stuck with this card for a really long time for whatever (maybe financial) reason.
     
    Last edited: May 28, 2020
  20. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    OK, so you've managed to get both FreeSync and G-Sync working with your C9 and a Turing card? How? Any links/tutorials?
     
