Forcing refresh rate does not seem to work.

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by PrMinisterGR, May 22, 2020.

  1. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,379
    GPU:
    GTX 1080ti
    source?
     
  2. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    Sadly, I don't have one. It was an unofficial claim by someone close to AMD (the kind who leaks NDA material on forums and writes AMD-worshipping but insanely in-depth articles/reviews, possibly an ex-employee). It was fairly easy for me to believe, since nVidia was probably smart enough to plan ahead and keep the option open and ready for "G-Sync Compatible" (now an actual reality) on the GPU side, and also for FreeSync compatibility on the display module side (I am not sure about this, but I read rumors that recent G-Sync module based monitors support FreeSync with AMD cards, supposedly a firmware-only change, although older displays won't receive the update), while also enjoying the benefit of easy copy-paste work. But, of course, it could be a "tweaked" version (either to force incompatibility or to actually perform better) or something completely different (could it still be polling based?).

    All you need to do is add a 40-120 Hz FreeSync range entry to the EDID extension block. You might need to delete some redundant detailed resolution entries first to make room (the block is a bit crowded, given its size limit), but that's all. I tested this with a Ryzen+Vega notebook, and it seems to behave exactly the same as G-Sync Compatible does with the RTX 2060.
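
    Roughly, that EDID edit boils down to something like the sketch below. This is a minimal illustration only: the file names are made up, the descriptor values are examples, and depending on the monitor the FreeSync range may belong in the CTA extension block rather than the base block, which is exactly the kind of detail a tool like CRU handles for you. It writes a Display Range Limits descriptor advertising a 40-120 Hz vertical range into a free 18-byte descriptor slot and recomputes the block checksum.

    # Minimal sketch, assuming the EDID has been dumped to a binary file
    # (128 bytes per block); file names and parameter values are hypothetical.
    def range_limits_descriptor(vmin_hz=40, vmax_hz=120, hmin_khz=30, hmax_khz=140,
                                max_pclk_mhz=600):
        d = bytearray(18)
        d[0:3] = b"\x00\x00\x00"    # display descriptor marker (not a detailed timing)
        d[3] = 0xFD                 # tag: Display Range Limits
        d[4] = 0x00                 # offset flags
        d[5] = vmin_hz              # minimum vertical refresh (Hz)
        d[6] = vmax_hz              # maximum vertical refresh (Hz)
        d[7] = hmin_khz             # minimum horizontal rate (kHz)
        d[8] = hmax_khz             # maximum horizontal rate (kHz)
        d[9] = max_pclk_mhz // 10   # maximum pixel clock, in 10 MHz units
        d[10] = 0x01                # no extended timing information
        d[11] = 0x0A                # padding: line feed...
        d[12:18] = b"\x20" * 6      # ...then spaces
        return bytes(d)

    def patch_base_block(edid: bytearray) -> bytearray:
        # The four 18-byte descriptor slots of the base block start at offset 54.
        for off in range(54, 126, 18):
            if edid[off] == 0 and edid[off + 1] == 0 and edid[off + 3] == 0x10:
                edid[off:off + 18] = range_limits_descriptor()   # reuse a dummy slot
                break
        else:
            raise RuntimeError("no free descriptor slot; remove a redundant entry first")
        edid[127] = (-sum(edid[:127])) % 256                     # fix the block checksum
        return edid

    with open("monitor.edid", "rb") as f:        # hypothetical dump file name
        edid = bytearray(f.read())
    with open("monitor_40-120.edid", "wb") as f:
        f.write(patch_base_block(edid))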
     
  3. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,379
    GPU:
    GTX 1080ti
    There might be some truth mixed up in there, but i strongly doubt dvi was the encapsulation for original gsync.
     
  4. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    I believe the original G-Sync (let's call it v1 for simplicity) had nothing to do with VESA standards. It was something put together by nVidia (maybe loosely inspired by VESA standards/drafts). We know it used dual-link DVI-D. DisplayPort was less popular, let alone VESA Adaptive Sync. Go back and check the first G-Sync monitors (what kinds of input ports they had and which of those ports supported G-Sync mode). Then AMD continued to popularize DisplayPort (it was love at first sight, they couldn't get enough of it) and also VESA Adaptive Sync via FreeSync (the original one, let's call it v1, which was called into existence as an answer to G-Sync; of course, the VESA standard was originally planned as a power saving method for built-in screens, it just became convenient to use for FreeSync). Then nVidia also started leaning towards DP (mainly for increased bandwidth and for non-G-Sync-related compatibility and feature support [present and future] as a natural evolution, although we could have gone with HDMI or even transitioned to Thunderbolt by now...), and G-Sync monitors started using DP instead of DVI. That's where the shady claim comes in: that G-Sync silently transitioned to VESA Adaptive Sync (while keeping the arbitrary locks in place in the drivers for certified displays featuring the special modules... but not without plans to open the gates if that ever became necessary or beneficial).
     

  5. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,379
    GPU:
    GTX 1080ti
    G-Sync was introduced with Kepler, and everything beyond Fermi had DisplayPort.
    There are no DVI panels that support G-Sync; it has always been DisplayPort.
     
  6. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    Hmm. Looks like you were right. I must have mixed it up with something else (or that was some early demo system, but that's unlikely). It was probably DisplayPort-only all along (sorry for my mistake), but still not standard VESA A-Sync. It was so long ago that I can't remember when nVidia supposedly transitioned G-Sync to VESA A-Sync.
     
  7. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,379
    GPU:
    GTX 1080ti
    Indeed, the early stuff was just VBLANK extension and hold, with DRAM on an FPGA to compare frames for overdrive differences.
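
    To spell out what "VBLANK and hold" means in practice, here is a toy sketch (hypothetical names and timings, nothing like the real module firmware): the display sits in vertical blanking until the next frame arrives, and if the panel's minimum-refresh deadline passes first, the previous frame is scanned out again.

    import queue

    MIN_REFRESH_HZ = 30                 # the panel must still be refreshed this often
    MAX_HOLD_S = 1.0 / MIN_REFRESH_HZ   # longest the VBLANK can be stretched

    def vrr_scanout_loop(frames: "queue.Queue", scan_out):
        last_frame = None
        while True:
            try:
                last_frame = frames.get(timeout=MAX_HOLD_S)   # sit in VBLANK, waiting
            except queue.Empty:
                pass                    # deadline hit: self-refresh with the old frame
            if last_frame is not None:
                scan_out(last_frame)    # end VBLANK and drive the panel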
     
  8. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    What exactly is wrong with LG OLEDs in PC mode? Just the banding you get from running at 8-bit color instead?
     
  9. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    Yes, the prominent banding, which gets really ugly in HDR10 (more processing steps, so inaccuracies accumulate).
    It's not as simple as dropping back to 8-bit, because the 8-bit gradient isn't smooth enough either. Still, 10-bit input is superior overall to native 8-bit input (unless the 8-bit signal is dithered down from 10-bit, because that looks smoother). The inaccuracy is non-linear: the brighter regions look roughly 9-bit smooth (almost as smooth as 10-bit, with some inaccuracies), while near-black is more like ~6-bit, so heavy banding (the panel gamma is non-linear, and I guess the internal processing works in various color spaces, with both linear and non-linear gamma). The banding and inaccuracy can get so big in HDR10 that you see completely invalid colors, like magenta-tinted shades on white (too many red and green steps band together, so the R+G+B mix comes out heavily tinted when the R and G inputs are low). And it's exaggerated by other inaccuracies (LG's color management tends to create a small amount of similar issues in HDR10, but it's much less prominent before you combine it with the effects of PC mode).
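
    As an aside, the "8-bit dithered down from 10-bit looks smoother" point is easy to show numerically. A minimal sketch (assuming NumPy; the ramp and noise values are purely illustrative): truncating a 10-bit ramp to 8 bits produces long runs of identical codes, which read as bands, while adding half an LSB of noise before rounding breaks the bands into fine grain.

    import numpy as np

    grad10 = np.linspace(0, 1023, 3840)                  # smooth 10-bit ramp across a 4K row

    truncated8 = np.floor(grad10 / 4).astype(np.uint8)   # plain 10->8-bit truncation

    rng = np.random.default_rng(0)
    noise = rng.uniform(-0.5, 0.5, grad10.shape)         # +/- half an 8-bit LSB of noise
    dithered8 = np.clip(np.round(grad10 / 4 + noise), 0, 255).astype(np.uint8)

    def mean_run_length(codes):
        # Average length of runs of identical consecutive codes: long runs show as bands.
        changes = np.flatnonzero(np.diff(codes.astype(int)) != 0)
        return len(codes) / (len(changes) + 1)

    print("mean run length, truncated:", mean_run_length(truncated8))   # ~15 px wide bands
    print("mean run length, dithered: ", mean_run_length(dithered8))    # ~1-2 px, reads as grain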
     
  10. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    I mean, if we get full HDMI 2.1 in PC mode, that means 4K120 RGB at 12-bit full range, which makes every other option meaningless, and it would work with everything. That is my eventual target. I can't believe that LG went back to 40 Gbps with the CX; I think it's the first time that, with the exception of a single feature (120 Hz Black Frame Insertion, which I don't particularly care about), their new model is worse than the old one. No full-speed HDMI 2.1, and no DTS. What the actual f*ck.
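
    For reference, the bandwidth arithmetic behind the 40 vs. 48 Gbps argument looks roughly like this (a back-of-the-envelope sketch assuming the standard CTA 4K120 timing of 4400 x 2250 total pixels including blanking, plus FRL's 16b/18b encoding overhead; different blanking and overhead assumptions move the edge cases a little):

    def link_rate_gbps(h_total, v_total, refresh_hz, bits_per_component,
                       components=3, frl_overhead=18 / 16):
        pixel_clock = h_total * v_total * refresh_hz          # pixels per second
        payload = pixel_clock * bits_per_component * components
        return payload * frl_overhead / 1e9                   # on-the-wire bit rate

    for bpc in (8, 10, 12):
        print(f"4K120 RGB {bpc}-bit: ~{link_rate_gbps(4400, 2250, 120, bpc):.1f} Gbps")
    # Roughly 32.1, 40.1 and 48.1 Gbps: 12-bit RGB is exactly what needs the full
    # 48 Gbps link, while 40 Gbps is already marginal for 10-bit at this timing.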

    There is zero benefit, except the "Auto" black level setting.
    Why you would run this panel at 8-bit color is a mystery to me, unless you are using the display as a monitor, which you shouldn't do with an OLED anyway.
    You lose a ton of proper post-processing, and you get more banding and less color volume. It's just there for people wanting to use it as a PC monitor for text.
     
