G-Sync + In-game V-Sync OR Control Panel V-sync?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by BlindBison, Oct 14, 2020.

  1. RealNC

    RealNC Ancient Guru

    Messages:
    5,093
    Likes Received:
    3,376
    GPU:
    4070 Ti Super
    @tsunami231 You can look them up here:

    https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/

    If it's "G-SYNC Compatible" then it doesn't have a g-sync module. The "G-SYNC Ultimate" thing is just some BS about HDR minimum nits or something that turned out to not matter in the end because nvidia relaxed that standard IIRC to the point of making it a useless label. So you need to actually look up monitor reviews to find out if a monitor is truly HDR or not.
     
  2. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,750
    Likes Received:
    1,868
    GPU:
    EVGA 1070Ti Black
    Basically none of the monitors I'm looking at actually have G-Sync modules; they're really just VRR monitors, or "FreeSync" as AMD likes to call it. Sooner or later I need to get off my 1080p monitor and switch to a 1440p one, so I'd be more likely to use 1440p on my UHDTV, seeing as it supports it. I despise resolution switching in games when going from monitor to TV, so I just stick with 1080p.

    HDR is still crap on monitors, mostly because it's crap in Windows. And frankly, my UHDTV supports HDR but it doesn't actually get bright enough to matter, seeing as ~345 nits is about the brightest it can do. For the most part I still don't like HDR. An example of why: FF7R with HDR on. Stand right in front of the doorway of Aerith's balcony in her house looking outside, and everything is beyond washed out; as soon as you step out the door, it's correct. The same happens with some dark scenes: stand outside the door and you can't see anything indoors in the dark, but as soon as you walk in, the darks are correct and you can see things in the room. It can look amazing, but brights and darks are broken in those situations. And the UHDTV is calibrated for HDR as best as it can be, limited as it is.

    I look at RTINGS for the most part for reviews, plus a few other places, then from those I pick what I'm interested in pursuing and check places like Reddit for user reviews. I know a bunch of people here don't like RTINGS.
     
    Last edited: Jun 15, 2021
  3. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    Technically, everything significantly above 100 nits is HDR (400 nits is definitely HDR, even if not as good as 700+). And in practice, HDR10 brings you 10-bit instead of 8-bit at any peak brightness (even if it's only 100 nits), even though 10-bit SDR would also be technically possible; it's just not used in practice (besides that Alien: Isolation game, which did "Deep Color").
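    To put rough numbers on the 8-bit vs 10-bit difference, here's a quick back-of-the-envelope sketch in Python. The peak-brightness values are illustrative assumptions, and real HDR10 distributes steps along a non-linear PQ curve rather than linearly:

        # How many code values per channel each bit depth gives, and the
        # resulting average step size across an assumed peak brightness.
        # Note: real HDR10 uses the (non-linear) PQ curve, not linear steps.
        for bits in (8, 10):
            levels = 2 ** bits                   # 256 for 8-bit, 1024 for 10-bit
            for peak_nits in (100, 400):         # assumed SDR-ish and HDR-ish peaks
                step = peak_nits / (levels - 1)  # naive linear step size
                print(f"{bits}-bit @ {peak_nits} nits: {levels} levels, ~{step:.3f} nits/step")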
     
  4. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,750
    Likes Received:
    1,868
    GPU:
    EVGA 1070Ti Black
    How many monitors/displays actually do 10-bit these days? Most I've seen are still 8-bit + FRC, and I still see 6-bit + FRC stuff, or whatever it's called. And of those, how many are actually 12-bit, or does that have nothing to do with HDR?
     

  5. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,930
    Likes Received:
    1,044
    GPU:
    RTX 4090
    An 8+FRC monitor is "10 bit" to the s/w and the GPU.
    It's an 8-bit LCD panel with temporal "dithering" performed on the incoming 10-bit signal by the monitor h/w.
    And for like 99% of high-refresh-rate use cases you won't be able to see any difference in quality between 8+FRC and true 10-bit panels.
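    A minimal sketch of the idea in Python, purely illustrative (not how any particular monitor's firmware does it): approximate a 10-bit level by alternating between the two nearest 8-bit levels across frames, so the time-averaged output lands in between.

        # FRC sketch: emulate a 10-bit code value on an 8-bit panel by showing
        # the two nearest 8-bit levels in a ratio matching the fractional part.
        def frc_frames(value_10bit: int, num_frames: int = 4) -> list[int]:
            low = value_10bit >> 2        # nearest 8-bit level below (10-bit / 4)
            frac = value_10bit & 0b11     # remainder: 0..3 quarters of a step
            # Show the higher level on `frac` out of every 4 frames.
            return [min(low + 1, 255) if i < frac else low for i in range(num_frames)]

        frames = frc_frames(513)          # a 10-bit value between 8-bit 128 and 129
        print(frames, "avg:", sum(frames) / len(frames))  # [129, 128, 128, 128] avg 128.25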
     
    Mineria likes this.
  6. ThermaL1102

    ThermaL1102 Active Member

    Messages:
    87
    Likes Received:
    28
    GPU:
    KFA2 1070 OC
    Ah yes, I see what you mean, but just saying: the actual point of G-Sync was... to not be using V-Sync anymore, it's why it was created in the first place...
    All I do is set 240 fps in the limiter (that's new now); for the rest, V-Sync off in global, and I haven't had bad experiences since...
    Maybe I've had some luck with that of course, but I've never had the need to set any V-Sync, anywhere.
    I understand that some people may need it for the type of games they play, and maybe I don't for my games.
    Everybody's experience is different, but I remember why they made G-Sync; people forget a lot of things.
    I'm on a 240 Hz monitor now, so G-Sync without V-Sync.
    There's just no need for it with that high of a range; it's just more work for both the CPU and GPU.
    I've always looked at it that way, and games just feel better for me like this...
    It's like being under water when I turn on V-Sync.
     
    Last edited: Jun 24, 2021
  7. Nastya

    Nastya Member Guru

    Messages:
    185
    Likes Received:
    86
    GPU:
    GB 4090 Gaming OC
    It would be pretty cool if someone started a per-game compatibility list with recommendations re: Control Panel V-Sync vs in-game V-Sync, plus any other issues that may occur with a certain game.
     
    Kamil950, Smough and BlindBison like this.
  8. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    But did you try the G-Sync + V-Sync + limiter case (where the limiter is set 3 fps below your max refresh rate)?
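    (For anyone following along: the "-3" is the commonly cited minimum from the BlurBusters G-SYNC 101 guide, so frametime variance can't push you past the G-Sync range and into actual V-Sync behavior. A trivial Python sketch of the math:)

        # Cap the framerate a few fps below max refresh so frametime spikes
        # can't push frames out of the G-Sync range into V-Sync territory.
        def gsync_fps_cap(refresh_hz: int, offset: int = 3) -> int:
            return refresh_hz - offset

        for hz in (60, 120, 144, 240):
            print(f"{hz} Hz -> cap at {gsync_fps_cap(hz)} fps")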
     
  9. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    As an aside from what we're talking about regarding G-Sync: traditional V-Sync will feel like you're moving through molasses unless you implement it "properly" like they do on consoles.

    This entails setting a framerate limit, and often limiting the max pre-rendered frames value as well, from what I gather. You also need "half refresh" V-Sync for 30 fps on a traditional 60 Hz panel.

    BlurBusters has a low-lag V-Sync guide which I'd highly recommend. If you aren't capping the framerate in conjunction with V-Sync then you get a lot of input latency, yes.
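    A bare-bones Python sketch of the kind of frame cap those guides describe, assuming a hybrid sleep + busy-wait pacer (the ~2 ms slack and the 59.97 cap are placeholder assumptions, not the guide's exact numbers):

        import time

        # Pace each frame to a fixed interval: sleep for most of the wait
        # (sleep is coarse), then busy-wait the last ~2 ms for precision.
        def run_capped(render_frame, fps_cap: float = 59.97) -> None:
            interval = 1.0 / fps_cap
            deadline = time.perf_counter()
            while True:
                render_frame()
                deadline += interval
                remaining = deadline - time.perf_counter()
                if remaining > 0.002:
                    time.sleep(remaining - 0.002)
                while time.perf_counter() < deadline:
                    pass  # busy-wait to hit the deadline precisely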
     
  10. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    What does this do exactly? Thanks
     
