G-Sync + In-game V-Sync OR Control Panel V-sync?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by BlindBison, Oct 14, 2020.

  1. tsunami231

    tsunami231 Ancient Guru

    Messages:
    12,119
    Likes Received:
    929
    GPU:
    EVGA 1070Ti Black
    I just read this and came out confused about this whole topic.

    G-Sync certified and G-Sync Compatible are not the same? So does that mean if I want a G-Sync monitor, I don't want a G-Sync Compatible monitor? Most of the monitors I've looked at in my price range are "Compatible".

    About the only thing I do get is that V-Sync needs to be on for G-Sync to work?

    Frankly, even when I do get one I should probably still cap my FPS to 60, because I don't want the extra power draw and heat from trying to push 120+ FPS.
     
  2. tty8k

    tty8k Master Guru

    Messages:
    555
    Likes Received:
    144
    GPU:
    Ati 5850
    There are hardware G-Sync monitors (those that have the hardware module in them), which are more expensive, roughly 200-300 more.
    There are G-Sync Compatible ones (FreeSync monitors) without the hardware module.

    I have both and there is a difference, as in the hardware G-Sync one is slightly better, but just slightly, and I wouldn't pay the extra for it.
    I did a few years ago, when FreeSync monitors were trash.
    Plus, if you get a hardware G-Sync monitor you're basically stuck with an NVIDIA card (it's not compatible with AMD FreeSync).

    I heard NVIDIA is working on a new G-Sync module for monitors that will be compatible with FreeSync (AMD) cards, but until that happens, buying a hardware G-Sync monitor is a bit of a waste, unless money is no object.
     
  3. tsunami231

    tsunami231 Ancient Guru

    Messages:
    12,119
    Likes Received:
    929
    GPU:
    EVGA 1070Ti Black
    Well, I've only ever owned NVIDIA cards, so the AMD argument is moot for me.

    So how do I know which ones have a "module" and which don't? Almost all of them are marketed as "G-Sync" or "G-Sync Compatible", and the rest are marketed as FreeSync or variable refresh or whatever; it's all confusing at this point. Every time I find a monitor I like, it's either IPS or "curved".

    The only IPS I'd consider is R-IPS/Nano-IPS or whatever it's called, and I will never buy a curved monitor. I still prefer VA over normal IPS and would rather steer away from TN; while response times are still among the best with TN, its blacks really aren't any better than IPS.

    I've been sorta kinda looking at

    This one was ruined by the fact that it is curved.

    All to be revisited when I actually buy one.
     
  4. RealNC

    RealNC Ancient Guru

    Messages:
    3,580
    Likes Received:
    1,749
    GPU:
    EVGA GTX 980 Ti FTW
    @tsunami231 You can look them up here:

    https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/

    If it's "G-SYNC Compatible" then it doesn't have a g-sync module. The "G-SYNC Ultimate" thing is just some BS about HDR minimum nits or something that turned out to not matter in the end because nvidia relaxed that standard IIRC to the point of making it a useless label. So you need to actually look up monitor reviews to find out if a monitor is truly HDR or not.
     

  5. tsunami231

    tsunami231 Ancient Guru

    Messages:
    12,119
    Likes Received:
    929
    GPU:
    EVGA 1070Ti Black
    Basically, none of the monitors I'm looking at actually have G-Sync modules; they're really just VRR monitors, or FreeSync as AMD likes to call it? Sooner or later I need to get off the 1080p monitor and switch to a 1440p one, so I'm more likely to use 1440p on the UHDTV too, seeing as it supports it. I despise resolution switching in games when going from monitor to TV, so I just use 1080p still.

    HDR is still crap on monitors, mostly because it's crap in Windows. And frankly, my UHDTV supports HDR but it doesn't actually get bright enough to matter, seeing as about 345 nits is the brightest it can do. For the most part I still don't like HDR. An example of why: in FF7R with HDR on, stand right before the doorway of Aerith's balcony door in her house looking outside, and everything is beyond washed out, but as soon as you step out the door it's correct. The same happens with some dark scenes: stand outside a door and you can't see anything inside in the dark, but as soon as you walk in, the dark is rendered correctly and you can see things in the room. It can look amazing, but brights and darks are messed up in those situations. And the UHDTV is calibrated for HDR as best it can be, limited as it is.

    I look at RTINGS for the most part for reviews, plus a few other places, and from those I pick what I'm interested in pursuing and check places like Reddit for user reviews. I know a bunch of people here don't like RTINGS.
     
    Last edited: Jun 15, 2021
  6. janos666

    janos666 Maha Guru

    Messages:
    1,042
    Likes Received:
    172
    GPU:
    MSI RTX3080 10Gb
    Technically, everything significantly above 100 nits is HDR (400 nits is definitely HDR, even if not as good as 700+). And in practice, HDR10 brings you 10-bit instead of 8-bit at any peak brightness (even if it's only 100 nits). 10-bit SDR would also be technically possible, but it's not used in practice (besides that Alien: Isolation game, which did "Deep Color").
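    To put the 8-bit vs 10-bit point in rough numbers (a minimal sketch, not from the post above; it assumes evenly spaced steps for simplicity, whereas real HDR10 uses the non-linear PQ transfer curve):

        # More bits means more code values between black and peak brightness,
        # so the average step between adjacent levels shrinks by a factor of 4.
        peak_nits = 400  # example peak brightness

        for bits in (8, 10):
            levels = 2 ** bits               # 256 for 8-bit, 1024 for 10-bit
            step = peak_nits / (levels - 1)  # average nits per code value
            print(f"{bits}-bit: {levels} levels, ~{step:.2f} nits per step on average")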
     
  7. tsunami231

    tsunami231 Ancient Guru

    Messages:
    12,119
    Likes Received:
    929
    GPU:
    EVGA 1070Ti Black
    How many monitors/displays actually do 10-bit these days? Most I've seen are still 8-bit + FRC, and I still see 6-bit + FRC stuff or whatever it's called. And of those, how many are actually 12-bit, or does that have nothing to do with HDR?
     
  8. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,056
    Likes Received:
    418
    GPU:
    RTX 3080
    An 8+FRC monitor is "10-bit" to the software and the GPU.
    It's an 8-bit LCD panel with temporal "dithering" performed on the incoming 10-bit signal by the monitor hardware.
    And for like 99% of high-refresh-rate use cases you won't be able to see any difference in quality between 8+FRC and true 10-bit panels.
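    A minimal sketch of the temporal dithering idea described above (illustrative only, not how any particular monitor implements FRC): a 10-bit level is approximated on an 8-bit panel by alternating between the two nearest 8-bit levels across successive frames, so the time-average lands on the intended value.

        def frc_frames(value_10bit, num_frames=4):
            """Approximate a 10-bit level (0-1023) on an 8-bit panel (0-255)
            by alternating between the two nearest 8-bit levels over time."""
            low = value_10bit // 4        # nearest 8-bit level below
            remainder = value_10bit % 4   # leftover fraction to dither (0-3)
            # Show the higher level on 'remainder' out of every 4 frames.
            return [min(low + 1, 255) if i < remainder else low
                    for i in range(num_frames)]

        frames = frc_frames(514)              # sits between 8-bit levels 128 and 129
        print(frames)                         # [129, 129, 128, 128]
        print(sum(frames) / len(frames) * 4)  # time-average in 10-bit terms: 514.0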
     
    Mineria likes this.
  9. ThermaL1102

    ThermaL1102 Member

    Messages:
    29
    Likes Received:
    11
    GPU:
    KFA2 1070 OC
    Ah yes, I see what you mean, but just saying, the actual point of G-Sync was... to not have to use V-Sync anymore; that's why it was created in the first place...
    All I do now is set 240 FPS in the limiter, with V-Sync off globally for the rest, and I haven't had bad experiences since...
    Maybe I've had some luck with that, of course, but I've never had the need to set any V-Sync anywhere.
    I understand that some people may need it for the type of games they play, and maybe I don't for my games.
    Everybody's experience is different, but I remember why they made G-Sync; people forget a lot of things.
    I'm on a 240 Hz monitor now, so G-Sync without V-Sync;
    there's just no need for it with that high of a range, it's just more work for both the CPU and GPU.
    I've always looked at it that way, and games just feel better for me like this...
    It's like being underwater when I turn on V-Sync.
     
    Last edited: Jun 24, 2021
  10. Nastya

    Nastya Member Guru

    Messages:
    148
    Likes Received:
    35
    GPU:
    RTX 2080 Ti 11GB
    It would be pretty cool if we started a compatibility list of games with recommendations re: Control Panel V-Sync vs in-game V-Sync, plus any other issues that may occur with a certain game.
     
    BlindBison likes this.

  11. janos666

    janos666 Maha Guru

    Messages:
    1,042
    Likes Received:
    172
    GPU:
    MSI RTX3080 10Gb
    But did you try the G-Sync + V-Sync + limiter case (where the limiter is set 3 FPS below your max refresh rate)?
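    For context, a trivial illustration of the cap being described (the common G-Sync recommendation of limiting the framerate a few FPS below the refresh rate so it never hits the V-Sync ceiling; the -3 offset is just the rule of thumb from the post, not an exact requirement):

        # Cap a few FPS below the panel's max refresh so G-Sync stays engaged.
        for refresh_hz in (60, 144, 240):
            cap = refresh_hz - 3
            print(f"{refresh_hz} Hz panel -> cap at {cap} FPS "
                  f"(~{1000 / cap:.2f} ms per frame)")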
     
  12. BlindBison

    BlindBison Master Guru

    Messages:
    862
    Likes Received:
    180
    GPU:
    RTX 2080 Super
    As an aside from what we're talking about regarding G-Sync, traditional V-Sync will feel like you're moving through molasses unless you implement it "properly" like they do on consoles.

    This entails setting a framerate limit, and often limiting the max pre-rendered frames value as well, from what I gather. You also need "half refresh" V-Sync for 30 FPS on a traditional 60 Hz panel.

    Blur Busters has a low-lag V-Sync guide which I'd highly recommend. If you aren't capping the framerate in conjunction with V-Sync, then you get a lot of input latency, yes.
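    As a rough illustration of the framerate-cap part of that advice (a minimal sketch in Python, purely illustrative; real limiters live in the game engine or driver, and the Blur Busters guide covers the actual settings): each frame is held until its time slot has elapsed, so frames are paced evenly instead of queuing up behind V-Sync and adding latency.

        import time

        def run_capped(render_frame, target_fps=60, num_frames=10):
            """Very simple frame limiter: render, then wait out the rest of
            the frame's time budget so frames don't pile up ahead of V-Sync."""
            frame_budget = 1.0 / target_fps
            next_deadline = time.perf_counter()
            for _ in range(num_frames):
                render_frame()
                next_deadline += frame_budget
                sleep_for = next_deadline - time.perf_counter()
                if sleep_for > 0:
                    time.sleep(sleep_for)  # real limiters busy-wait for precision

        run_capped(lambda: None, target_fps=60, num_frames=5)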
     
