
The truth about PRE-RENDERING 0?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Tastic, Jul 16, 2012.

  1. Finnen

    Finnen Member

    Messages:
    18
    Likes Received:
    0
    GPU:
    GTX 970 4GB
I did some testing and what RealNC said is absolutely right, thank you. Without "1/2 Refresh Rate" (only normal 60Hz Vsync + a 30 FPS lock in RTSS) there is indeed smaller input lag, but there is always some micro stutter, regardless of how precise a ~30 FPS limit I set. I tested this with a controller in a few games by panning the camera around me at a constant rate. Sometimes I was able to make the whole "circle" fine, but most of the time there were one or two "angles" at which the game would stutter slightly, even though the FPS stayed dead stable. It's a very subtle stutter, some people might not even notice it, but it's there.

However, with "1/2 Refresh Rate" Vsync from NVIDIA Inspector, the micro stutter is removed entirely, but obviously there is higher input lag. It's super smooth, no micro stuttering at all. The -0.01 adjustment does help reduce the input lag, but it still isn't as responsive as without "1/2 Refresh Rate". Without doubt, though, this is the far better option for the visually smoothest 30 FPS. Too bad there's such a big input lag compared to normal VSync, even with the -0.01 adjustment, but oh well, it's a trade-off. At least now I can smoothly play the most demanding games, in which I wouldn't be able to get such good frame pacing at 60 FPS due to the slight fluctuations.
     
  2. RealNC

    RealNC Ancient Guru

    Messages:
    2,508
    Likes Received:
    746
    GPU:
    EVGA GTX 980 Ti FTW
If you ever find a g-sync monitor at a good price, that would basically solve the problem completely. You would get lower input lag than what you currently get even with vsync OFF and a 30FPS cap. And you can cap to anything you want (40FPS, 48FPS, whatever.)
     
  3. janos666

    janos666 Master Guru

    Messages:
    595
    Likes Received:
    28
    GPU:
    MSI GTX1070 SH EK X 8Gb
I thought capping the frame rate anywhere below the refresh rate would starve the buffer/queue. Just below or way below, it's the same: the trick is always to keep the queue from filling up to full length, and a bigger offset is actually better in that regard (but a smaller one is obviously better in other respects, so you want the minimum effective offset, hence -0.00x).
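A toy sketch of that queue argument (an illustrative model with made-up numbers, not a measurement of any real driver): the display consumes one frame per refresh interval, the game produces one per cap interval, and occupancy of the pre-render queue is what turns into extra input lag. A cap just below the refresh rate keeps the queue drained; rendering faster than the refresh rate pins it at full.

```python
def average_queue_depth(refresh_hz, cap_fps, queue_max=3, seconds=60.0):
    consume_t = 1.0 / refresh_hz      # time of the next vblank scan-out
    produce_t = 1.0 / cap_fps         # time the next frame is finished
    queue = 0
    samples = []
    t = 0.0
    while t < seconds:
        if produce_t <= consume_t:    # a frame finishes first
            t = produce_t
            if queue < queue_max:     # when full, the frame isn't queued here
                queue += 1            # (a real driver would block the game)
            produce_t += 1.0 / cap_fps
        else:                         # vblank happens first
            t = consume_t
            if queue > 0:
                queue -= 1            # display consumes one frame
            consume_t += 1.0 / refresh_hz
        samples.append(queue)
    return sum(samples) / len(samples)

# Uncapped and GPU-bound at ~100 FPS on a 60Hz display: queue stays full.
print(average_queue_depth(60.0, 100.0))
# Capped just below the refresh rate: queue stays nearly empty.
print(average_queue_depth(60.0, 59.99))
```

The exact depths don't matter; the point is the gap between the two cases.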

I guess you mean the extra -0.00x cap is still needed when you force the 1/2-Vsync option in the nVidia driver or force the display mode to 30Hz (that obviously makes sense). But I assumed he plans to keep the display at 60Hz and use regular V-sync (not the custom 1/2-Vsync).
Why would you use both the 1/2-Vsync mode from the driver and the additional RTSS limiter with a small offset, rather than simply applying a 30fps cap with RTSS (keeping regular V-sync on with the officially supported ~60Hz refresh rate mode)?

    BTW, I still prefer FastSync to capped V-sync. And yeah, HDMI 2.1 and VRR compatible OLED can't come soon enough ("shut up LG and take my money" style...). :)
     
    Last edited: Jun 18, 2018
  4. Finnen

    Finnen Member

    Messages:
    18
    Likes Received:
    0
    GPU:
    GTX 970 4GB
Because keeping regular V-Sync ON + a 30 FPS cap in RTSS still produces micro stuttering, sadly. But it really IS micro, I actually played like that for years and only noticed it quite recently. If you play with keyboard and mouse, it's possible that you won't actually notice it. It really only shows when you pan your camera around at a constant rate - easily visible if you play with a controller and just slightly tilt the right analog stick to move the camera at one constant rate. Then you will notice that, from time to time, there is a kind of "hiccup", a little, little stutter, even though your FPS stays rock solid. It's not consistent, it just happens sometimes.

If you use 1/2-Vsync from NV Inspector, these stutters do not appear (with or without RTSS), but input lag is much higher. The RTSS cap is used in this case to fight the extremely big input lag that 1/2-Vsync adds. It helps a bit, but the lag is still bigger than with regular V-Sync + a 30 FPS cap. From what I see, it's a trade-off. Want super, extra smooth frames at 30 FPS? Use 1/2-Vsync, but suffer really big input lag (which can be slightly reduced with RTSS). Want good enough frame pacing with lower input lag at 30 FPS? Keep the original V-Sync and only add a 30 FPS cap with RTSS. There is some tiny, tiny bit of micro stuttering, but input lag is smaller and the game feels more responsive.

Exactly as I thought, haha. But I live in Poland; there are very rarely any sales on G-Sync monitors here, and the base price of the cheapest G-Sync monitor (AOC G2460PG) is higher than, I don't know, a GTX 1060 6GB for example. That monitor currently goes for 441 USD here and it's the cheapest option. A GTX 1060 6GB can already be bought for around 350 USD. These damn G-Sync monitors are extremely expensive :(
     

  5. janos666

    janos666 Master Guru

    Messages:
    595
    Likes Received:
    28
    GPU:
    MSI GTX1070 SH EK X 8Gb
    I see. Indeed, I never used an analog controller. And yeah, I never used 30 fps cap at 60Hz for long. I just quickly tested it out of curiosity when this question popped up.

That's why I use FastSync pretty much exclusively nowadays. It's "jittery" as hell compared to V-sync or G/Free-Sync at similar average framerates, but the lag is pretty much as low as it gets with a constant refresh rate (and no tearing).
I couldn't stand the random frame duplications on "pulsing" displays (I was a hardcore plasma fan; I still hold on to my old Pioneer 9G as a relic), but it's not nearly as bad on "sample and hold" displays (every single LCD or OLED screen without [proper] BFI [which is very rare]). So I used V-sync for PDPs and now FastSync with the 2017 LG OLED (there is no BFI option on this model).

    G-sync never was and probably never will be an option for me since I hate the whole LCD tech like a bloody epidemic and I prefer big screen sizes (~50"). My ticket into the FreeSync era could only be HDMI VRR on OLED TVs.
     
    Last edited: Jun 18, 2018
  6. Kidoki

    Kidoki New Member

    Messages:
    1
    Likes Received:
    0
    GPU:
    GTX 970
So, to have a properly smooth 30 FPS lock, should I lock it at 30.000 - 0.01 while using 1/2 VSYNC? Or do I have to take my refresh rate into account when setting this 30 FPS lock? (It's reported as 60.001 on TestUFO.)

edit: Nvm, just saw the other page where this was discussed... So I should lock it to 60.001/2 - 0.01? I saw some people mentioning the -0.007 value too, so should I just keep trying values between 0.007 and 0.015 to find the best spot?
     
    Last edited: Jun 19, 2018
  7. RealNC

    RealNC Ancient Guru

    Messages:
    2,508
    Likes Received:
    746
    GPU:
    EVGA GTX 980 Ti FTW
You need to use your actual refresh rate, so 60.001 in your case (remember to let the test run for at least 3 minutes; yes, it takes that long). If it's really 60.001, then you need to cap to 60.001 / 2 - 0.01, which is 29.99.

You can just use -0.01 and forget about it. The whole 0.007..0.015 range is only there in case you opt for a custom refresh rate instead of an RTSS cap: you can't use any Hz you want in a custom refresh rate, so anything between 0.007Hz and 0.015Hz will work fine.
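The rule of thumb above boils down to one line of arithmetic: measured refresh rate, divided by the vsync ratio, minus ~0.01 FPS. A tiny helper (the function name is made up for illustration; rounding matches what you'd actually type into RTSS):

```python
def rtss_cap(measured_hz, divider=2, offset=0.01):
    """RTSS cap for the given measured refresh rate and vsync divider."""
    return round(measured_hz / divider - offset, 2)

print(rtss_cap(60.001))             # 1/2 refresh vsync -> 29.99, as above
print(rtss_cap(60.001, divider=1))  # full refresh vsync -> 59.99
```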
     
    Kidoki likes this.
  8. Smough

    Smough Member

    Messages:
    48
    Likes Received:
    2
    GPU:
    GTX 1060 3GB
Only on rare occasions should you use the nVidia Inspector frame rate limiter; it's pretty lousy and rarely limits the frame rate properly, except in a few games such as Ghost Recon: Wildlands. Always use RTSS, or simply let Windows 10 fullscreen optimizations work out the frame times for you - generally, they make games smooth. Some games don't respond to them at all, and others stutter because of them, so toggling fullscreen optimizations on the game's .exe is something you should try for any game that is giving you issues. Sometimes RTSS is not needed.

Now, as for 30 fps, my recommendation is 37.50 fps; it's much smoother than PC 30 fps. Download Custom Resolution Utility, go to the "Add" option, at the bottom type "75.000", apply and save the changes. Run "restart64.exe" so the program applies the 75 Hz mode to your monitor. Then set it on the desktop or in-game and use 1/2 refresh rate v-sync. Done. You'll be getting 37.50 fps, which is a world beyond 30 fps on 60 Hz in smoothness.

Now go to "C:\Program Files (x86)\RivaTuner Statistics Server\Profiles", open the profile of any game that you may play at less than 60 fps and do the following: find the section called [Framerate] and below it add this line: LimitDenominator=1000 (this makes the Limit value count in thousandths of an FPS), and leave the "Limit=0" line alone for now.

Then set the frame rate cap to 37.5 fps.
It should look like this:

    [Framerate]
    LimitDenominator=1000
    Limit=37498

    Try the games and see how they turn out.

    Remember: For all of this to work properly, the nVidia scaling must be set to "GPU" and not "Display".
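As a sanity check on the numbers above: with LimitDenominator=1000, the Limit field is the target FPS times 1000. A small sketch that generates the profile fragment (the function name and the 0.002 offset are just chosen here to reproduce the example values in this post):

```python
def rtss_framerate_block(target_fps, offset=0.002):
    """Build the [Framerate] fragment for an RTSS profile file."""
    limit = round((target_fps - offset) * 1000)  # thousandths of an FPS
    return "[Framerate]\nLimitDenominator=1000\nLimit=%d" % limit

print(rtss_framerate_block(37.5))
# [Framerate]
# LimitDenominator=1000
# Limit=37498
```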
     
  9. Finnen

    Finnen Member

    Messages:
    18
    Likes Received:
    0
    GPU:
    GTX 970 4GB
Smough, thanks for the suggestion; it's a good one and I actually tried it before, since I used CRU for some time. Unfortunately, I have a really old monitor (Samsung 2333SW) and it doesn't work at 75Hz - the whole screen becomes blurry. I might try this on my TV, though. Maybe it will make my TV's natural input lag more bearable, since 37.5 FPS should be more responsive.
     
  10. Smough

    Smough Member

    Messages:
    48
    Likes Received:
    2
    GPU:
    GTX 1060 3GB
Sucks that your monitor can't do those 75 Hz; when halved, the result is much more bearable than 30 fps on 60 Hz. Your TV should be able to pull this off if it's modern enough, so try it out. And yes, 37.5 fps is much more responsive, and not only that: thanks to the v-sync it's way smoother than, for example, using RTSS to limit the frame rate to 40 or 50 fps in a 60 Hz mode without v-sync.
     
    Last edited: Jun 20, 2018

  11. Finnen

    Finnen Member

    Messages:
    18
    Likes Received:
    0
    GPU:
    GTX 970 4GB
Yeah, it sucks, but I took your advice and adapted it a bit.

Here's the problem I have with games these days. I don't have a terrible PC, so I usually get 50-60 FPS in games with the right settings (mostly medium with textures on high), except for a few games that are borderline unoptimized. 50+ FPS is good, but these 50-60 fluctuations are annoying since they introduce a lot of stutters. And there are a lot of games where I usually average 52-55-57 FPS - something that, as a 60Hz monitor owner, I call "the worst FPS" (it's still quite high, so it seems like a real waste to chop it down to 30, but on the other hand the stuttering is annoying and I want to improve frame pacing without playing on ultra low).

So I thought to myself... if I can easily get 50+ FPS, what if I actually downclocked my monitor to 50Hz and limited the FPS to "refresh rate - 0.01"? Using CRU I added a new resolution at 50Hz, checked the actual rate with VSyncTester.com (it turned out to be 49.99Hz) and limited my FPS to 49.98 with RTSS. The results? It's wonderful, actually! Since I can keep 50 FPS all the time, all the annoying fluctuations are gone, along with the stuttering. And 50Hz doesn't feel choppy or slow. Also, since the monitor is running at an actual 50Hz, the frame pacing is perfect with the -0.01 tweak. It's actually very smooth! Sure, in theory it has increased latency per frame (20ms vs 16.67ms at 60Hz), but that difference is barely noticeable. It's definitely a lot better than playing on a 60Hz monitor with FPS fluctuating between 50 and 60.

I think it's a great alternative for people who are on a tighter budget and want to improve the smoothness of their games. It seems like the best compromise among all the options available to a poor gamer like me, though obviously it all depends on the hardware, the average FPS and so on. But 50Hz/50FPS gaming is definitely better than 60Hz/30FPS or 60Hz/50-60FPS, at least to me. The only downside is that you rather want your normal desktop apps to run at 60Hz, so you have to manually switch to 50Hz before running the game and then back to 60Hz after you stop playing (and it's advisable to do this in Windows's monitor settings instead of the Nvidia CP, since I noticed that NVCP settings are usually overwritten by games, while Windows settings are respected). Maybe there's an app that sits in the tray and lets you quickly switch refresh rates (edit: there is - HRC, HotKey Resolution Changer, which switches resolutions and refresh rates via hotkeys or a tray context menu).
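The arithmetic behind this 50 Hz experiment, spelled out (purely the numbers from the post above, nothing measured here):

```python
# Per-frame latency at each refresh rate.
for hz in (60.0, 50.0):
    print("%.0f Hz -> %.2f ms per frame" % (hz, 1000.0 / hz))
# 60 Hz -> 16.67 ms per frame
# 50 Hz -> 20.00 ms per frame

# RTSS cap derived from the measured 49.99 Hz mode, using the -0.01 rule.
measured_hz = 49.99                 # from vsynctester.com, as in the post
cap = round(measured_hz - 0.01, 2)
print("RTSS cap: %.2f FPS" % cap)   # 49.98, the value used above
```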
     
    Last edited: Jun 19, 2018
  12. janos666

    janos666 Master Guru

    Messages:
    595
    Likes Received:
    28
    GPU:
    MSI GTX1070 SH EK X 8Gb
A reasonable amount of money can't magically solve all the issues.

Once you've bought a faster video card, you will feel like you are wasting the horsepower and wish to buy a display with higher resolution, higher refresh rate or preferably both (if still possible), and then you have the same quality/performance optimization issue all over again.
My biggest quality issue was aliasing (since MSAA is long dead and 99% of post-process AA solutions are a bad joke --- they are much more efficient at destroying all the sharp details all over the picture than at smoothing the actual jagged edges, which often remain jagged, just a little more blurry). Higher native resolution is much more efficient than super-sampling (it provides additional natural sharpness and detail while efficiently removing most of the aliasing, rather than just preserving the details while removing some aliasing), so I went for higher resolution (2k->4k). But now I have to deal with <60fps again in many cases.

And some games are just awful. They look like a disaster at default settings (I hate the overly popular forceful application of heavy DoF, aggressive motion blur and similar post-process effects --- often undefeatable, but certainly ON by default on PC and usually forced on consoles) and run like crap no matter what (you just won't get stable 60+ fps on any PC unless you drop the quality settings back to the previous century). In most cases, most of the "advanced" settings do pretty much nothing, at least performance-wise on relatively modern systems, because the "chain" has 1 or 2 bottlenecks and the rest is practically irrelevant --- but fiddling with those bottlenecks is often painful, like losing 50% perceived quality to gain 25% performance, so not a good deal unless you absolutely need that performance to make the game playable/bearable.

IMO, it's a total disaster these days. Hardware is ever more expensive and content is designed in the worst ways. For example, you pay much more money for a display with lower inherent motion blur, and then they put aggressive, undefeatable motion blur effects into games. You pay more money for higher resolution displays, but they apply heavy depth of field (and similar) effects to make everything look like a blurry dream. Or they dump a load of film grain / Perlin noise onto the image to make it look like a cinema movie from the '90s (before Hollywood had the kind of high dynamic range / low noise sensors or celluloid films they have today --- back then it wasn't a choice but a necessity, yet they often emulate it today with artificial processing "just because", while you were supposed to buy the high resolution display...). And when you have low fps (partially thanks to the cacophony of questionable post-processing steps), they offer you VRR-capable displays to cope with it. But that's not a real solution: low average fps still yields low motion resolution, because VRR is incompatible with BFI / impulse display driving, and low average fps still means a lower level of "control/aim". VRR is a clever tool, but not the holy grail.

    So, if you are a maximalist and "purist", you simply can't get it right on all fronts. (Of course not *all* is *only* bad but I feel like it's certainly "mixed" a lot.)
     
  13. Finnen

    Finnen Member

    Messages:
    18
    Likes Received:
    0
    GPU:
    GTX 970 4GB
I totally agree, especially about the cost of the hardware. I remember back in the day when I could upgrade my PC without breaking the bank. I usually spent up to ~630-800 PLN (170-220 USD) on a GPU and that lasted for years without causing any serious problems. I used a Radeon HD 4890 for such a long time and paid like 629 PLN for that card. For CPUs I usually picked AMD. Oh boy, the AMD Phenom II X2 550 was SUCH a good performer that lasted sooo long! And it cost like... 60 USD? Those times are long gone and will probably never come back :(

And yeah, the performance in games is absolutely horrible. It's even more problematic since I mostly play open-world games, and they are the ones causing most of the problems, with an insane amount of stuttering. Games like Kingdom Come: Deliverance and Mass Effect: Andromeda reminded me that I would have to shell out an insane amount of money to remove horrible stuttering and FPS drops (upgrading the CPU from Haswell to the latest i5 or i7 means a new MOBO and DDR4 RAM). Sure, these games were badly designed in the first place, but in the end, who cares? It is what it is - you take it or you leave it. Andromeda can still stutter when you're walking around your ship (the Tempest), and Kingdom Come has terrible stuttering on my CPU (i5-4440) when walking around the towns (locking the game to 30 FPS fixes that). I'm really afraid of the next big open-world RPGs like Dragon Age or Cyberpunk 2077 - I don't even know if I'll be able to hold a stable 30 FPS without stuttering anymore. As for quality, don't get me started on AA - I absolutely hate both FXAA and TAA, which make a blurry mess of the screen, especially in Fallout 4 (face details are completely lost with Temporal AA).

And, as you said, adjusting the settings manually isn't as effective as it used to be. Back in the day you could customize the settings, drop one or two of the biggest offenders to High or Medium, and suddenly the game ran so "lightly" and smoothly. Today it feels like my PC is choking: transitions into cutscenes usually cause temporary stutter (probably because of the aforementioned pile of post-processing that gets activated every time you start a conversation with an NPC), and there's always some camera angle where your FPS just drops for no reason whatsoever, causing stuttering. Ehhhhhhh. Life was easier in the CRT era.

All in all, that only makes me happy that I found these forums and applied some of the tweaks here. Tweaks like "1/2 Refresh Rate Vsync" or "50Hz refresh rate in CRU" along with the "-0.01 fps lock" really work miracles for improving the smoothness of games on cheaper hardware. If only I had known earlier that there were methods to improve frame pacing... that would have saved me and my girlfriend a lot of nerves (she was devastated by the stutterfest in Andromeda on her i5-4460/GTX970, and she mostly bought her PC for Bioware games). It's a terrible situation for less knowledgeable people. I noticed recently that I can barely run any game without some .INI tweaks, RTSS caps or VSync changes - otherwise it stutters, has low FPS, or has terrible input lag from forced mouse smoothing. As you said, it's a total disaster these days.
     
  14. janos666

    janos666 Master Guru

    Messages:
    595
    Likes Received:
    28
    GPU:
    MSI GTX1070 SH EK X 8Gb
    Off-topic
Andromeda is a good polypathological test case. It prompted me to do unscheduled hardware upgrades on an unusually liberal budget, and eventually to give up hope of experiencing a game like Bioware used to make (think ME1 and DAO, or even just ME2 and DAI) again in the not too distant future (before everything comes full circle, as things usually do).

Hahh. I think you might be negatively surprised (even coming from a somewhat pessimistic view). Those games could be several years away, and C77 especially looks like something which intends to push the limits on many fronts, PQ included. And I think this will be a general trend for some years: high budget games will push for the best possible PQ (even if it's mostly just "eye candy" rather than actual pure detail/quality, or so-called "artistic value") to starve the low budget (but ambitious) indie games (which seem to be rising in popularity) to death with their 2010-ish graphics (or really horrible PQ/performance).

As an example, the Torment: Tides of Numenera game is great (I enjoyed it almost as much as I enjoyed DAO back in the day, though my standards/expectations have probably risen since DAO was my first RPG experience), but we wouldn't have considered it nice looking even in 2006 (10 years before its release), let alone today. On the other hand, Dreamfall Chapters looked very pretty around its release date but ran terribly, especially on AMD cards.

Yes, a transition from gameplay to realtime rendered cutscenes (not videos) is often accompanied by a (theoretically seamless) on-the-fly change to a bunch of parameters. I think the new habit of changing the render resolution is one of the most offending of them nowadays (in terms of smoothness), but a LOD change can also be both visible (the detail level gradually changing in front of your eyes, sometimes in multiple steps over several seconds) and cause fps drops (a sudden change in parameters can promote a lot of tasks which are normally handled in the background to immediately necessary -- or, more likely, already too late at that moment).
     
    CrazyBaldhead likes this.
  15. Monchis

    Monchis Maha Guru

    Messages:
    1,281
    Likes Received:
    33
    GPU:
    GTX 950
I have not found a game with micro-stuttering fixable with this trick in a while - until the new Wreckfest racing game a couple of days ago: perfect smoothness once I set the pre-rendered frames to 1.
     

  16. janos666

    janos666 Master Guru

    Messages:
    595
    Likes Received:
    28
    GPU:
    MSI GTX1070 SH EK X 8Gb
    Edited: Sorry, needs more testing.
     
    Last edited: Jul 12, 2018
  17. KneehighPark

    KneehighPark New Member

    Messages:
    2
    Likes Received:
    1
    GPU:
    GTX 1050 Ti
The information in this thread has been super useful. I've been playing older games in QHD/UHD on my budget rig hooked up to my TV, and got the RTSS limits figured out. Tests reported my TV's refresh rate as 60.002, so I capped RTSS to 59.990.

    Just had a couple of questions regarding Vsync and MPRF.

    1) Is it best to use Nvidia Control Panel Vsync, or in-game?

2) What should MPRF (Maximum Pre-Rendered Frames) be set to? I tried "1", but I ended up getting more stutters and frame drops, even when my GPU wasn't pegged at 99-100% (playing Arkham City). Does this setting vary on a game-by-game basis?

    3) Most of the time, I am hitting a locked 60, but rarely, I dip below, for a second, if that. Is that "alright"? Or is keeping a constant 60 fps mandatory for all of this to work?
     
    BuildeR2 likes this.
  18. RealNC

    RealNC Ancient Guru

    Messages:
    2,508
    Likes Received:
    746
    GPU:
    EVGA GTX 980 Ti FTW
Shouldn't matter, unless the game has a bad vsync implementation (like the Bethesda RPGs.)

    If you always reach the cap, MPRF shouldn't have made a difference. Are you sure your monitor really is 60.002Hz? Maybe it's a browser bug? Try CRU to see what the EDID timings say.

    When you reach the FPS cap, it works. When you don't, it doesn't.
     
  19. janos666

    janos666 Master Guru

    Messages:
    595
    Likes Received:
    28
    GPU:
    MSI GTX1070 SH EK X 8Gb
I'm sure some games override this parameter. See Frostbite 3 with RenderDevice.RenderAheadLimit: the default value might suggest this is a no-op (Windows default or VGA driver overrides remain in effect), but I think the game still sets it to 2 by default (it can be changed on-the-fly from the developer console).
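For reference, Frostbite console variables like this one can usually also be set persistently via a user.cfg file placed in the game's install folder (the file name and placement follow the usual Frostbite convention, so treat that as an assumption for any particular title):

```
RenderDevice.RenderAheadLimit 1
```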
     
  20. RealNC

    RealNC Ancient Guru

    Messages:
    2,508
    Likes Received:
    746
    GPU:
    EVGA GTX 980 Ti FTW
It could also be that the NVCP setting can decrease it, but not increase it. If a game sets it to 2 but you set it to 1 in the NVCP, it's going to be 1. And if a game sets it to 1 and you set it to 2 in the NVCP, it's still going to be 1.

I don't have this game, but Forza Horizon 3 has an actual in-game setting for this, which you can set to 1, 2 or 3. That would make it rather easy to test whether what I wrote above is true, by running the game at 60Hz with 1/3 vsync (20FPS.) A two-frame difference means 100ms of input lag, which should be easy to tell.
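That hypothesis and the lag arithmetic can be made concrete in a couple of lines (a toy model of the guessed behavior, not an actual driver API; the function name is made up):

```python
def effective_prerender_limit(game_value, nvcp_value):
    """If the NVCP can lower but not raise the game's setting,
    the effective limit is the minimum of the two."""
    return min(game_value, nvcp_value)

print(effective_prerender_limit(2, 1))  # game says 2, NVCP says 1 -> 1
print(effective_prerender_limit(1, 2))  # game says 1, NVCP says 2 -> 1

# Lag arithmetic from the post: at 20 FPS one frame is 50 ms,
# so a two-frame difference is 100 ms.
frame_ms = 1000.0 / 20
print(2 * frame_ms)  # 100.0
```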
     
