
The truth about PRE-RENDERING 0?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Tastic, Jul 16, 2012.

  1. tweakpower

    tweakpower Banned

    Messages:
    932
    Likes Received:
    0
    GPU:
    MSI HD 6770
    It does not matter what game you play, as long as you have a high refresh rate. Other than that, you are right, it's not easy to catch :D. Also, when you get some insane FPS in some games, like 1000, there is still no tearing, so that's good enough for me ;).

    Well, the tearing problem first appeared on CRTs, and that's logical. But as long as you have a high refresh rate, it is not a problem; it shouldn't matter whether it's an LCD or a CRT.
     
  2. visine

    visine Master Guru

    Messages:
    441
    Likes Received:
    1
    GPU:
    Gigabyte GTX 970 G1
    So can someone clear stuff up? Is it best to have 1 if you want as little "flickering" as possible? I haven't really bothered to change anything in the nvidia control panel except to change the texture quality from "quality" to "high quality".
     
  3. rewt

    rewt Maha Guru

    Messages:
    1,245
    Likes Received:
    0
    GPU:
    What is left to clear up, and what is this "flickering" you speak of?

    BTW, there is next to zero visual difference between Q and HQ. High Quality mode was introduced for purists long ago when older hardware had some inherent flaws in texture filtering.[/offtopic]
     
    Last edited: Jul 26, 2012
  4. -Tj-

    -Tj- Ancient Guru

    Messages:
    16,254
    Likes Received:
    1,400
    GPU:
    Zotac GTX980Ti OC
    ^
    It's frame tearing.



    Depends on the game and the frame rate cap; sometimes 1-2 (both look the same), sometimes the default 3.

    For example, in COD4 (125 fps cap) I see it less with 3 frames vs. 1-2, but then there is this slight lag, although it's barely noticeable at a higher mouse DPI.
     
    Last edited: Jul 26, 2012

  5. IcE

    IcE Don Snow Staff Member

    Messages:
    10,693
    Likes Received:
    73
    GPU:
    Zotac GTX 1070 Mini
    I think what he means is what is the "best" render ahead setting. Application controlled, 2, 1, etc.
     
  6. visine

    visine Master Guru

    Messages:
    441
    Likes Received:
    1
    GPU:
    Gigabyte GTX 970 G1
    That's pretty much what I meant, yes. I usually just keep it simple and leave it at the default. That's what I do with pretty much all the settings in the control panel. Anyway, will setting pre-render to "1" cause less tearing?
     
  7. Sajittarius

    Sajittarius Master Guru

    Messages:
    335
    Likes Received:
    15
    GPU:
    2080ti Windforce
    Some people are just more sensitive to the lag, lol. The EVO fighting game tournament (Street Fighter etc.) chooses its monitors based on their lag. Also, if you practice that humanbenchmark test, you get better at predicting (which is what happens to people who play the same game regularly).

    Which is another thing: say the monitor has an input lag of 10-15 ms and you add ~7 ms for the pre-rendered frame; that pushes you over 16.7 ms, which is 1 frame @60fps. A lot of people would notice a frame of input lag. I definitely would. One of my friends plays Street Fighter 4 on a 60" HDTV and the lag doesn't bother him, but a bunch of us hate the thing, lol.
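    The arithmetic above can be sketched as a quick back-of-the-envelope check (the millisecond figures are the poster's estimates, not measurements):

```python
# Rough input-lag budget at 60 fps. The 10-15 ms display lag and
# ~7 ms per pre-rendered frame are the poster's ballpark figures,
# not measured values.
FRAME_TIME_MS = 1000 / 60              # ~16.7 ms per frame at 60 fps

display_lag_ms = 12                    # monitor's own lag (claimed 10-15 ms)
prerender_lag_ms = 7                   # one extra queued frame (claimed ~7 ms)

total_ms = display_lag_ms + prerender_lag_ms
print(f"total: {total_ms} ms = {total_ms / FRAME_TIME_MS:.2f} frames @60fps")
# With these numbers the total exceeds one full frame time, which is
# the point where the poster says sensitive players start to notice.
```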
     
  8. rewt

    rewt Maha Guru

    Messages:
    1,245
    Likes Received:
    0
    GPU:
    Yes, no, maybe. Tearing is a function of frame rate and refresh rate, and pre-rendering can have an effect on frame rate.

    Tearing (or "flickering") is not caused by pre-rendering. Tearing is caused by the display updating the screen out of sync with the video card.

    There is no "best" setting in my opinion. The value I prefer often differs by application.
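    A minimal sketch of that point: in this toy model, only the frame rate and the refresh rate decide whether a present lands mid-scanout; the pre-render queue depth never enters the calculation.

```python
def count_tears(fps, refresh_hz, frames=120):
    """Toy model of tearing with vsync off: frame n is presented at
    t = n/fps, and the display starts a new scanout every 1/refresh_hz.
    A present that lands mid-scanout splits the image: a tear.
    Integer math (n*refresh_hz % fps) avoids float rounding."""
    tears = 0
    for n in range(frames):
        # t = n/fps falls exactly on a refresh boundary only when
        # n*refresh_hz is divisible by fps; any other present tears.
        if (n * refresh_hz) % fps != 0:
            tears += 1
    return tears

# When presents line up with refresh boundaries there are no tears;
# at a mismatched frame rate, nearly every frame tears.
print(count_tears(fps=60, refresh_hz=60))    # 0
print(count_tears(fps=73, refresh_hz=60))    # 118
```

    The model ignores where on the screen the tear line appears, but it shows why changing the pre-render limit only moves tearing around indirectly, by changing the achieved frame rate.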
     
    Last edited: Jul 26, 2012
  9. visine

    visine Master Guru

    Messages:
    441
    Likes Received:
    1
    GPU:
    Gigabyte GTX 970 G1
    I usually play at 120hz in every single game, and most of the time I have vsync off. I have also never bothered to touch the settings in the control panel. I tried the new FXAA option, but in games the text can get "blurry" or uglier than usual. But will a pre-render value of "1" reduce input lag more than, for example, 2, 3 or the default?
     
  10. Prophet

    Prophet Master Guru

    Messages:
    800
    Likes Received:
    7
    GPU:
    Msi 680 Gtx Twin Frozr
    Yes. You can try smaa instead of fxaa. http://mrhaandi.blogspot.nl/p/injectsmaa.html
     

  11. ManuelG

    ManuelG NVIDIA Rep

    Messages:
    603
    Likes Received:
    42
    GPU:
    Geforce RTX 2080 FE
    Just want to follow up on my previous comment. The pre-render limit was previously supported on XP only. We changed the NVIDIA Control Panel setting so that it starts at 1 rather than 0, but a setting of 1 should behave the same as 0 did before. It was removed because it was redundant.
     
  12. rewt

    rewt Maha Guru

    Messages:
    1,245
    Likes Received:
    0
    GPU:
    I always told people that XP does not suffer as much input lag. However, your driver team has removed zero, which was indeed a valid setting for Direct3D. That's what some people are complaining about.

    But it doesn't behave the same. A setting of zero behaved more like tools such as D3D Antilag, flushing the render queue every frame. This is not some 5-millisecond difference that people who think they're superhuman claim to detect; this is something that can be proven and documented (using fps/performance graphs, for example, which I will leave to the complainers to provide if it behooves them).

    Thanks for stopping by.
     
    Last edited: Jul 28, 2012
  13. visine

    visine Master Guru

    Messages:
    441
    Likes Received:
    1
    GPU:
    Gigabyte GTX 970 G1


    So can you explain the difference between 1, 2 and 3, and what the setting does? Will it reduce input lag, and will it decrease the FPS? What does it really do?
     
  14. rewt

    rewt Maha Guru

    Messages:
    1,245
    Likes Received:
    0
    GPU:
    Lower values help decrease input lag. Higher values help increase FPS and "smoothness". That's all there is to it.

    For more info hover your mouse cursor over the setting in the control panel and read the help tip.
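    That tradeoff can be illustrated with a toy CPU-to-GPU pipeline simulation (all timings invented; this sketches the queuing effect, not the actual driver):

```python
import random

def simulate(queue_depth, frames=2000, seed=0):
    """Toy render pipeline: CPU prep time jitters per frame, GPU render
    time is fixed, and the CPU may run at most `queue_depth` frames
    ahead of what has been displayed. Returns (fps, avg latency in ms)."""
    rng = random.Random(seed)
    gpu_ms = 10.0                      # fixed GPU render time (invented)
    cpu_free = gpu_free = 0.0
    done = []                          # (cpu_start, gpu_finish) per frame
    for _ in range(frames):
        cpu_ms = rng.uniform(4.0, 14.0)            # jittery CPU prep
        # Back-pressure: frame n can't start until frame n-queue_depth
        # has finished, keeping at most queue_depth frames in flight.
        earliest = done[-queue_depth][1] if len(done) >= queue_depth else 0.0
        cpu_start = max(cpu_free, earliest)
        cpu_done = cpu_start + cpu_ms
        gpu_finish = max(gpu_free, cpu_done) + gpu_ms
        cpu_free, gpu_free = cpu_done, gpu_finish
        done.append((cpu_start, gpu_finish))
    fps = 1000.0 * frames / done[-1][1]
    latency = sum(f - s for s, f in done) / frames
    return fps, latency

# Deeper queue: the GPU is rarely starved (higher fps, steadier pacing),
# but each frame's input is sampled further ahead of display (more lag).
for depth in (1, 3):
    fps, lat = simulate(depth)
    print(f"queue depth {depth}: {fps:5.1f} fps, {lat:5.1f} ms input-to-display")
```

    With a depth of 1 the CPU and GPU take turns and the GPU idles during CPU spikes; with a depth of 3 the queue absorbs the jitter, which is exactly the fps-versus-lag tradeoff described above.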
     
    Last edited: Jul 28, 2012
  15. visine

    visine Master Guru

    Messages:
    441
    Likes Received:
    1
    GPU:
    Gigabyte GTX 970 G1
    Thanks for clearing that up. I guess I'd rather have more fps and smoothness, considering I own a BenQ 120hz monitor with 2ms response time and reduced input lag.
     

  16. gx-x

    gx-x Maha Guru

    Messages:
    1,254
    Likes Received:
    94
    GPU:
    MSI 1060 6G Armor

    So, basically you are saying that you know better than nV engineers do. Kudos to you mate...
     
  17. rewt

    rewt Maha Guru

    Messages:
    1,245
    Likes Received:
    0
    GPU:
    Kudos

    All they need to do is compare the r296 and r300 drivers with a pre-render of 1, then compare again against r296 with a pre-render of 0. In my trials there is a notable performance difference between 0 and 1, which proves the settings are not redundant.

    Besides, if they were really concerned about removing redundant and inapplicable settings, why haven't they removed those anisotropic filtering optimizations and corresponding profile bugs long ago?
     
    Last edited: Aug 12, 2012
  18. gx-x

    gx-x Maha Guru

    Messages:
    1,254
    Likes Received:
    94
    GPU:
    MSI 1060 6G Armor
    Because they know exactly what is behind each option: they write the driver and have the source. All of you, here and elsewhere, just speculate. The differences are so marginal in some cases that they are most likely statistical noise.

    "why haven't they removed those anisotropic filtering optimizations"

    Because they work? You can check it yourself; you just need to know WHEN and HOW they work ;) There are articles on it though, so you can probably just google some of those. OR run 3DMark06 with and without them and see the difference in score ;)

    PS: There used to be a -1 setting for pre-rendered frames in NiBiTor, if I am not mistaken... go figure that one out lol... weird... Besides, as has been said, it is CPU related: how many frames the CPU prepares for sending to the GPU, not how many frames the GPU holds in "stock", because in modern games some GPUs struggle to render even one frame per 1/60 of a second...
     
    Last edited: Aug 4, 2012
  19. Falkentyne

    Falkentyne Master Guru

    Messages:
    418
    Likes Received:
    2
    GPU:
    Sapphire HD 7970 Ghz Ed.
    Actually, he does.
    I don't even own an Nvidia card (besides a Geforce 4), but I can tell you directly that the prerender limit functioned better, and PERFECTLY, on Windows XP, while it acts differently on Windows 7 (at least on AMD cards, but I'm betting my buns this also applies to Nvidia). Now, on XP, a few old games (Drakan: Order of the Flame comes to mind) would crash on startup with a prerender limit of 0 (this might have been a driver bug back then on Detonator drivers; I forget if this was fixed), but setting a prerender limit of 0 would almost completely remove any sort of mouse lag if you were at 60 fps or higher, with vsync on.

    However windows 7 has significantly higher mouse lag with the same prerender as XP. And setting a prerender limit of 1 causes strange things to happen that did NOT happen in XP.

    To truly test a prerender limit of 0 in XP, you NEED a CRT monitor. Sorry, LCD guys, but you simply won't be able to tell the smoothness when comparing it with W7 on a 120hz LCD.

    The best test:
    Run UT2004. Enable vsync, use a 60 hz refresh rate with a CRT. With a prerender limit of 0, you will have a very slight lag feeling, but the game will be fully 100% playable and the turning will be completely smooth. (Turn the mouse slowly: you will notice the turning is GLASS SMOOTH, and looks exactly the same as turning your head in real life.) With the default prerender limit (3) and 60hz, you will have lag that makes the game feel as if you are playing in molasses. If you then force the refresh rate to 100, you will see the lag get much lower. Basically, the lag at a 100 hz refresh rate with limit=3 will feel about the same as a 60hz refresh rate with a prerender limit of 0.

    Now just for kicks, set a limit of 15 (you may have to edit the registry for this; at least it works with AMD cards, via a FlipQueueSize=15 string value). Still in XP. Now, in UT, you will have about HALF a second of mouse input lag at a 60 hz refresh rate. And you will notice it ALL the time.

    Now going back to windows 7.
    Prerender limit of 1 in UT: 60hz refresh rate:
    The first thing you will notice is that there is more input lag than there was in XP. Also, the game does NOT feel anywhere near as smooth; it will seem as if the game is 'jumping' from pixel to pixel instead of smoothly turning (you will only notice this on a CRT screen! LCDs are NOT fast enough!!). And the mouse lag will be much more noticeable and annoying. Also, if you do this in CS:GO (except use a 100 hz refresh rate now), on the main menu the mouse pointer movement will be jittery instead of smooth (AGAIN, you need a CRT to notice this!). Also, in CS:GO, you will get horrible frame jittering in many areas when close to a wall (MOST noticeable by the fences in Train at T spawn).

    Prerender limit of 2: 60hz refresh rate:
    CS:GO: mouse pointer smooth in the main menu. The jittering is gone on Train, by the fences at T spawn. UT is STILL not smooth (but it's smoother). Mouse lag now makes 60hz with vsync on just unplayable.

    Prerender limit of 3 (aka default): 60hz refresh rate:
    UT is smooth now. No frame skipping. But mouse lag makes this unplayable.

    Limit 15:
    Ok, now we see for sure where W7 and XP differ.
    At 15, you will see only SLIGHTLY higher mouse lag than at 3 (default), but it seems like some areas of the game (CS:Go) will suddenly cause a HUGE increase in mouse lag while other areas will be fine. UT will have slightly higher mouse lag but not the 1/2 second lag of XP.

    So it definitely is a big difference compared to XP. The jitteriness of the mouse cursor in CS:GO (at a 100 hz refresh rate, mind you) with a prerender of 1, as well as the UT2004 panning jitteriness at 60 hz, is a giveaway that something different is going on.

    However, if you use a 100hz refresh rate in UT instead of 60 hz (in W7), a prerender limit of 1 is glass smooth with no noticeable input lag, while XP was glass smooth at 60 hz.

    TL;DR: Basically, in XP: Set a prerender limit (or Flip Queue Size) of 0 and leave it there and have NO drawbacks. In W7, setting "1" (lowest value) has drawbacks that were NOT present in XP at 0 OR 1.

    If you guys want to test that in 7 on your Nvidia cards, go ahead.
    Remember, vsync must be enabled; otherwise you will hardly notice anything. But people without 120hz screens who are capped at 60 fps are DEFINITELY getting the short end of the stick in Windows 7.
     
  20. rewt

    rewt Maha Guru

    Messages:
    1,245
    Likes Received:
    0
    GPU:
    All optimizations besides trilinear have no effect on current hardware. Trilinear optimization is also a redundant setting since HQ & Q modes toggle it automatically.
     
