The truth about PRE-RENDERING 0?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Tastic, Jul 16, 2012.

  1. Corrupt^

    Corrupt^ Ancient Guru

    Messages:
    7,270
    Likes Received:
    600
    GPU:
    Geforce RTX 3090 FE
    My whole point about everything input-lag related is that people are sometimes too focused on "the human eye".
    Sure, an image looks smooth at 25 fps, but the gameplay isn't. That's input lag. Even if your eyes can't see it, you're moving your mouse around, so you're getting feedback through another sense besides sight.
    As soon as those two (sight and touch) feel "out of sync", it's usually input lag. I like the F1 driver comparison: just like an experienced F1 driver, an experienced FPS player can spot differences in input lag very quickly.

    I even notice the difference between 120 Hz and 60 Hz on the desktop within 10 seconds (usually from the ghosting on the mouse cursor).

    So the naysayers need to learn to think outside the box for once:

    1) Just because you can't sense it doesn't mean others are talking BS
    2) Stop thinking with "just sight"

    I personally don't think human sight is all that amazing, but I do believe experienced people can sense minute differences when using more of their senses.

    Also, in one of the video interviews about Rage, John Carmack mentioned that as FPS increases, input lag decreases. Hence why you'll still see Quake players running fps well beyond their refresh rate if they can.
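    (A quick back-of-the-envelope illustration, with made-up numbers: each frame waiting in the queue costs roughly 1000/FPS milliseconds, so one queued frame adds about 16.7 ms at 60 fps but only 4 ms at 250 fps.)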

    Every competitive FPS player will condemn VSync as the mother of all evil, since it adds a massive amount of input lag.
     
    Last edited: Jul 19, 2012
  2. rewt

    rewt Guest

    Messages:
    1,279
    Likes Received:
    15
    GPU:
    Yes
    Likewise, because all you have done so far is post nonsense and start fights. You have provided no valuable contribution to this thread, so do everyone a favor and never visit this topic again.

    Right on, you've pretty much summed it up. Putting on ignore.
     
    Last edited: Jul 19, 2012
  3. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
    Reading this, I thought at first that I would be agreeing with Prophet.
    I don't, though. You come off instead as a pseudo-intellectual.
    Your comments on the placebo effect are comical. True, since we are all human we are all potentially subject to it, but that doesn't change the facts.
     
    Last edited: Jul 19, 2012
  4. Brendruis

    Brendruis Maha Guru

    Messages:
    1,242
    Likes Received:
    1
    GPU:
    Reference GTX 680 SLI
    Interesting results. I haven't touched this setting in several graphics card generations. I thought the default setting was 3?
     

  5. tweakpower

    tweakpower Banned

    Messages:
    932
    Likes Received:
    1
    GPU:
    MSI HD 6770
    I took your side here, but not because of this picture. If you say there is a difference, I, for one, have no reason not to believe you, since I've had some experience with OpenGL games such as Doom 3, Prey, Quake 4, etc.

    But this graph shows something different to me. If I understand it correctly, orange is 0, red is 1, green is 2, and so on. As you can see from the graph, orange (0) always stays close to green (2), which tells me that when you select 0, the driver defaults it to 2, give or take a small measurement error (the small differences in the graph).

    Long story short, some people have better reflexes than others. Better reflexes don't necessarily mean being better at the game itself. Sometimes, especially in games (not only on PC, on consoles too), faster reflexes can introduce more problems than they solve. So in the end it's about what you prefer, or what you can feel: if you can't feel a difference, good for you; if you can, deal with it. None of the corporations, chip makers, or even game makers will waste their time on a small percentage of people, because time = money.

    No need to ignore opposing opinions, really. If nothing else, it's interesting to see all sides.
     
    Last edited: Jul 19, 2012
  6. Prophet

    Prophet Master Guru

    Messages:
    865
    Likes Received:
    34
    GPU:
    Msi 680
    Comical? Please point out one fact I wrote that's not correct and, unlike rewt, support it with some kind of evidence. Just because you don't understand placebo doesn't mean it doesn't work the way I wrote. You are just doing the same as him so far. You, sir, are the victim of herd mentality.
     
  7. Prophet

    Prophet Master Guru

    Messages:
    865
    Likes Received:
    34
    GPU:
    Msi 680
    Yeah, it's the same as always: when new information is brought to the table, people don't act on the information itself but according to the assumptions they already have. So if someone plays with vsync and feels it's good, and you tell them it adds input lag, they (well, not everyone obviously, just most) tend to react negatively.

    If you are curious about why you get faster reactions and increased sensitivity to input lag etc., you might want to read up on myelin. Basically it insulates nerves in the brain; the more you use a nerve, the more myelin wraps around it, making the nerve impulse faster.
     
  8. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
    I know, I know. No one here could possibly be as smart as or smarter than you.
    What you said reads as if, since we are all potentially subject to the placebo effect, nothing we say can possibly be factual. (Or at least anything anyone ELSE says but you.)
    And it not only reads like that, you act the part pretty well too.

    For example, when shown the graph (which I have to assume is correct, though it changes nothing if it's not), you had to refuse to believe it and agree for other reasons instead: "my experience with this, my feelings about that". The graph was fine and all, but the fat lady only sings when "I" agree, on my terms ;)
     
  9. rewt

    rewt Guest

    Messages:
    1,279
    Likes Received:
    15
    GPU:
    Yes
    2 for OpenGL ;)

    Thanks :)

    It seems 1 is the minimum value for both OpenGL and Direct3D in r300 drivers and above. Zero means the value is driver/application controlled.
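    On the Direct3D side, "application controlled" just means a game can request its own frame-latency limit through DXGI. A minimal sketch (assuming an existing ID3D11Device* and skipping error handling; the function name is made up for illustration):

        #include <d3d11.h>
        #include <dxgi.h>

        // Ask DXGI to queue at most 'frames' pre-rendered frames for this device.
        void SetAppFrameLatency(ID3D11Device* device, UINT frames)
        {
            IDXGIDevice1* dxgiDevice = nullptr;
            if (SUCCEEDED(device->QueryInterface(__uuidof(IDXGIDevice1),
                                                 reinterpret_cast<void**>(&dxgiDevice))))
            {
                dxgiDevice->SetMaximumFrameLatency(frames); // 1 = lowest the queue allows
                dxgiDevice->Release();
            }
        }

    The driver setting can still cap whatever the application asks for, as noted in my next post.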
     
    Last edited: Jul 19, 2012
  10. rewt

    rewt Guest

    Messages:
    1,279
    Likes Received:
    15
    GPU:
    Yes
    You can do some experimenting of your own with the following tool.

    Notice how the Nvidia driver tends to override application RenderAhead values that are higher than the driver setting, but not lower ones. This is partly because of the way DirectX and the drivers manage the command queue, as I mentioned earlier.

    http://www.kegetys.fi/misc/
     
    Last edited: Jul 19, 2012

  11. [Arnold]

    [Arnold] Guest

    Messages:
    30
    Likes Received:
    0
    GPU:
    EVGA GTX 680 2 GB
    For BF3 there is an ingame setting called RenderDevice.ForceRenderAheadLimit with values of -1, 0, 1, 2 and 3. From what is known, -1 uses the GPU driver setting, 0 disables the feature, and 1-3 set the number of pre-rendered frames. Use this setting in the ingame console, or write it into a user-generated "user.cfg" inside the application folder to make it permanent.
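    For example, to lock it to one pre-rendered frame, the whole user.cfg can be just this one line (the value 1 is only an example):

        RenderDevice.ForceRenderAheadLimit 1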

    Playing on a 120 Hz TFT with a single GTX 680, I found my solution in NVIDIA's Adaptive VSync at half refresh rate. That way I get stable vsynced 60 fps on a 120 Hz refresh rate with no perceptible input lag at all, using a pre-rendered frames setting of 1. Until I go for an SLI setup to hold a steady 120 fps, that is a pretty decent setup for me.
     
  12. snowdweller

    snowdweller Guest

    Messages:
    492
    Likes Received:
    0
    GPU:
    Leadtek GTX 580 SLI
    Well, when you're playing games like Solitaire, then yes, 300+ fps probably won't give visible tearing, as the game is sending out a lot more frames than the refresh rate, hence you will see a complete picture 99% of the time. Compare that to, say, Crysis at 150 fps on a 120 Hz screen. Did I ever use a CRT? The answer is yes; unfortunately it was a 21", which was not fun to carry around to LANs.
     
  13. Brendruis

    Brendruis Maha Guru

    Messages:
    1,242
    Likes Received:
    1
    GPU:
    Reference GTX 680 SLI
    There is tearing on CRTs too; it isn't unique to LCDs... it's just that on a CRT you can raise the refresh rate higher, so it is much less noticeable. You can get the same effect by buying a 120 Hz LCD these days :)
     
  14. Raiga

    Raiga Maha Guru

    Messages:
    1,099
    Likes Received:
    0
    GPU:
    GPU
    It's advisable to play with a pre-render limit of 1-2, because otherwise any CPU-intensive game is going to spoil your consistent frame rate.

    Also, there isn't much loss with 1 or 2 frames ahead if the game is running at more than 30 FPS: that translates into at most 33.3 or 66.7 ms of added render latency, which is perfectly acceptable for most uses.
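    (The arithmetic behind those figures: added queue latency is roughly N pre-rendered frames × 1000/FPS ms, so at 30 FPS that's 1 × 1000/30 ≈ 33.3 ms and 2 × 1000/30 ≈ 66.7 ms. At higher frame rates the same N costs proportionally less.)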
     
  15. Prophet

    Prophet Master Guru

    Messages:
    865
    Likes Received:
    34
    GPU:
    Msi 680

    I have 140 fps consistently in the game of my choice at the moment. Nothing spoiled there.

    Also, I cannot tell you how much I disagree with 34-70 ms of latency being acceptable. I can accept some lag in games where instant reactions don't matter much, MMORPGs for example. But even there, 70 ms would be a lot. In FPS games, which is what I mostly play, anything more than what's absolutely necessary is simply unacceptable. Also, I'd like you to confirm those numbers if possible.
     

  16. rewt

    rewt Guest

    Messages:
    1,279
    Likes Received:
    15
    GPU:
    Yes
    Yeah, and given that a value of 0 can hurt fps by as much as 25% in games like Skyrim, that's probably another reason Nvidia chose to remove it.
     
  17. Raiga

    Raiga Maha Guru

    Messages:
    1,099
    Likes Received:
    0
    GPU:
    GPU
    Oh, sorry, I was talking about 33 ms (for pre-render = 1) and 66 ms (for pre-render = 2) for anything running at 30 FPS.

    If the frame rate is higher, then the delay for 1 or 2 pre-rendered frames is also a lot less: 140 FPS with pre-render 1 or 2 would be around 7 ms or 14 ms,

    because 7 ms (as a round figure) is the cost to render each frame on your GPU (for it to run at 140 FPS).

    -----
    Edit
    Anyway, again, it's actually destructive to completely remove the pre-render limit.

    By any standard it should be at least 1 frame, because the CPU-side calculations in games are not clamped at a certain point. If they were, you would never need a pre-render limit.

    Edit 2
    But there could be another way: if the game calculation and the render push to the GPU are decoupled (I don't know the actual terminology).

    Even if the game state (physics, engine, etc.) isn't updated in a frame, the CPU should still push a frame to the GPU and re-render the not-yet-updated game state on screen.

    So in a nutshell, you could have:

    (game state/calculation) -> update whenever possible -> (game render pool) -> push every frame (60) -> GPU to render

    In this case, GPU utilization would also be high.

    Or is it already like that? <- yikes, I am completely unfamiliar with this.

    Edit 3
    With the above, the input related to the camera view could be tied to that GPU rendering with priority (with or without the not-yet-updated state), which would give you smooth camera motion even when the game (worldspace) calculations aren't updated.

    So you don't get dodgy camera movements.
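    If it helps, here is a rough sketch of the loop I am imagining, in C++ (every name is invented for illustration; I don't know how real engines structure this):

        #include <chrono>

        // Hypothetical engine stubs.
        void update_game_state(double /*dt*/) { /* physics, AI, worldspace... */ }
        void sample_camera_input()            { /* read the mouse for the view */ }
        void render()                         { /* push the latest state to the GPU */ }

        int main() {
            using clock = std::chrono::steady_clock;
            const double dt = 1.0 / 60.0;   // fixed simulation step: 60 ticks/s
            double accumulator = 0.0;
            auto last = clock::now();
            for (int frame = 0; frame < 1000; ++frame) {  // stand-in for "while running"
                auto now = clock::now();
                accumulator += std::chrono::duration<double>(now - last).count();
                last = now;
                while (accumulator >= dt) {  // tick the sim zero or more times to catch up
                    update_game_state(dt);
                    accumulator -= dt;
                }
                sample_camera_input();       // camera tied to render rate, not tick rate
                render();                    // push a frame even if no tick completed
            }
        }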

    (I apologize for the many edits "-_- )
     
    Last edited: Jul 20, 2012
  18. rewt

    rewt Guest

    Messages:
    1,279
    Likes Received:
    15
    GPU:
    Yes
    The command queue also serves to minimize those costly user/kernel-mode transitions. It's generally just a bad idea to disable it, even if the system is able to pump out a huge number of frames. Like Raiga said, the higher the FPS, the lower the latency caused by pre-rendering.
     
  19. Prophet

    Prophet Master Guru

    Messages:
    865
    Likes Received:
    34
    GPU:
    Msi 680
    Do you have anything to support your claims about how long it takes to calculate a frame? Not saying you are wrong; that's pretty consistent with how I understand it and with my gaming experience. But still, do you have a reliable source?

    My CPU/GPU utilization is pretty high, about as much as I'd expect from a subpar game engine. What I mean by this is that my hardware completely outranges the game engine / settings I'm currently using (BFBC2, i.e. Frostbite 1.5?, with pretty much all the lowest settings and the lowest possible screen resolution before it breaks outside the 'schweden' borders).

    I'll also see if my DPC and CPU/GPU utilization change when changing pre-rendered frames. I suspect the know-it-all rewt is yet again just using big words.

    The camera doesn't feel or react dodgy when I set it to 0. It just feels laggy when I set it to 1 or higher. I have only tested this thoroughly with BFBC2, though; I don't know how other games would react.

    I guess most would say 'OK, 7 ms is nothing to worry about'. But I feel those milliseconds and they affect my aim. Heck, I have even rather extensively tested pretty much all the Intel INFs, because they give me different DPCs (and different stability in mouse rate).

    So, how do we get John Carmack in here to pitch in? :)
     
    Last edited: Jul 20, 2012
  20. Raiga

    Raiga Maha Guru

    Messages:
    1,099
    Likes Received:
    0
    GPU:
    GPU
    When I said dodgy, I meant the whole game's frames when your FPS is 30, which means the camera turning is also only updated at 30 ticks per second.

    Lol, how absurd. Are you absolutely sure about 7 ms affecting your aim?

    Try -> http://www.humanbenchmark.com/tests/reactiontime/index.php

    You might just be under the false impression that you can predict target movements in the game and react accordingly, rather than actually being fast. But in a real sense you are not.
     
    Last edited: Jul 20, 2012
