Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Tastic, Jul 16, 2012.
this. Also he's using an Athlon 64. So yeah.
People said AMD removed the feature to adjust "flip queue" in their driver. If that's true then we are lucky Nvidia drivers still have the option.
I saw Far Cry 3 had an option to adjust pre-render limit in-game, and it works. Nicely done. (I think 1 is minimum though)
...and NFS MW 2012 isn't really working well for most people. Not much to do with hardware :bang:
Sorry folks but I've discussed this with the driver team and the setting between 0 and 1 is purely cosmetic as there is essentially no difference in how our GPU behaves between the values of 0 and 1. For this reason we will leave the Control Panel UI as it is for now.
This is pure bs.
I wouldn't care if it was removed at all, since at 1 I lose roughly 10 FPS in Skyrim compared to 2 and up.
I'd rather have 60 FPS with input lag in menus than 40-50 FPS and a smooth mouse.
Setting 0 hasn't done anything since Windows Vista; that changed when MS changed the display driver model - http://en.wikipedia.org/wiki/Windows_Display_Driver_Model
WDDM added an additional layer to the driver model, so the driver doesn't really talk directly to the hardware anymore. Essentially there is an additional buffer in place that wasn't there with Windows 2000/XP.
I don't know what the driver did with the 0 setting in earlier drivers though; it seemed inconsistent from game to game. I think it defaulted to the Application Default of 1 or 3 depending on the game.
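To see why queue depth matters at all, here's a toy model of a flip queue (my own sketch, not the actual driver internals): the CPU pushes finished frames into a FIFO, the display pops the oldest, so a saturated queue of depth N delays any input by roughly N frames.

```python
from collections import deque

def frames_of_latency(queue_depth, n_frames=10):
    """Toy model: the CPU queues prepared frames, the display shows the
    oldest one each tick. Returns how many frames pass before an input
    submitted at frame 0 actually appears on screen."""
    queue = deque()
    # Pre-fill the flip queue to its depth, as a saturated GPU-bound game would.
    for i in range(queue_depth):
        queue.append(("pre", i))
    for frame in range(n_frames):
        queue.append(("input" if frame == 0 else "idle", frame))  # CPU submits
        tag, _ = queue.popleft()                                  # display pops
        if tag == "input":
            return frame  # frames elapsed before the input was shown
    return -1

print(frames_of_latency(1))  # pre-render limit 1
print(frames_of_latency(3))  # pre-render limit 3: same input shows up later
```

Nothing here models the real WDDM buffering, but it illustrates why people feel a difference between 3 and 1, and why 0 vs 1 is a much smaller argument.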
Just put the option back but make it do nothing at all. You will see people flood you with thanks and graphs showing you how much better it "feels"
Placebo is one hell of a drug.
LOL. Unfortunately what would happen is people would file bugs saying there is no effect in changing from 1 to 0 and that we didn't fix it right.
People believe what their little minds tell them, not what is true or false or anything like that. People need to let this go, much like the complaints about color profiles not working in fullscreen apps and acting like it's the driver's fault. People just like things to complain about.
These are from the Nvidia forums and their response:
The 0 setting is nonsensical. Really. Honestly. I promise.
You can't produce graphics without the CPU preparing the frame. The GPU is at the end of the workflow.
nvtweakman, I won't say there isn't some driver weirdness going on between the settings you're demonstrating, but the CPU is preparing a singular frame. The anomaly may be caused because the graphics driver is attempting to do something impossible (render the frame before it's setup) and stalling. This would potentially create more input lag, not less.
NVIDIA | EBC Technology Manager
GeForce Forums - Special Counsel
Posted 5 hours ago
I certainly am, on occasion.
However, in this case, this information comes from the folks that create the driver. Either a 1 or a 0 resulted in an effective Maximum Pre-Rendered Frames setting of 1 frame.
We removed the 0 value because it never had an impact and is, in fact, impossible. The GPU cannot render any frames without the CPU having prepared them first.
snikt said: Well, I'm sorry, your imperial highness, but you are wrong.
NVIDIA | EBC Technology Manager
GeForce Forums - Special Counsel
Posted 12/11/2012 10:27 PM
Let's not forget when talking about this stuff that devs who don't want to do something always have a lot of arguments that it shouldn't be done. 'It takes too long' is the most common, but 'it doesn't work that way' is pretty common too. I think it's fairly safe to say that the Nvidia driver dev team has taken a qualitative dive in the last... oh, I don't know... 3-4 years or so. (Of course the driver is a lot more advanced these days, but still.)
On top of this, Manuel has twice in the past acknowledged that this setting does make a difference. Once when he said 'this hasn't made a difference since Win XP' (or maybe that wasn't you but some other NV rep), and the second time when he said 'I talked to our devs, they said the difference is minuscule'. I'm paraphrasing of course, but this is the gist of it.
And the funniest part of this is always the 'you guys who can feel the difference are experiencing the placebo effect'. There's plenty of scientific evidence supporting the fact that some people can detect things others can't. Just because you are blind doesn't mean we are experiencing the placebo effect.
Actually, you are describing the placebo effect right there =P.
I somehow tend to believe the driver developer instead of <random internet guy> in this case.
I'm pretty fussy about input lag; not many monitors have decent response times, and it took me long enough to move away from CRTs.
I have played FPS on PC since the Quake days, and although I'm not as serious (hurr) a player as I once was, I still destroy most pub games.
I messed with pre-rendering a lot, and IMO the default "3" setting is quite bad but obviously gives the best frames per second.
But I could never tell a difference between 1 and 0; most monitors add at least 1 frame of lag (a lot add much more),
so when set to 1 the pre-render queue is in most cases the smallest part of the chain that's causing input lag.
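That "smallest part of the chain" point can be put as back-of-envelope arithmetic (my own illustration; the names and the one-frame monitor figure are assumptions, not measurements):

```python
FRAME_MS = 1000 / 60  # ~16.7 ms per frame at 60 FPS

def total_lag_ms(prerender_frames, monitor_frames=1.0):
    """Crude latency budget: pre-render queue plus display lag,
    both expressed in whole frames at 60 FPS. Ignores game logic,
    scan-out, USB polling, etc."""
    return (prerender_frames + monitor_frames) * FRAME_MS

print(f"pre-render 3: {total_lag_ms(3):.1f} ms")
print(f"pre-render 1: {total_lag_ms(1):.1f} ms")
# Dropping from 3 to 1 saves two whole frames (~33 ms); a hypothetical 0
# could save at most one more frame (~17 ms), which a laggy monitor
# easily swamps on its own.
```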
It also doesn't make sense how the CPU could possibly pre-render no frames; surely that would cause lag the other way (the GPU waiting for the CPU?).
Maybe someone could explain that in detail.
It doesn't; the GPU has to wait for the CPU to give it new information about the game world updates (physics + movement + everything) before the GPU can work on what needs to be seen in the view/camera.
So the GPU just doesn't render a new screen and waits until the CPU sends what changed in the view/camera, hence the frame rate will be low.
If the CPU doesn't send any information to the GPU, the game might very well be stuck onscreen with nothing happening and, most of the time, little or no response from any input (keyboard/mouse/gamepad).
This is why the 0 setting could potentially create more trouble than it's worth. You're essentially stalling both the CPU and GPU, reducing performance.
Edit - originally I said it potentially creates more input lag, but this is only what Nvidia would have us believe with recent statements on their forums.
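The stall described above can be sketched as a tiny tick-by-tick simulation (a toy of my own, not driver behavior): whenever the CPU hasn't handed over a prepared frame, the GPU simply has nothing to draw.

```python
def run(cpu_ready):
    """cpu_ready[t] says whether the CPU finished preparing a frame at
    tick t. The GPU renders only when a prepared frame is waiting;
    otherwise it stalls, which is exactly the low-frame-rate case."""
    pending, log = 0, []
    for ready in cpu_ready:
        if ready:
            pending += 1          # CPU hands a prepared frame to the driver
        if pending:
            pending -= 1
            log.append("render")  # GPU consumes the prepared frame
        else:
            log.append("stall")   # no CPU data -> no new screen
    return log

print(run([True, False, True, True]))
```

With no queue at all, every CPU hiccup becomes a visible GPU stall; a pre-render queue of 1+ exists precisely to absorb those hiccups.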
Yes, with Windows XP the 0 setting surely had an effect. I'm also able to demonstrate visually that it creates the perception of reduced input lag. But unfortunately, the setting was also removed from the Windows XP drivers as of r300. There really is no proper excuse Nvidia could provide for this.
The rule is: the stronger the CPU is, the more pre-rendered frames it can compute easily...
The best is "2"... not the default 3... I can't understand how 3 predominated...
2 is the standard for OpenGL, at least.
I recently posted the following in another topic but I think it deserves to be here also.
I can't imagine all of NVIDIA's employees failing to experiment with input lag to find the truth, but that appears to be the case judging by what you and other gamers have said ever since the 0 setting was removed.
It seems they need to hire this guy:
I've followed similar methodology and there really is a clear difference between pre-rendered frames 0 and 1. In the above example, the console window would lag slightly less behind the mouse cursor @0 than @1 (it's most easily detected with vsync enabled).
No, it is bs. 0 cannot exist.
This is all done on the CPU before rendering a single frame; for the OpenGL stack:
Set up OpenGL context.
Set up shader program (fragment+vertex+geometry shaders).
Set up your uniforms (constants across shaders in a single shader program).
Upload your buffers to the GPU (vertex,normal,uv, index and texture data).
Issue draw call to the GPU. <- This is where the GPU work starts and CPU work ends.
This is all done on the CPU before each frame. The only difference with DirectX is the uniform behaviour (cbuffers in DirectX) and the shader program process (DirectX uses pre-compiled shaders).
You can't issue a draw call without doing all of that. The GPU doesn't magically get all the data it needs just because you set pre-rendered frames to 0.
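The ordering constraint in the steps above can be mocked up in a few lines. This is a hypothetical stand-in (the class and method names are mine, not any real GL binding), but the rule it enforces mirrors the real API: a draw call before the CPU-side setup is simply an error, not a shortcut.

```python
class FakeGL:
    """Toy stand-in for a GL context: shaders and buffers must be
    staged by the CPU before any draw call can run on the GPU."""
    def __init__(self):
        self.program = None
        self.buffers = None

    def use_program(self, prog):
        self.program = prog   # CPU binds the shader program

    def upload_buffers(self, data):
        self.buffers = data   # CPU uploads vertex/index/texture data

    def draw(self):
        # The GPU cannot start until the CPU has staged everything.
        if self.program is None or self.buffers is None:
            raise RuntimeError("draw call issued before CPU-side setup")
        return f"drew {len(self.buffers)} vertices"

gl = FakeGL()
try:
    gl.draw()  # the "0 pre-rendered frames" fantasy: GPU with no CPU work
except RuntimeError as e:
    print(e)

gl.use_program("basic_shader")
gl.upload_buffers([(0, 0), (1, 0), (0, 1)])
print(gl.draw())  # only now does the GPU have anything to work on
```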