Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Tastic, Jul 16, 2012.
Wait Astyanax is Sora? It all makes sense now
I'd be surprised if Unwinder is the only one that knew all this time.
Are we talking about pre-rendering?
Anyway, there's a lot of talk about what's going on in both AMD's and Nvidia's low-latency methods, and the general consensus is that both dynamically alternate the pre-render queue between 0 and 1.
This seems to test true, as games that don't like it speed up and slow down randomly,
but more testing and discussion will happen over the next few days.
Like @RealNC stated, it really depends on the in-game limiter, and how much it fluctuates, but -3 FPS should usually be pretty safe with either an RTSS or in-game limiter, at least with G-SYNC proper (aka G-SYNC w/module; I haven't directly tested -3 FPS limit with G-SYNC Compatible myself, but that should be similar too).
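To make the rule of thumb above concrete, here is a small sketch (the helper name and numbers are mine, not from RTSS or any official tool) that turns a refresh rate into a cap a few FPS below it, using the -3 margin mentioned above:

```python
# Hypothetical helper illustrating the "-3 FPS below refresh" rule of thumb
# for G-SYNC/VRR: cap slightly under the refresh rate so limiter fluctuation
# doesn't push frame delivery above the VRR window.

def vrr_fps_cap(refresh_hz, margin=3):
    """Return (fps_cap, frame_time_ms) for a given display refresh rate."""
    cap = refresh_hz - margin          # e.g. 144 Hz -> 141 FPS cap
    frame_time_ms = 1000.0 / cap       # the matching frame-time limit
    return cap, frame_time_ms
```

For a 144 Hz panel this gives a 141 FPS cap, i.e. a frame-time limit of roughly 7.09 ms, which you could enter into RTSS or an in-game limiter.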
Ok, this is quite the revelation. No wonder he is such a... well, you know. It does make sense; I was quite surprised to see someone so toxic on these forums. I honestly had to use the ignore function on him; reading what he says and the tone he uses made me sick. Quite sad he hasn't got a ban yet; he's the same douchebag here as he is on the Nvidia forums.
More like "the Nvidia forums kid mod with attention issues and no self-esteem taking it out on random strangers."
Sora is probably the most obnoxious individual I've seen in my life on any forum, and I've been on several. And it's quite funny that he manages that without even openly offending anyone; his strange, hostile-to-passive behavior alone, probably born of a bitter life and taken out on others, is enough.
That's news to me; you never mentioned it before. Anyway, it's a good thing Nvidia stopped themselves from opening up a huge can of ugly worms by backtracking on their statement about the nature of GRD and CRD/SD.
I think that's enough thread derailment here. And I know it's partly my fault as well.
Nvidia's article never mentioned that the feature supports OpenGL, only that it supports DX9 & DX11 and does NOT support DX12 & Vulkan. Has anyone tested whether the low latency feature (specifically the Ultra setting) does anything in OpenGL games?
Nvidia's article states that Ultra setting enables just-in-time frame scheduling, which I assume means that the queue is empty most of the time but is filled by a single frame just as the GPU is ready to render it. With the On setting, the queue is immediately filled with the next frame the moment it becomes empty. So yeah it's always alternating between 0 and 1, with the 0's having much longer duration on Ultra setting and the 1's having much longer duration on the On setting.
I suppose the problem is if the GPU is ready but the queue is still empty due to bad timing or CPU being too busy, the GPU may stall or go into low power mode which causes stutters/slowdowns/speedups if the mistimings happen frequently and inconsistently enough.
Anyway it's still a beta feature, so Nvidia will do more tweaks to get the timing right.
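The queue behavior described above can be sketched as a toy simulation. This is purely hypothetical (the function, tick model, and `submit_lead` parameter are my own assumptions, not NVIDIA's actual scheduler): "On" refills the 1-frame queue as soon as it empties, so occupancy sits at 1 most of the time, while "Ultra" submits just-in-time, so it sits at 0 most of the time.

```python
# Toy model (NOT NVIDIA's implementation) of the two pre-render queue
# policies: "on" refills the queue the moment it empties; "ultra" submits
# a frame only just before the GPU is ready to pick it up.

def simulate(policy, ticks=1000, gpu_frame_time=10, submit_lead=1):
    """Count how many ticks the 1-slot pre-render queue holds 0 vs 1 frames."""
    queue = 0                     # frames currently queued (0 or 1)
    gpu_busy_until = 0            # tick at which the GPU finishes its frame
    occupancy = {0: 0, 1: 0}
    for t in range(ticks):
        occupancy[queue] += 1
        gpu_ready_soon = (gpu_busy_until - t) <= submit_lead
        if queue == 0:
            if policy == "on":
                queue = 1                      # refill immediately
            elif policy == "ultra" and gpu_ready_soon:
                queue = 1                      # just-in-time submission
        if queue == 1 and t >= gpu_busy_until:
            queue = 0                          # GPU dequeues the frame
            gpu_busy_until = t + gpu_frame_time
    return occupancy
```

Running both policies shows exactly the pattern described: `simulate("on")` spends most ticks at occupancy 1, `simulate("ultra")` spends most ticks at occupancy 0, with both still alternating between the two states.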
Well, the old implementation certainly did; well before I learned the few-frames-below-triple-buffering trick, I was using pre-render to shave off input lag in eduke32.
Not supporting DX12 and Vulkan is very much expected; the developer has very fine-grained control over those APIs, and I remember Manuel saying something about it.
I'd be worried about the benefits of those APIs being undermined by the driver interfering.
A user over on the GF forums has a post up asking to get the original implementation back... going to go suggest a compromise.
Yeah, that's probably the most important new feature of these drivers.