Guru3D.com Forums

Videocards - NVIDIA GeForce Drivers Section In this section you can discuss everything GeForce driver related. GeForce ForceWare and GeForce Experience drivers are for NVIDIA Quadro and all GeForce based videocards.



  (#101)
Prophet
Master Guru
 
 
Videocard: Msi 680 Gtx Twin Frozr
Processor: Intel Sb@4.7
Mainboard: Asus P8Z68V Progen3
Memory: 12 Gb Kingston
Soundcard: Asus Essence STX|Akg k701
PSU: Corsair 1200w
Default 08-04-2012, 21:25 | posts: 776 | Location: Heaven

Quote:
Originally Posted by Falkentyne View Post
Actually, he does.
I don't even own an Nvidia card (besides a GeForce 4), but I can tell you directly that the prerender limit functioned better, and PERFECTLY, on Windows XP, while it acts differently on Windows 7 (at least on AMD cards, but I'm betting my buns this applies to Nvidia too). Now, on XP, a few old games (Drakan: Order of the Flame comes to mind) would crash on startup with a prerender limit of 0 (this might have been a driver bug back then on Detonator drivers; I forget if it was fixed), but setting a prerender limit of 0 would almost completely remove any sort of mouse lag if you were at 60 fps or higher with vsync on.

However, Windows 7 has significantly higher mouse lag with the same prerender setting as XP. And setting a prerender limit of 1 causes strange things to happen that did NOT happen in XP.

To see the true test of a prerender limit of 0 in XP, you NEED a CRT monitor. Sorry, LCD guys, but you simply won't be able to tell the smoothness when comparing it with W7 on a 120 Hz LCD.

The best test:
Run UT2004. Enable vsync, use a 60 Hz refresh rate with a CRT. With a prerender limit of 0, you will have a very slight lag feeling, but the game will be fully 100% playable and the turning will be completely smooth. (Turn the mouse slowly and you will notice the turning is GLASS SMOOTH, and looks exactly the same as turning your head in real life.) With the default prerender limit (3) and 60 Hz, you will have lag that makes the game feel as if you are playing in molasses. If you then force the refresh rate to 100, you will see the lag get much lower. Basically, the lag at a 100 Hz refresh rate and limit=3 will feel about the same as a 60 Hz refresh rate and a prerender limit of 0.

Now just for kicks, set a limit of 15 (you may have to edit the registry for this; at least it works with AMD cards, via a FlipQueueSize=15 string value). Still in XP. Now, in UT, you will have about HALF a second of mouse input lag at a 60 Hz refresh rate. And you will notice it ALL the time.

Now going back to windows 7.
Prerender limit of 1 in UT: 60hz refresh rate:
The first thing you will notice is that there is more input lag than there was in XP. Also, the game does NOT feel anywhere near as smooth; it will seem as if the game is 'jumping' from pixel to pixel instead of smoothly turning (you will only notice this on a CRT screen! LCDs are NOT fast enough!). And the mouse lag will be much more noticeable and annoying. Also, if you do this in CS:GO (except use a 100 Hz refresh rate now), on the main menu the mouse movement of the pointer will be jittery instead of smooth (AGAIN, you need a CRT to notice this!). Also, in CS:GO, you will get horrible frame jittering in many areas when close to a wall (MOST noticeable by the fences in Train at T spawn).

Prerender limit of 2: 60hz refresh rate:
CS:GO: mouse pointer smooth in the main menu. The jittering is gone on Train, by the fences at T spawn. UT STILL is not smooth (but it's smoother). Mouse lag now makes 60 Hz with vsync on just unplayable.

Prerender limit of 3 (aka default): 60 Hz refresh rate:
UT is smooth now. No frame skipping. But mouse lag makes this unplayable.

Limit 15:
Ok, now we see where W7 and XP differ for sure.
At 15, you will see only SLIGHTLY higher mouse lag than at 3 (default), but it seems like some areas of the game (CS:GO) will suddenly cause a HUGE increase in mouse lag while other areas will be fine. UT will have slightly higher mouse lag, but not the 1/2 second lag of XP.

So it definitely is a big difference compared to XP. The jitteriness of the mouse cursor in CS:GO (at a 100 Hz refresh rate, mind you) with a prerender of 1, as well as the UT2004 panning jitteriness at 60 Hz, is a giveaway that something different is going on.

However, if you use a 100 Hz refresh rate in UT instead of 60 Hz (in W7), a prerender limit of 1 is glass smooth with no noticeable input lag, while XP was glass smooth at 60 Hz.

TL;DR: Basically, in XP: set a prerender limit (or Flip Queue Size) of 0, leave it there, and have NO drawbacks. In W7, setting "1" (the lowest value) has drawbacks that were NOT present in XP at 0 OR 1.

If you guys want to test that in 7 on your Nvidia cards, go ahead.
Remember, vsync must be enabled, otherwise you will hardly notice anything. But people without 120 Hz screens who are capped at 60 fps are DEFINITELY getting the short end of the stick here in Windows 7.
Flip queue size 0 was removed from the drivers years ago, even when setting it with ATT. Simply put, the drivers have a minimum of 1. That's just what Nvidia is trying to do now.
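The arithmetic behind that queue depth is worth seeing. Here is a back-of-the-envelope sketch, an illustration rather than a measurement: it assumes a GPU-bound game where the CPU keeps the render-ahead queue full, so each queued frame adds roughly one frame-time of lag.

```python
# Rough worst-case input lag added by the render-ahead queue
# ("max pre-rendered frames" / FlipQueueSize), assuming the queue
# stays full: each queued frame delays display by ~one frame-time.

def queue_latency_ms(queued_frames: int, fps: float) -> float:
    """Extra latency (ms) contributed by a full render-ahead queue."""
    return queued_frames * 1000.0 / fps

if __name__ == "__main__":
    for depth in (0, 1, 3, 15):
        print(f"queue={depth:2d} at 60 fps -> ~{queue_latency_ms(depth, 60):.0f} ms extra")
```

At 60 fps this puts the default of 3 around 50 ms and a forced 15 around 250 ms, roughly the "molasses" and "half a second" feel described earlier in the thread.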

Also, I, and a few mates of mine who were at the very top of the Q3 scene, would say that there are LCDs that are just as fast as CRTs. Mine is 2 ms, iirc. The reason you might notice a difference is the same reason it's easier to detect the difference between 60 and, for example, 120 Hz on a CRT than on an LCD. The frames are rendered differently.
   
  (#102)
Prophet
Master Guru
 
 
Videocard: Msi 680 Gtx Twin Frozr
Processor: Intel Sb@4.7
Mainboard: Asus P8Z68V Progen3
Memory: 12 Gb Kingston
Soundcard: Asus Essence STX|Akg k701
PSU: Corsair 1200w
Default 08-04-2012, 21:28 | posts: 776 | Location: Heaven

Quote:
Originally Posted by rewt View Post
Kudos

All they need do is compare r296 and r300 drivers with a pre-render of 1, then compare r296 again with a pre-render of 0. In my trials there is a notable performance difference between 0 and 1, which proves the settings are not redundant.

Besides, if they were really concerned about removing redundant and inapplicable settings, why haven't they removed those anisotropic filtering optimizations and corresponding profile bugs long ago?
Funny, I guess you are superhuman too, then, since you earlier claimed only superhumans could sense a difference. You are just stupid, however much technical knowledge you claim to possess.
   
  (#103)
gx-x
Maha Guru
 
 
Videocard: MSI 780Ti Gaming
Processor: intel i5 3570K
Mainboard: ASRock Extreme4 Gen3
Memory: patriot 4x2GB @1333 ddr3
Soundcard: Yamaha RX-V550 w/!JBL
PSU: tT SmartSE 530W
Default 08-04-2012, 23:04 | posts: 946 | Location: Serbia

Quote:
Originally Posted by rewt View Post
All optimizations besides trilinear have no effect on current hardware. Trilinear optimization is also a redundant setting since HQ & Q modes toggle it automatically.
That's because you don't know where to look for them. Try the game profiles in the drivers, not in the control panel. Do you really think that when some game gets, say, a 5% performance increase with a new driver, it is a miracle? Or code rewritten from the bottom up? Using some power in the GPU that the engineers didn't know was there? Hehe... no. It's optimizations embedded into a profile for the game, and you cannot see them as an end user, and nV/AMD hope you won't notice anything strange. Trilinear has nothing to do with this unless they ban anisotropic filtering in a profile and force trilinear instead, but we would notice that.
   
  (#104)
gx-x
Maha Guru
 
 
Videocard: MSI 780Ti Gaming
Processor: intel i5 3570K
Mainboard: ASRock Extreme4 Gen3
Memory: patriot 4x2GB @1333 ddr3
Soundcard: Yamaha RX-V550 w/!JBL
PSU: tT SmartSE 530W
Default 08-04-2012, 23:08 | posts: 946 | Location: Serbia

Quote:
Originally Posted by Prophet View Post
The reason you might notice a difference is the same reason it's easier to detect the difference between 60 and, for example, 120 Hz on a CRT than on an LCD. The frames are rendered differently.
You mean displayed differently, right? I hope you do...

/offtopic
   
  (#105)
rewt
Maha Guru
 
Videocard: √
Processor: √
Mainboard: √
Memory: √
Soundcard: √
PSU: √
Default 08-05-2012, 01:51 | posts: 1,245 | Location: Americas

Quote:
Originally Posted by gx-x View Post
Try the game profiles in the drivers, not in the control panel
No. The optimizations I was referring to are clearly accessible in the control panel.

Quote:
Originally Posted by gx-x View Post
Trilinear has nothing to do with this unless they ban anisotropic filtering in a profile and force trilinear instead, but we would notice that
Trilinear filtering and anisotropic filtering are not mutually exclusive.

Besides, my original question was rhetorical (didn't require an answer) and illustrated my point. There is no need to derail the topic any further.

Last edited by rewt; 08-05-2012 at 03:05.
   
  (#106)
pekka
Master Guru
 
Videocard: MSI GTX580twin frozr PE
Processor: Core i5 3570K
Mainboard: ASUS MAximus Gene V
Memory: G skill. Sniper 6gb
Soundcard:
PSU: Corsair 750W TX V2
Default 08-05-2012, 10:42 | posts: 317 | Location: Sweden

I see. I really never knew what this was; sometimes I'd set it to 5 and never knew how it would impact my gaming. So the higher I choose, the more it might impact my FPS, and the lower the setting, it also impacts the FPS in some way, right?
   
  (#107)
gx-x
Maha Guru
 
 
Videocard: MSI 780Ti Gaming
Processor: intel i5 3570K
Mainboard: ASRock Extreme4 Gen3
Memory: patriot 4x2GB @1333 ddr3
Soundcard: Yamaha RX-V550 w/!JBL
PSU: tT SmartSE 530W
Default 08-05-2012, 14:32 | posts: 946 | Location: Serbia

Quote:
Originally Posted by rewt View Post
No. The optimizations I was referring to are clearly accessible in the control panel.



Trilinear filtering and anisotropic filtering are not mutually exclusive.

Besides, my original question was rhetorical (didn't require an answer) and illustrated my point. There is no need to derail the topic any further.
I was giving you a clue as to why some things don't work in some games you tried. If you rename the game exe to something nV doesn't have a profile for (and it works after that), there is a chance it will work. Not prerender 0, though; that has been forced to the default value.

And trilinear and aniso are mutually exclusive, since they are two different approaches to texture filtering. Anisotropic filtering over trilinear would render trilinear redundant, and vice versa. There are tech articles about it; google it. Besides, it's obvious when you know how they work. If you say no again, I hope it will be with something to back that up
   
  (#108)
rewt
Maha Guru
 
Videocard: √
Processor: √
Mainboard: √
Memory: √
Soundcard: √
PSU: √
Default 08-05-2012, 19:39 | posts: 1,245 | Location: Americas

Quote:
Originally Posted by pekka View Post
I see. I really never knew what this was; sometimes I'd set it to 5 and never knew how it would impact my gaming. So the higher I choose, the more it might impact my FPS, and the lower the setting, it also impacts the FPS in some way, right?
I have done my best to explain it as clearly and with as few words as possible. Higher values have the potential to increase performance (fps) at the expense of input lag. Let me know if there is still something about this that you don't understand.

@gx-x
I assume your post was directed at me (I couldn't read it; you made my ignore list yesterday). PM me when Mr. Hagedoorn invites you to be a member of the Guru3D team of software engineers (as he did for me a decade ago) and I'll reconsider.

Last edited by rewt; 08-05-2012 at 20:13.
   
  (#109)
connta
Banned
 
Videocard: MSI GTX660Ti TwinFrozr IV
Processor: i5 2500K / Megahalems
Mainboard: Sabertooth P67
Memory: Mushkin RL 8GB 1866Mhz
Soundcard: X-Fi Titanium/Z-5500
PSU: Strider Plus 850W
Default 08-06-2012, 08:43 | posts: 88 | Location: serbia

There are input lag testers out there. All of you who claim to see a 5-15 ms difference in lag: I want you to score under 50 on those, and then we can debate whether you can trace 5-15 ms of lag or not...

The average human response is 250 ms. Not the worst, the average! As a seasoned gamer I can do ~180 ms; that's within 1/5 of a second. Now all of you come here and tell me that you can notice +-5 ms in your input lag? I call bollocks, really. On testers, my perception of a 180 ms try and a 220 ms one is the same, really. After I get the result I do get the feeling it was faster or slower than average, but that's just how our brains work; if I try to guess how I did, I almost never get it right. Neither would anyone.

That's like saying you can notice the difference between 200 and 195 beans in a jar, or that you can tell me whether the temperature in the room is 20 or 19.9 C, or whether an object weighs 2 g or 1.9 g... and you can't; no one can. We can test it and play the guessing game, and you will be right in a percentage of tries, but that doesn't mean you actually knew (noticed) the actual value.

Also, all the arguments from the supermen in this thread are based on "notice" and "feel", which means they are heavily subjective. Just as pro race car drivers can't feel whether their lap was within +-2.5% without a stopwatch, neither can you without a proper grading system, one with numbers in it, not "feels".

So until one of us writes a 3D response-time measuring program in which we can test things, all this is moot. Both your opinion and mine.
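This argument can be put in numbers. A small sketch, using the standard-error argument: averaging n trials shrinks random motor noise by sqrt(n), so a small latency offset only becomes statistically detectable over many trials. The 20 ms per-trial spread below is an assumed figure for human reaction-time variability, not a measured one.

```python
import math

# How many reaction-time trials would it take to *statistically* detect a
# small latency offset buried in human motor noise? The mean of n trials
# has standard deviation sigma/sqrt(n), so to resolve an offset d at ~95%
# confidence you need z * sigma / sqrt(n) <= d.

def trials_needed(sigma_ms: float, offset_ms: float, z: float = 1.96) -> int:
    """Trials required so the standard error shrinks below the offset."""
    return math.ceil((z * sigma_ms / offset_ms) ** 2)

if __name__ == "__main__":
    print(trials_needed(20, 5))   # dozens of trials to resolve 5 ms
    print(trials_needed(20, 50))  # a 50 ms offset stands out almost at once
```

This supports the point above: one or two tries can never reveal a 5 ms difference, even though a careful averaged test could.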
   
  (#110)
BrightCandle
Member Guru
 
Videocard: MIS / 680 / 2GB SLI
Processor: Intel 3930k
Mainboard: Asus X79 Pro
Memory: 16 GB (4x4) Corsair 1600
Soundcard: Asus Xonar D2X
PSU: Corsair 1050HX
Default 08-06-2012, 22:22 | posts: 88 | Location: UK

It was only at the weekend that John Carmack was explaining the problems with Sony's VR glasses which introduced a whopping 3 screens of extra latency. He explained that when it came to VR that sort of increase was totally unacceptable and could make you suffer motion sickness. Indeed he had issues with latency and has spent large amounts of time programming to reduce it in his games as much as possible, because he finds a frame noticeable.

Turns out that while we do have surprisingly slow reaction times, our ability to perceive changing images is much faster than our reaction time. People can notice a single flashed frame down into the few-millisecond range; even 120 Hz isn't really sufficient. The eye can be tricked into perceiving motion reasonably well at 24 fps, but most gamers would tell you that isn't sufficient. Many are much happier at 60 Hz, some want more like 120 Hz, but even that, we know, has perceptible limits. What the research actually suggests is that it's much more complicated than reaction times or basic FPS, and fooling us in all circumstances is going to take a lot more speed than we have.

Reducing the overall latency is Id's goal, they would very much like it to be as near to zero as possible. Some gamers really notice input lag, some are used to it, but we are all suffering its effects whether we realise it or not. The brain is simply compensating quite happily.
   
  (#111)
Tastic
Newbie
 
Videocard: Sapphire HD 7850
Processor: Intel i5-2500k
Mainboard: Gigabyte Z68MA-D2H-B3
Memory: DDR3 1333 16gb
Soundcard:
PSU: Antec (HCG-750) PSU
Default 08-07-2012, 11:31 | posts: 14

Quote:
Originally Posted by Prophet View Post
Flip queue size 0 was removed from the drivers years ago, even when setting it with ATT. Simply put, the drivers have a minimum of 1. That's just what Nvidia is trying to do now.

Also, I, and a few mates of mine who were at the very top of the Q3 scene, would say that there are LCDs that are just as fast as CRTs. Mine is 2 ms, iirc. The reason you might notice a difference is the same reason it's easier to detect the difference between 60 and, for example, 120 Hz on a CRT than on an LCD. The frames are rendered differently.
I feel I have to say that no current LCD can match the responsiveness of a CRT monitor. CRTs have 0 ms input lag, as there is no conversion before the signal reaches the screen (I'm sure you're already aware of this). LCDs just can't do it given the current digital technology being employed. I use a very high quality ASUS 120 Hz LCD monitor with 2 ms (GTG), and I have compared it to my original CRT back from my Q3 days. There's just no comparison :\.

I played Q3 competitively at the highest level back in 2000 (c3), and I wish I didn't notice the differences between playing on an LCD and a CRT. The input lag is noticeable to the point where it almost ruins the game for me. Input lag is a huge deal if there's even the slightest delay, all of which I notice. It's more frustrating than anything else, even after picking up the game again (Quake Live) approximately 10 years later.

From this thread I started, I did more research and found a specific Nvidia driver set (295.75, I believe), turned off GPU scaling, and changed pre-rendered frames to 1. I played with this setting for about a week and tweaked just about everything you can possibly tweak in Windows 7 to lower DPC latency. The delay was still 'too' noticeable for what I found acceptable. I decided to switch it to 0 to see if I would notice a difference, and sure enough, I noticed a huge improvement in reduced input lag. Hit detection was far more consistent, and overall movement was more responsive. The reason I noticed it with absolute certainty this time was that I had played with pre-rendered frames at 1 for about 5-7 days, so I had time to adapt to the feel. Changing to 0 at this point had a desirable impact. The majority of the Q3 community will also agree that 0 is the desired setting for the issue discussed here.

Reading the Nvidia engineer's comment regarding the difference between 0 and 1 only added more confusion to the issue. I could be looking too much into it, sure, but even he said 0 'SHOULD' be the same as 1. If it were so definitive, I am unsure why he says it 'SHOULD' be the same, implying that it may not be. The lack of definitive responses is a bit concerning.

In regard to input lag with LCDs, I found that your BIOS settings arguably add the most input lag with LCDs (digital connections that require conversion). In your BIOS: HPET (High Precision Event Timer), CPU Enhanced Halt (C1E), EIST (SpeedStep), and Cool'n'Quiet should all be disabled. C3/C6 states should also be disabled. Feel free to google for more information, but it can be confirmed using DPC Latency Checker. These changes had the most significant impact on lowering input lag for me. A bit off subject, but I hope it helps those interested in the matter.

-T
   
  (#112)
kalston
Newbie
 
Videocard: Gigabyte GTX 670
Processor: Intel i5 2500k
Mainboard:
Memory:
Soundcard:
PSU: Seasonic X Series 560w
Default 08-07-2012, 12:51 | posts: 9

Quote:
Originally Posted by Tastic View Post
I feel I have to say that no current LCD can match the responsiveness of a CRT monitor. CRTs have 0 ms input lag, as there is no conversion before the signal reaches the screen (I'm sure you're already aware of this). LCDs just can't do it given the current digital technology being employed. I use a very high quality ASUS 120 Hz LCD monitor with 2 ms (GTG), and I have compared it to my original CRT back from my Q3 days. There's just no comparison :\.

I played Q3 competitively at the highest level back in 2000 (c3), and I wish I didn't notice the differences between playing on an LCD and a CRT. The input lag is noticeable to the point where it almost ruins the game for me. Input lag is a huge deal if there's even the slightest delay, all of which I notice. It's more frustrating than anything else, even after picking up the game again (Quake Live) approximately 10 years later.
Actually, few LCDs have input lag (or it's like 1-5 ms, which you'll never notice) when playing at the native resolution. However, because LCDs are sample-and-hold displays, they will always feel laggier than CRTs. On a CRT each new frame is drawn on top of a black screen, while on an LCD each frame is drawn on top of the OLDER frame, resulting in motion blur that is easily felt. Higher refresh rates reduce the effect, but it's still there; you might need 240 Hz or so before you can start to forget about it. In any case, even at 120 Hz it's still pretty easy to tell the difference from a 120 Hz CRT.

Edit: of course there's also the fact that LCDs have pixel response lag, which altogether with the sample & hold effect creates a very different experience compared to CRTs.
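The sample-and-hold effect described above can be sketched numerically. A minimal illustration (the panning speed is an arbitrary example figure, not from any test): while the eye tracks motion, a full-persistence LCD holds each frame static for the whole refresh interval, smearing the image across the retina by speed times hold time.

```python
# Perceived motion blur on a sample-and-hold display: while the eye tracks
# a moving object, each frame is held for the full refresh interval, so
# the image smears by (tracking speed x hold time). A CRT flashes each
# frame briefly, so its smear is far smaller.

def hold_blur_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Blur width in pixels for a full-persistence (sample-and-hold) panel."""
    return speed_px_per_s / refresh_hz

if __name__ == "__main__":
    speed = 960  # e.g. panning across a 1920-px screen in two seconds
    for hz in (60, 120, 240):
        print(f"{hz:3d} Hz -> ~{hold_blur_px(speed, hz):.1f} px of smear")
```

Doubling the refresh rate halves the smear but never removes it, which matches the point that even a 120 Hz LCD retains visible blur next to a short-persistence CRT.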

Quote:
Originally Posted by Tastic View Post
From this thread I started, I did more research and found a specific Nvidia driver set (295.75, I believe), turned off GPU scaling, and changed pre-rendered frames to 1. I played with this setting for about a week and tweaked just about everything you can possibly tweak in Windows 7 to lower DPC latency. The delay was still 'too' noticeable for what I found acceptable. I decided to switch it to 0 to see if I would notice a difference, and sure enough, I noticed a huge improvement in reduced input lag. Hit detection was far more consistent, and overall movement was more responsive. The reason I noticed it with absolute certainty this time was that I had played with pre-rendered frames at 1 for about 5-7 days, so I had time to adapt to the feel. Changing to 0 at this point had a desirable impact. The majority of the Q3 community will also agree that 0 is the desired setting for the issue discussed here.

Reading the Nvidia engineer's comment regarding the difference between 0 and 1 only added more confusion to the issue. I could be looking too much into it, sure, but even he said 0 'SHOULD' be the same as 1. If it were so definitive, I am unsure why he says it 'SHOULD' be the same, implying that it may not be. The lack of definitive responses is a bit concerning.
And how did prerender 0 (a bad setting, as someone explained previously in this thread) affect Q3, an OpenGL game, when the setting was Direct3D-exclusive until one or two months ago?
And no, the "majority" of Q3 players does not agree. In fact, the very best players don't touch such settings and concentrate on getting a good monitor and a good mouse, which makes a lot more sense.

Quote:
Originally Posted by Tastic View Post
In regards to input lag w/ LCD's, I conclusively found that your BIOS settings add arguably the most input lag to LCD's (Digital connections that require conversion). In your BIOS, HPET (High Precision Event Timer), & Disable CPU Enhanced Halt (C1E), EIST (SpeedStep) & Cool'n'Quiet. These should all be disabled. c3/c6 should also be disabled. Feel free to google for more information, but it can be confirmed by using DPC Latency Checker. These changes had the most significant impact on lowering input lag for me. A bit off subject but I hope it helps those of whom interested in the matter.

-T
DPC latency is in the range of microseconds unless your computer is horribly ****ed up, so sorry, but it has no place here. Maybe you can tell a difference between HPET on/off on some systems, but NOT because of a few microseconds' difference; it would be because Windows is having issues with conflicting timers or something (not that I've experienced that myself, but I do believe it can happen). Remember: MICROseconds, not milliseconds. That's a lot smaller.
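The microseconds-versus-milliseconds gap is easy to make concrete. A quick sketch (the 500 us spike is a hypothetical example of an already-bad DPC reading, not a measurement from any system):

```python
# Putting DPC latency in perspective: even an ugly DPC spike of a few
# hundred microseconds is a tiny fraction of one 60 Hz frame, which is
# why microsecond-scale timer jitter is not a plausible source of
# *felt* input lag (milliseconds are a thousand times larger).

def frame_time_ms(refresh_hz: float) -> float:
    """Duration of one refresh interval in milliseconds."""
    return 1000.0 / refresh_hz

def dpc_share_of_frame(dpc_us: float, refresh_hz: float) -> float:
    """Fraction of one refresh interval consumed by a DPC spike."""
    return (dpc_us / 1000.0) / frame_time_ms(refresh_hz)

if __name__ == "__main__":
    share = dpc_share_of_frame(500, 60)  # a fairly ugly 500 us spike
    print(f"500 us DPC spike = {share:.1%} of a 60 Hz frame")
```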

Last edited by kalston; 08-07-2012 at 12:55.
   
  (#113)
tweakpower
Banned
 
Videocard: MSI HD 6770
Processor: FX-4100 4.0Ghz
Mainboard: MSI 970A-G46
Memory: HuperX 2x4GB PC12800
Soundcard: Realtek Onboard
PSU: LC-Power 600W
Default 08-07-2012, 13:18 | posts: 932 | Location: Serbia

Quote:
Originally Posted by kalston View Post
And how did prerender 0 (a bad setting, as someone explained previously in this thread) affect Q3, an OpenGL game, when the setting was Direct3D-exclusive until one or two months ago?
And no, the "majority" of Q3 players does not agree. In fact, the very best players don't touch such settings and concentrate on getting a good monitor and a good mouse, which makes a lot more sense.
Well, there is no bad setting, whatever anyone wants to suggest. It all depends on what your aim is.
Quote:
Originally Posted by kalston View Post
DPC latency is in the range of microseconds unless your computer is horribly ****ed up, so sorry, but it has no place here. Maybe you can tell a difference between HPET on/off on some systems, but NOT because of a few microseconds' difference; it would be because Windows is having issues with conflicting timers or something (not that I've experienced that myself, but I do believe it can happen). Remember: MICROseconds, not milliseconds. That's a lot smaller.
I had that problem with an old mobo, so you are right, it can happen. I think the guy who pointed at this problem was not aware of it (not aware that it has nothing to do with LCDs). And in fact, it is not related to the display delay problem.
   
  (#114)
Pill Monster
Banned
 
Videocard: 7950 Vapor-X 1100/1500
Processor: AMD FX-8320 @4.7
Mainboard: ASUS Sabertooth 990FX R2
Memory: 8GB HyperX Beast 2400
Soundcard: X-Fi Fatal1ty, Wharfedale
PSU: AcBel M8 750 Bronze
Default 08-07-2012, 13:51 | posts: 25,234 | Location: NZ

Quote:
Originally Posted by gx-x View Post
And trilinear and aniso are mutually exclusive, since they are two different approaches to texture filtering. Anisotropic filtering over trilinear would render trilinear redundant, and vice versa. There are tech articles about it; google it. Besides, it's obvious when you know how they work. If you say no again, I hope it will be with something to back that up
Wrong.
Anisotropic is used together with trilinear or bilinear filtering.

Now try and disprove me.


Quote:
Originally Posted by Prophet View Post
Also, I, and a few mates of mine who were at the very top of the Q3 scene, would say that there are LCDs that are just as fast as CRTs. Mine is 2 ms, iirc. The reason you might notice a difference is the same reason it's easier to detect the difference between 60 and, for example, 120 Hz on a CRT than on an LCD. The frames are rendered differently.
LCDs aren't as fast as a CRT monitor, and it has nothing to do with refresh rate.
LCD screens technically don't have a refresh rate, since they don't have an electron gun; instead they have shutters...

Last edited by Pill Monster; 08-07-2012 at 14:05.
   
  (#115)
tweakpower
Banned
 
Videocard: MSI HD 6770
Processor: FX-4100 4.0Ghz
Mainboard: MSI 970A-G46
Memory: HuperX 2x4GB PC12800
Soundcard: Realtek Onboard
PSU: LC-Power 600W
Default 08-07-2012, 15:13 | posts: 932 | Location: Serbia

I think I finally found out where the confusion comes from (for all of us) concerning LCDs and CRTs. I read a few things and researched, and it seems that response time (2 ms, 5 ms, 8 ms, etc.) has nothing to do with "input lag".

That is just the time it takes the pixels to change color, and their general refresh, but the input lag (that one can notice) is more connected to the way LCDs work, to the matrix maybe, and all the processing an LCD has to do in order to give you a picture on screen. So one is better than the other, and it is not necessarily related to response time or refresh rate, it seems.

It is a simple fact that analog is always faster than digital, but digital should be more precise (and in most cases is). CRTs are analog displays, while most LCDs are digital (at this time), and that is where all the confusion comes from. Take for example optical connections (for internet) and network "choke" and why it happens. At the stops along the way, the signal is converted from analog to digital and back to analog again; you get that lag, and sometimes you are able to see it clearly (watching live sports from a distant country, for example), and if it is not calculated in advance as it should be, or if something goes wrong when it is calculated, you get squares, a broken picture and/or sound for a moment, etc.
   
  (#116)
connta
Banned
 
Videocard: MSI GTX660Ti TwinFrozr IV
Processor: i5 2500K / Megahalems
Mainboard: Sabertooth P67
Memory: Mushkin RL 8GB 1866Mhz
Soundcard: X-Fi Titanium/Z-5500
PSU: Strider Plus 850W
Default 08-08-2012, 09:10 | posts: 88 | Location: serbia

Quote:
I think I finally found out where the confusion comes from (for all of us) concerning LCDs and CRTs. I read a few things and researched, and it seems that response time (2 ms, 5 ms, 8 ms, etc.) has nothing to do with "input lag".
All of us? That's rich. You could find that out just by visiting Wikipedia.

You are right about one thing: the "ms" number does not have much to do with input lag. The "ms" number that the manufacturer provides for LCD monitors is neither relevant (as long as it is under 8 ms true ISO) nor accurate on most monitors. True ISO transitions are black-white-black, and that is the real transition time for any LCD display. On any monitor today you will instead find a grey-to-grey (GTG) transition number, which is again misleading and not accurate. Manufacturers do the same thing with contrast ratio, exploiting the math to make the result come out at 1,000,000:1 or even more; some actually go so far as to say their panels have "infinite" contrast, which may be true when put into a faulty formula (they divide by 0 and conclude that equals infinity) that manufacturers have no problem using. This happens because the average customer will only compare manufacturer specs when making a purchase decision, in which infinite > 1,000,000 and thus "better".

Marketing, marketing, marketing.
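The divide-by-zero trick is literal arithmetic. A toy illustration (the luminance figures are made up for the example, not taken from any datasheet):

```python
# Contrast ratio is white luminance divided by black luminance.
# "Dynamic contrast" marketing measures black with the backlight
# effectively off, driving the denominator toward zero and the
# quoted ratio toward infinity.

def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """Contrast ratio; returns infinity when black is measured as zero."""
    if black_nits == 0:
        return float("inf")  # the marketing department's favorite value
    return white_nits / black_nits

if __name__ == "__main__":
    print(contrast_ratio(200, 0.2))     # honest static measurement, ~1000:1
    print(contrast_ratio(200, 0.0002))  # backlight dimmed, ~1,000,000:1
    print(contrast_ratio(200, 0))       # divide by zero: "infinite contrast"
```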

Last edited by connta; 08-08-2012 at 09:15.
   
  (#117)
tweakpower
Banned
 
Videocard: MSI HD 6770
Processor: FX-4100 4.0Ghz
Mainboard: MSI 970A-G46
Memory: HuperX 2x4GB PC12800
Soundcard: Realtek Onboard
PSU: LC-Power 600W
Default 08-08-2012, 12:56 | posts: 932 | Location: Serbia

Quote:
Originally Posted by connta View Post
All of us? That's rich. You could find that out just by visiting Wikipedia.

You are right about one thing: the "ms" number does not have much to do with input lag. The "ms" number that the manufacturer provides for LCD monitors is neither relevant (as long as it is under 8 ms true ISO) nor accurate on most monitors. True ISO transitions are black-white-black, and that is the real transition time for any LCD display. On any monitor today you will instead find a grey-to-grey (GTG) transition number, which is again misleading and not accurate. Manufacturers do the same thing with contrast ratio, exploiting the math to make the result come out at 1,000,000:1 or even more; some actually go so far as to say their panels have "infinite" contrast, which may be true when put into a faulty formula (they divide by 0 and conclude that equals infinity) that manufacturers have no problem using. This happens because the average customer will only compare manufacturer specs when making a purchase decision, in which infinite > 1,000,000 and thus "better".

Marketing, marketing, marketing.
Yeah, the contrast specified in the manual etc. has nothing to do with reality, and neither does the response time. I said all of us since most people don't bother to research it in detail (including me).
   
  (#118)
rewt
Maha Guru
 
Videocard: √
Processor: √
Mainboard: √
Memory: √
Soundcard: √
PSU: √
Default 08-09-2012, 03:35 | posts: 1,245 | Location: Americas

Quote:
Originally Posted by Pill Monster View Post
Wrong.
Anisotropic is used together with trilinear or bilinear filtering.

Now try and disprove me.
Right, thanks. Ironic how he uses the word "obvious" when he doesn't even understand it himself.
   
  (#119)
tsunami231
Ancient Guru
 
tsunami231's Avatar
 
Videocard: EVGA 660gtx sig2
Processor: i7 6700k NH-D14
Mainboard: Asrock z170 Extreme 4
Memory: Corsair LPX 2400mhz 16gb
Soundcard: Realtek HD Audio
PSU: Antec HCG 750m
Default 08-09-2012, 05:58 | posts: 7,464 | Location: USA

And I thought I was a stickler for input lag and all that. I have not felt input lag on any of my PC builds in over 5 years.
   
Reply With Quote
Old
  (#120)
rewt
Maha Guru
 
Videocard: √
Processor: √
Mainboard: √
Memory: √
Soundcard: √
PSU: √
Default 08-09-2012, 19:19 | posts: 1,245 | Location: Americas

Quote:
Originally Posted by tsunami231 View Post
I have not felt input lag on any of pc builds in over 5+ years.
I haven't "felt" my hair growing either, but it happens.

Let's try sticking to the "truth" about pre-rendering 0, not feelings or opinions.

Michael Phelps, the most decorated Olympian ever, won Olympic gold by one hundredth of a second and people were amazed. But that margin is still twice the latency some people in this thread were complaining about.
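To put numbers on that: one hundredth of a second is 10 ms, while the pre-render queue adds roughly one frame-time of latency per queued frame. A back-of-the-envelope sketch, assuming the simple frames × frame-time model (real drivers and displays add their own overhead on top):

```python
def queue_latency_ms(prerender_limit: int, fps: float) -> float:
    """Worst-case extra latency from the pre-render queue: each queued
    frame can wait about one frame-time before it is displayed."""
    frame_time_ms = 1000.0 / fps
    return prerender_limit * frame_time_ms

# At 60 fps the default queue of 3 can add ~50 ms; at 100 Hz, ~30 ms.
print(round(queue_latency_ms(3, 60), 1))   # 50.0
print(round(queue_latency_ms(3, 100), 1))  # 30.0
print(round(queue_latency_ms(0, 60), 1))   # 0.0
```

This lines up with the UT2004 observation earlier in the thread: raising the refresh rate shrinks each frame-time, so limit 3 at 100 Hz feels close to a much smaller queue at 60 Hz.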

Last edited by rewt; 08-09-2012 at 19:36.
   
Reply With Quote
Old
  (#121)
tsunami231
Ancient Guru
 
tsunami231's Avatar
 
Videocard: EVGA 660gtx sig2
Processor: i7 6700k NH-D14
Mainboard: Asrock z170 Extreme 4
Memory: Corsair LPX 2400mhz 16gb
Soundcard: Realtek HD Audio
PSU: Antec HCG 750m
Default 08-09-2012, 20:02 | posts: 7,464 | Location: USA

Oh, I have no doubt that it happens. It's just that I think some people go out of their way to find it. I tried 0 through 3 and never saw a difference in any game or program I used in the last 5 years.

This topic is almost like the pagefile topic, just not taken to that extreme yet.
   
Reply With Quote
Old
  (#122)
rewt
Maha Guru
 
Videocard: √
Processor: √
Mainboard: √
Memory: √
Soundcard: √
PSU: √
Default 08-09-2012, 20:31 | posts: 1,245 | Location: Americas

Quote:
Originally Posted by tsunami231
It just i think some people go out of there way to find it.
Agreed.

I only went out of my way during experiments to prove there is a major difference between 0 on r296 drivers and 0 on r300+ (maybe I will decide to provide some nice graphs after all, to silence the criticism from newbies). During gaming though, there's a balance between performance and input lag that I strive for.

Input lag and screen tearing do not exist for most gamers (at least, in their minds). That's what I mean by "ignorance is bliss". This must be why Nvidia doesn't care enough to make certain all control panel options function consistently.

Quote:
Originally Posted by tsunami231 View Post
This topic is almost like pagefile topic, just not to that extreme yet
Any time people (trolls?) begin to question facts and pretend to be knowledgeable, topics eventually end up in the gutter (with everyone left confused by bad information). And it doesn't exactly help when Nvidia themselves provide conflicting/incorrect information right alongside it (but as I don't shoot the messenger, I will not fault ManuelG for this).

Last edited by rewt; 08-09-2012 at 22:05.
   
Reply With Quote
Old
  (#123)
tweakpower
Banned
 
Videocard: MSI HD 6770
Processor: FX-4100 4.0Ghz
Mainboard: MSI 970A-G46
Memory: HyperX 2x4GB PC12800
Soundcard: Realtek Onboard
PSU: LC-Power 600W
Default 08-10-2012, 03:09 | posts: 932 | Location: Serbia

Quote:
Originally Posted by rewt View Post
Agreed.

I only went out of my way during experiments to prove there is a major difference between 0 on r296 drivers and 0 on r300+ (maybe I will decide to provide some nice graphs after all, to silence the criticism from newbies). During gaming though, there's a balance between performance and input lag that I strive for.

Input lag and screen tearing do not exist for most gamers (at least, in their minds). That's what I mean by "ignorance is bliss". This must be why Nvidia doesn't care enough to make certain all control panel options function consistently.
I wish to help you there, but since I don't know which GPU(s) you are using, here are some tips from my own testing. With the 266.58 WHQL driver there is a difference between all of 0, 1, 2, 3, etc. The biggest difference is between 0 and 1 (on my system, under both XP and 7), with a smaller but still important difference between 1 and 2/3. You can also test it with an application engine with and without GPU-to-CPU synchronization and see the impact of every setting in both cases.

First compare 266 and 296, and if there is no difference (or only small driver changes), then compare 296 to 300+ and see whether anything changed.

I don't have time at the moment to do this for you (I would happily do it if I did), and I would also have to swap GPUs etc. (too much work). Since there are people here with Nvidia cards in their PCs, I'm sure someone will help you (if you are unable to do it yourself for some reason). If you need any help testing the effect of RFA on the GPU, both when in sync with the CPU and when not, contact me and I will explain through PM.
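The effect being compared here can also be reasoned about with a toy model of the flip queue. This is only a sketch under the assumption that each frame carries the input state from the moment it was queued and that frames are presented FIFO once the queue is full; it is not a real driver measurement:

```python
from collections import deque

def frames_of_delay(prerender_limit: int, total_frames: int = 10) -> int:
    """Simulate a full flip queue: each frame is queued carrying the
    input sampled at queue time, then presented FIFO once the queue
    is full. Returns how many frames pass between sampling an input
    and seeing it on screen."""
    queue = deque()
    delay = 0
    for now in range(total_frames):
        queue.append(now)                  # frame rendered with input from time `now`
        if len(queue) > prerender_limit:   # queue full: oldest frame hits the screen
            presented = queue.popleft()
            delay = now - presented        # frames between input sample and display
    return delay

for limit in (0, 1, 2, 3):
    print(limit, "->", frames_of_delay(limit), "frame(s) of input delay")
```

In this model the steady-state input delay equals the queue depth, which is why the step from 0 to 1 is the most noticeable one: it is the difference between seeing your input on the very next present and seeing it one full frame later.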
   
Reply With Quote
Old
  (#124)
VenoMaizeR
Member Guru
 
VenoMaizeR's Avatar
 
Videocard: Asus Strix RX480OC 8gb
Processor: i7 6700k
Mainboard: Asus Maximus VIII Ranger
Memory: Gskill DDR4 2400Mhz
Soundcard: SupremeFX
PSU: CorsairHX 650w psu
Default 08-11-2012, 14:35 | posts: 123 | Location: Portugal/Lisbon

I run this setting at 3, but not directly in the Nvidia control panel; I set it in the BF3 command console and the game runs smoother, a very big difference. I haven't tried it in other games yet. I run the game on ultra except mesh quality on low and antialiasing off, and I get around 45-55 fps on 64-player maps, which I think is playable.
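For anyone wanting to reproduce this: the BF3 console variable in question is, as far as I know, RenderDevice.RenderAheadLimit, which can also be placed in a user.cfg in the game directory so it persists across sessions (the value 3 below is simply the one used in the post above):

```
RenderDevice.RenderAheadLimit 3
```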

Last edited by VenoMaizeR; 08-11-2012 at 14:39.
   
Reply With Quote
Old
  (#125)
rewt
Maha Guru
 
Videocard: √
Processor: √
Mainboard: √
Memory: √
Soundcard: √
PSU: √
Default 08-11-2012, 19:00 | posts: 1,245 | Location: Americas

Quote:
Originally Posted by tweakpower View Post
I'm sure someone will help you (if you are unable to do it for some reason ATM).
Thanks. I'm away on vacation at the moment, but I have experience with drivers going back to the Detonator series and earlier. Prior to the r300 drivers, a setting of 0 could hurt performance by 25% or more in games (which I mentioned earlier in the topic using Skyrim as an example). It just goes to show that 0 should never have been used on pre-r300 drivers unless minimizing latency was of utmost importance.

Last edited by rewt; 08-11-2012 at 19:12.
   
Reply With Quote
Reply

Tags
flip-queue, input lag, pre-rendered frames, prerendered
