158.45 is out! (vista 32/64)

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by W@w@Y, May 31, 2007.

  1. KENNYB

    KENNYB Master Guru

    Messages:
    883
    Likes Received:
    0
    GPU:
    eVGA 8800 GTX 648/1900
    I know only of the performance test that is included with the game. Make sure you turn off vsync.
     
  2. Barton C++

    Barton C++ Guest

    Messages:
    603
    Likes Received:
    1
    GPU:
    MSI Hawk GTX460 1GB @ SLI
Lost Planet DX10: Snow 57, Cave 42. Those were my numbers using 158.42 (or was it 158.43?)

Today I get Snow 70 with 158.45... could be some setting is lower than previously, but 4xAA, 16xAF, Medium HDR, Low Shadows, High Lighting, 1280x800 16:10 res... plays like any game now and looks sweet. (Also tested lowering to 8xAF, low HDR, medium Lighting: gave only 2 extra fps.)

Saw posts on another forum about the Company of Heroes DX9/10 patch. One user had 120fps avg. in DX9. With its new DX10 patch the avg. went down to 29fps. Dunno if he used the 158.45 driver tho.
     
  3. VultureX

    VultureX Banned

    Messages:
    2,577
    Likes Received:
    0
    GPU:
    MSI GTX970 SLI
These drivers are great for DX10, but that 'Driver stopped responding' bug came back. It's so annoying; I can't even play for 30 minutes before the system starts crashing...
     
  4. Rodzilla68

    Rodzilla68 New Member

    Messages:
    8
    Likes Received:
    0
    GPU:
    NVIDIA GeForce 8800 GTX
    My 2 Cents

    I installed the drivers but noticed very little difference between them and the 160.03 version.

    Here are my results:
COH (160.03): avg: 16.7 / max: 34 / min: 9.6
COH (158.45): avg: 21.7 / max: 34.9 / min: 5.8

    5 FPS is not worth instability in my book.

FEAR (160.03): min: 43 / avg: 58 / max: 61
FEAR (158.45): min: 43 / avg: 57 / max: 60

    Great, this one actually runs slower. :(

    Rise of Legends...no benchmark but since day one this game really stutters when I pan. Has been the same problem with all Vista drivers. :(

    I think I'll go back to the 160.03...
     

  5. MacWun77

    MacWun77 Banned

    Messages:
    296
    Likes Received:
    0
    GPU:
    XFX 8800GTX @ 621/972 Air Cooled

I dunno about these, but the 158.27s on XP make STALKER run incredibly well. Getting 90FPS on average, over 100 indoors, and as high as 120 in some places. Never falls below 60 :)

That's with everything cranked to the max too! Even FSAA and shadow quality, wooot

Don't know what you're doing to get awful performance like that. In FEAR I get a minimum of 75FPS and a max of 220. COH I'm not sure about, haven't run it for ages, but I always got more than what you're getting.
     
  6. ZeroOne

    ZeroOne Active Member

    Messages:
    70
    Likes Received:
    0
    GPU:
    ASUS ENGTX275 869MB DDR 3
Installed NVIDIA 160.03 for TV out.
IQ and performance are fine too...
No BSODs; it's the best driver out there for the 7xxx cards.


    0000 0001
     
  7. VultureX

    VultureX Banned

    Messages:
    2,577
    Likes Received:
    0
    GPU:
    MSI GTX970 SLI
He obviously has Vsync on, and I think he's running CoH in DX10 mode, which performs much worse than DX9.
     
  8. Rodzilla68

    Rodzilla68 New Member

    Messages:
    8
    Likes Received:
    0
    GPU:
    NVIDIA GeForce 8800 GTX
Yes, I run everything with Vsync because the tearing is just ugly IMHO. Does it cause that big of a drop? And yes, I've patched CoH to the latest DX10 patch. It was definitely running faster when it was using DX9.
     
  9. Ryan Williams

    Ryan Williams Master Guru

    Messages:
    331
    Likes Received:
    0
    GPU:
    ATI Radeon 4870 1GB
    Vsync in itself has no perceptible performance effect whatsoever. What causes the misconception that it does impact performance is something that's just as bad, or arguably even worse.

To sum it up extremely briefly: to sync up with the screen and remove the tearing, the game's FPS has to lock to one of a set of divisors of the refresh rate, and if the FPS drops below one of these divisors it has to fall down to the next one.

    What this means is that if you're running a 60Hz monitor and your FPS is 65, you're sitting sweet and will have your FPS locked at 60. Since 60 is considered butter smooth by most people, this is absolutely perfect. The problem is that if your FPS drops to, say, 59, you'll be forced to go down to the next divisor. Unfortunately, the next divisor after 60 is 30. The divisors do vary depending on your monitor's refresh rate, but ultimately there'll always be a big drop if you go below the refresh rate's FPS equivalent.

    So essentially, if you ever drop below 60 FPS while vsync is enabled, you'll be thrown right down to 30 FPS until it goes over 60 again. This used to be a massive dilemma for people as they'd basically have to choose between an artificial FPS throttle or tearing.
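The divisor behaviour described above can be sketched in a few lines of Python (a simplified model: it assumes double buffering and that every frame takes the same time to render):

```python
import math

def vsync_fps(refresh_hz, render_fps):
    """Effective displayed FPS with double-buffered vsync.

    Each frame stays on screen for a whole number of refresh
    intervals, so the displayed rate snaps to refresh/1, refresh/2,
    refresh/3, ... -- the 'divisors' described above.
    """
    # How many refresh intervals does one frame's render time span?
    intervals = math.ceil(refresh_hz / render_fps)
    return refresh_hz / intervals

# On a 60Hz monitor:
print(vsync_fps(60, 65))   # rendering faster than refresh -> locked at 60.0
print(vsync_fps(60, 59))   # just under 60 -> snaps down to 30.0
print(vsync_fps(60, 29))   # under 30 -> next divisor down, 20.0
```

On an 85Hz monitor the same model gives 85, 42.5, 28.3 and so on; the divisors depend on the refresh rate, but the big step below the refresh rate is always there in this model.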

    Fortunately it's relatively simple to avoid the problem. All you have to do is enable triple buffering, which without getting into the technical stuff basically means it isn't locked to the aforementioned FPS divisors yet still prevents tearing by always buffering an additional frame.

Until relatively recently you couldn't enable triple buffering reliably in Direct3D games, as the 'triple buffering' option in the Nvidia control panel actually only affects OpenGL games. To do it, you need to download RivaTuner, which will also put D3DOverrider into your start menu. Set it to load on Windows startup and it'll force triple buffering in all your Direct3D games. Make sure 'force vsync' is enabled in your Nvidia control panel and you're good to go.

    If you notice any mouse lag in your games you may want to open up Rivatuner and set "prerender limit" to 2 instead of 3. I'm not sure what the exact logic is here, but I can confirm that doing this retains the FPS flexibility of triple buffering as well as gets rid of the mouse lag usually associated with triple buffering. Use the Rivatuner documentation or Google if you can't find this setting. Don't go below 2 though as you'll almost certainly get graphical weirdness in various games, or simply crash. :)

    Note: I guess I should mention the slight caveat that having triple buffering does use a bit more of your graphics card memory; however, with the memory they come with these days it shouldn't be a significant issue.
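As a back-of-the-envelope sketch of that extra memory (assuming a 32-bit colour back buffer; triple buffering adds one buffer over double buffering):

```python
def buffer_mb(width, height, bytes_per_pixel=4):
    """Size of one colour buffer in MB, assuming 32-bit colour."""
    return width * height * bytes_per_pixel / 2**20

# Triple buffering keeps one extra back buffer vs. double buffering:
print(buffer_mb(1280, 1024))   # 5.0 MB extra at 1280x1024
print(buffer_mb(1600, 1200))   # ~7.3 MB extra at 1600x1200
```

A few MB out of the hundreds on a current card, so the caveat really is slight.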
     
    Last edited: Jun 18, 2007
  10. KillerC

    KillerC Master Guru

    Messages:
    418
    Likes Received:
    0
    GPU:
    EVGA 2080 RTX
    WOW awesome post!!

I would like to ask some questions for those that know, though:

1) I installed RivaTuner and set it to run at startup, and it works very well so far, BUT I didn't see any setting for triple buffering. Is the setting "Synchronization with Vertical retrace" the one I am looking for? And if so, it is already set to always on, so I am good to go then, right?

2) Do I have to uninstall/reinstall RivaTuner EVERY time I change my video driver?

Thanks, sorry about the newb questions. :smoke:
     

  11. Benke99

    Benke99 Member

    Messages:
    37
    Likes Received:
    0
    GPU:
    Sapphire HD 4870 1GB
1) Go to Start > Programs > RivaTuner and start D3D Overrider. This will enable triple buffering in D3D games. Triple buffering in OpenGL can be enabled through your driver control panel (or RivaTuner).

2) No, you don't have to reinstall RivaTuner after changing drivers. It will ask you whether you accept the driver change on the first restart.
     
  12. KillerC

    KillerC Master Guru

    Messages:
    418
    Likes Received:
    0
    GPU:
    EVGA 2080 RTX
    Thank You!! :)
     
  13. MacWun77

    MacWun77 Banned

    Messages:
    296
    Likes Received:
    0
    GPU:
    XFX 8800GTX @ 621/972 Air Cooled

    Sorry but that is the biggest load of poop I have read for a while.

    Ok not all of it is poop but the part about vsync is.

I have NEVER in all my years of PC gaming (more than 11 years, seriously) noticed what you described when enabling vsync.

    however, I did not want to rush into claiming you were wrong without testing, so I went to make sure I was right.....guess what?

    I was right!

Enabling vsync, example: I run 1280x1024 @ 85Hz. I tested with Oblivion, with triple buffering OFF and vsync ON. The most I get is 85FPS, no big surprise there, right? Now in your theory you claim that if it drops below that threshold I will get the next divisor, which would be what? Half that? I don't know, since your theory was only just invented, I don't know your rules LOL... nope, I moved around and FPS dropped 84, 82, 80, 75, 77, 65, 64, and so on... you get the idea...

Basically, depending on where I was and where I looked, I was able to get a smooth progression in FPS from 85 down to 40, or 40 up to 50; it could stay at 50 for a while, go up to 62, stay there for a while... etc. etc.

So basically, I am not flaming you, but what you wrote is some concocted story you made up? OR perhaps it was something you were told or read?

Whatever it was, I am not sorry to tell you that it is completely FALSE. I have never heard anyone say what you have said about vsync, and believe me, I have had many an in-depth discussion about games and PC technicalities with other hardcore gamers... many much more techno-savvy than myself... and I'm no n00b either, my friend :p

Vsync's only drawback is that you will never get higher framerates than your refresh rate, but you will experience no visual tearing. Some games run fine with Vsync off; Oblivion is one that doesn't, since the tearing is very obvious. But to say that performance is the same with Vsync on is a myth, again something you have said which could lead people into believing false ideas.

    Anyone who agrees or disagrees, please, do some research, test for yourself, you will see what I am talking about.

The part about "frames to render ahead" is somewhat true; however, triple buffering is different in that it relies on the video card, whereas the frames-ahead option (which is normally 3 by default) applies to how many frames the CPU will prepare in advance, NOT the video card.

Hope this clears some things up for those of you scratching your heads at this guy's post... sheesh :)
     
    Last edited: Jun 19, 2007
  14. KENNYB

    KENNYB Master Guru

    Messages:
    883
    Likes Received:
    0
    GPU:
    eVGA 8800 GTX 648/1900
Ryan's post wasn't completely off the mark. The last time I checked this was running Doom 3 when it was released, and nVidia did not have the option for triple buffering in OGL (for the GF6). Using vSync, my FPS were either locked at 60 or fluctuating toward 30 and vice versa. When nVidia finally improved their drivers and started to support TB for OGL, my FPS would drop to the low 40s (not exactly 45) and fluctuate between that and 60 FPS.

Firing up Oblivion, I don't see cut-and-dried results, but the behavior is very similar.
     
  15. MacWun77

    MacWun77 Banned

    Messages:
    296
    Likes Received:
    0
    GPU:
    XFX 8800GTX @ 621/972 Air Cooled
So how do you explain that I have never seen this behaviour, nor have I ever heard of it before? Very strange...
     

  16. KENNYB

    KENNYB Master Guru

    Messages:
    883
    Likes Received:
    0
    GPU:
    eVGA 8800 GTX 648/1900
I don't know man... lol. I learned about this way back when I first bought a GF3 Ti200. Right now I don't run with vSync anymore, thanks to the monstrous FPS advantage an 8800 gives me, so I don't "see" it in action anymore. Maybe when Crysis is released I'll have to use vSync again.

Doom 3 is my most obvious example. The tearing in that game was unbearable. Fraps is all well and good, but I wouldn't rely on it being 100% accurate when looking at the FPS counter. Sometimes I'm pegged at 58-59, sometimes at 61, but most of the time at 60 (in Oblivion).

That's why I say that when I see "similar" behavior in Oblivion, I mean that it looks like my FPS are fluctuating around a divider. What the divider is doesn't seem to be as cut and dried as my D3 example. I know for sure that the divider in Doom 3 was 60 --> 45 --> 30. In Direct3D games, I don't know if the divider is different, but it seems to be in increments of 5 to 10 FPS.
     
    Last edited: Jun 19, 2007
  17. Ryan Williams

    Ryan Williams Master Guru

    Messages:
    331
    Likes Received:
    0
    GPU:
    ATI Radeon 4870 1GB
    My post wasn't a load of rubbish, and if you'd spent as much time doing some research of your own as you did writing your post about mine then you'd probably have found a significant amount of evidence to back up what I said.

    If you want to read a more in-depth description than mine then see this forum thread:

    http://www.hardforum.com/showthread.php?t=928593

And if you'd like to do a little more scouring yourself, these should get you started:

    http://www.google.co.uk/search?q=vsync+fps+direct3d+30+60

As you've observed, KENNYB, when vsync is in effect your FPS will indeed hover around the level of your refresh rate. Although it's only physically displaying at the refresh rate, whatever's measuring the FPS often reports it up to 1 FPS off.

If you've not seen these effects before then I don't know what to tell you, MacWun77. It could be that the game is still telling you it's running at 50 FPS or whatever even though vsync has knocked it down to 30. I've observed the smoothness difference with my own eyes, so I'm willing to trust what I've read.
     
    Last edited: Jun 19, 2007
  18. GhostXL

    GhostXL Guest

    Messages:
    6,081
    Likes Received:
    54
    GPU:
    PNY EPIC-X RTX 4090
    Dunno but i never experienced this with Vsync either.
     
  19. KENNYB

    KENNYB Master Guru

    Messages:
    883
    Likes Received:
    0
    GPU:
    eVGA 8800 GTX 648/1900
I know for sure I've seen it work as described above. Playing Oblivion and watching Fraps, I don't see it working like I've seen before. My max FPS is being limited to 60, though. Really, I can't definitively explain why Oblivion is behaving like it is.

IIRC, Far Cry, back in the day, also behaved like Doom 3. I remember my FPS always being either 45 or 60 FPS. Shrug...
     
  20. MacWun77

    MacWun77 Banned

    Messages:
    296
    Likes Received:
    0
    GPU:
    XFX 8800GTX @ 621/972 Air Cooled
Well, I really don't know about this... Did some more testing and nothing shows me what has been described here.

BUT there is one thing I have noticed, and it involves FRAPS: if you have FRAPS set to only update once per second, it seems to report different FPS.

Otherwise I still see no evidence of what has been described here, and as I said before, I have NEVER heard of this issue, nor have I spoken with anyone who has ever mentioned it, and honestly, I have a lot of experience with games and other gamers... so I don't know.

I know you weren't just posting a load of rubbish on purpose, but from my point of view it's just something unknown to me.

EDIT: I just had a thought; is this perhaps a DVI/LCD issue? I only have experience with CRT monitors...
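That once-per-second FRAPS averaging could actually explain the disagreement: if vsync alternates between frames shown for one and two refresh intervals, a one-second average can land anywhere between the two divisors and look like a smooth progression. A rough sketch (hypothetical frame counts, 60Hz assumed):

```python
def avg_fps(frames_at_60, frames_at_30):
    """Average FPS a once-per-second counter would report over a
    window that mixes frames displayed at 60fps and at 30fps."""
    total_time = frames_at_60 / 60 + frames_at_30 / 30  # seconds
    return (frames_at_60 + frames_at_30) / total_time

# Half the window at 60fps, half at 30fps:
print(avg_fps(30, 15))   # 45.0 -- looks like a smooth in-between rate
print(avg_fps(48, 6))    # 54.0
```

So a counter averaging over a second could read 45 or 54 even though every individual frame was displayed at exactly 60 or exactly 30.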
     
