Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by hicookie, Apr 18, 2008.
Thanks! I understand now.
So it's detected as driver 174.85; I get it now.
I understand then (about XP 64-bit only).
Anyway, you said no one uses 3DMark06 anymore? What do most people benchmark with, then? The new 3DMark Vantage still isn't released, is it?
Does ANYONE know if this driver has the CUDA runtime in it?
There's no CUDA runtime in this driver.
Thanks. Looks like I'll stick with my 174.12, since 174.93 is causing some problems, until NVIDIA ships CUDA with the drivers.
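If anyone wants to test this for themselves, here's a minimal sketch (my own throwaway test code, nothing official). Strictly speaking, the CUDA runtime (cudart.dll) ships with applications and the toolkit; what a ForceWare driver does or doesn't include is the driver-side component, nvcuda.dll, so that's what this probes. Windows-only, plain C, no CUDA headers needed:

/* Minimal sketch: checks whether the installed driver exposes CUDA
 * by trying to load nvcuda.dll and call cuInit(). If LoadLibrary fails,
 * the driver simply doesn't ship the CUDA component. */
#include <stdio.h>
#include <windows.h>

typedef int (__stdcall *cuInit_t)(unsigned int); /* CUresult cuInit(unsigned int) */

int main(void)
{
    HMODULE nvcuda = LoadLibraryA("nvcuda.dll");
    if (!nvcuda) {
        printf("nvcuda.dll not found: this driver has no CUDA support.\n");
        return 1;
    }
    cuInit_t my_cuInit = (cuInit_t)GetProcAddress(nvcuda, "cuInit");
    if (my_cuInit && my_cuInit(0) == 0) /* 0 == CUDA_SUCCESS */
        printf("cuInit succeeded: CUDA is available with this driver.\n");
    else
        printf("nvcuda.dll is present but cuInit failed.\n");
    FreeLibrary(nvcuda);
    return 0;
}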
Yes, it does jump to 648, but I thought this was normal for G80s?
I have considered the BIOS flash, but will that allow me to further overclock my card?
I mean, I can set the clocks much higher, but it always crashes above those in my sig.
If I were to flash the card, which BIOS should I use?
I see several on TechPowerUp.
I currently have 60.80.0A.00.01.
It's a joke. No one plays it online because it really isn't a game, so using it to show the worth of drivers is stupid.
Thanks to everybody who contacted me via PM and tested x64 174.93 driver detection in upcoming RT 2.09. Everything is working properly, additional testers are no longer needed.
Just sent you a PM...
I thought 3DMark06 was commonly used to show driver performance. So when 3DMark shows a higher score, that doesn't necessarily mean the frame rates in other games are a bit higher as well?
It's just that I don't really understand why it's stupid to use 3DMark06 on ForceWare drivers :gape:
What else is there to show the worth/performance of drivers, except testing all kinds of games for driver compatibility, of course?
That's the thing though: 3DMark and games are coded differently, so while a driver could do well in 3DMark, it could cause crashes, flashing textures, artifacts, and FPS loss in some games because of bad compatibility.
Well, it happened to me: I got a better score in 3DMark but lower frame rates in games. I don't care much about 3DMark; I just want a good frame rate when I'm gaming. So I always test with games and bench drivers with them instead of 3DMark.
Plus, 3DMark06's CPU:GPU dependency ratio is higher than in any game I've ever played. Otherwise it's a good universal bench, as are HL2:Ep1, WiC, CoH, Crysis, or any other game with a built-in benchmark. But yes, I do prefer to bench with games I'm playing.
I most frequently use WiC as my personal measuring stick, so to speak, because maxed out at 16xAF/4xAA @ 1920x1200 it REALLY stresses my system. Plus I've probably tested that game with over 10 different sets of drivers by now, so I know what to expect; I know from past experience what is an improvement and what is a detraction from the norm.
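For what it's worth, here's roughly how I crunch the numbers when comparing driver runs: a minimal C sketch, assuming a plain text log of per-frame times in milliseconds, one value per line (the "frametimes.txt" name and the format are just my placeholders for whatever your frametime dump looks like). It prints average FPS plus the slowest frame, which tells you more about smoothness than a single 3DMark score:

/* Reads per-frame times in ms from a text file (one number per line,
 * hypothetical format) and prints average FPS and the worst frame. */
#include <stdio.h>

int main(int argc, char **argv)
{
    FILE *f = fopen(argc > 1 ? argv[1] : "frametimes.txt", "r");
    double ms, total = 0.0, worst = 0.0;
    long frames = 0;

    if (!f) { perror("fopen"); return 1; }
    while (fscanf(f, "%lf", &ms) == 1) {
        total += ms;              /* total run time in ms */
        if (ms > worst) worst = ms; /* track the slowest frame */
        frames++;
    }
    fclose(f);
    if (frames == 0 || total <= 0.0) { printf("no samples\n"); return 1; }
    printf("%ld frames, avg %.1f fps, slowest frame %.1f ms (%.1f fps)\n",
           frames, frames / (total / 1000.0), worst, 1000.0 / worst);
    return 0;
}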
Of course, it's always necessary to test several games because of compatibility/stability.
I always thought artifacts occurred only with a faulty graphics card or too aggressive an overclock in MHz, etc. I never had them with non-overclocked drivers. I did have artifacts once with bad DDR memory (and a lot of other random errors).
Working awesome for me on XP 32-bit, even better than on Vista. I also see a lot of people saying they enable SLI and get a black screen. I'm confused, because SLI is always enabled by default for me; maybe because it's the GX2. Just wondering if these are aimed at that, because my card loves them.
So is anyone running these on a 7x0i chipset motherboard without getting trouble playing videos (and not just for a few minutes)?
Hitting 'Enable SLI' gave me a black screen that did not go away; after a restart the driver went into 8-bit color mode and the NVIDIA Control Panel was gone.
Went back to 174.88, which seems to work properly with XP32 SP3 and SLI.
Yeah, I sent one too, but I think the testing has already begun.
Hi, all. I've been reading here for years; first post ever.
Wanted to thank everyone for their comments and reviews on drivers.
And... this driver is the best yet for my 8800GT. Absolutely the best IQ and smoothness in the BF2 Project Reality mod.