GeForce Forceware 260.52 Win 7/Vista 64-bit

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by phk, Sep 8, 2010.

  1. ZebMacahan

    ZebMacahan Master Guru

    Messages:
    281
    Likes Received:
    35
    GPU:
    MSI 4090 Liquid X
For me these are a nice set of drivers. The performance is a notch better than with 258.96 (also a good driver). The gameplay feels smoother, and they are stable for my rig. No BSOD or other crashes so far. Let's hope the final ones are good too.
     
  2. Pedroshin

    Pedroshin Active Member

    Messages:
    94
    Likes Received:
    0
    GPU:
    EVGA GTX 680 SC
    Hmm I don't use Alchemy on TF2 or L4D2. I get surround sound just fine without it. :confused:
     
  3. TheHunter

    TheHunter Banned

    Messages:
    13,404
    Likes Received:
    1
    GPU:
    MSi N570GTX TFIII [OC|PE]
yea, i thought you weren't using it.

    Of course you get fine sound with plain 5.1, but you're missing a lot. I can play without it just fine too, but i lose reverb and some other sound effects.. (it sounds dull and even more generic in software without ALchemy)


    Title: Source Games "Orange Box"
    API: DirectSound3D
    ALchemy: Yes

    http://connect.creativelabs.com/alc...ivelabs.com/alchemy/Lists/Games/AllItems.aspx
     
  4. Strykerr

    Strykerr Master Guru

    Messages:
    381
    Likes Received:
    0
    GPU:
    GIGABYTE 1070
    Just installed, now to do a test. First thing I noticed in the panel, though: before, my native resolution was 1920x1080 PC @ 60Hz, and the only way I could get HD was to go to 50Hz, but now my native is 1920x1080 1080p HD @ 60Hz.

    Started SC2; I seem to have lost performance. Will try some tweaking though.
     
    Last edited: Sep 9, 2010

  5. RedruM-X

    RedruM-X Guest

    Messages:
    210
    Likes Received:
    2
    GPU:
    MSI GTX 1070 Gaming X 8G
    Ok

    Installed perfectly.

    Then set up the nvidia control panel, all OK, but the first time I went to look at resolutions, there were none.

    I closed the panel, wanted to re-open it, and it didn't work.
    Rebooted, then all was OK, and the control panel really opens faster than with older drivers.

    BUT there is something else. Because I play (or used to play) UT3, I always created a custom resolution of 1680x1050@76Hz (yes, 76).
    This is not possible anymore. I always had to do it in a special way: first changing to 1680x1050, which then says 59Hz; then I had to create 1680x1050@60Hz, switch back to 1920x1080, and then again to 1680x1050@60Hz; and then I created the 1680x1050@76 (with special timings) and it worked and was selectable in the menu.
    But now I can't even create 1680x1050@60 anymore. If I do, it just shows the same 1680x1050@59, no 60Hz available, which is no real problem, but it's only in 16-bit, no 32-bit after that. Recreating it doesn't help, and of course the 1680x1050@76 is totally impossible all of a sudden... it worked with all older drivers, however.

    I REALLY REALLY hope they are going to fix this because this will be driving me nuts otherwise, even though I don't play UT3 anymore atm, it will bug me a lot.

    Keeping fingers crossed. Didn't test any games yet, but the driver seems good nonetheless.

    I remember they once moved to a new version and that's when I also had resolution problems; that's when I had to find that trick to create the 76Hz mode. And now they have a really new driver again, and again that resolution weirdness. Come on Nvidia, please don't **** this one up :D

    EDIT 1: After posting this I just saw the post before mine:
    This already tells me they ****ed something up again with the resolutions.
    And omg, I hope SC2 really doesn't run worse now, because that's the game I play all the time now... :(

    EDIT 2: SC2 seems to be running exactly the same for me, I think; haven't played for hours, of course :)

    Driver seems good, just hope they will fix the resolution thing in the real one, otherwise :bang:
     
    Last edited: Sep 9, 2010
  6. H-Ackermans

    H-Ackermans Member Guru

    Messages:
    187
    Likes Received:
    0
    GPU:
    Palit GTX550Ti
    To those saying nVidia flamingo'd up these drivers: uhm... are they available for download on their site? No. Does it even say beta in the driver filename? No. So it's not even a beta driver.
     
  7. PirateNeilsouth

    PirateNeilsouth Ancient Guru

    Messages:
    1,773
    Likes Received:
    0
    GPU:
    980 OC
    Agreed

    It's a "review driver".
     
  8. RedruM-X

    RedruM-X Guest

    Messages:
    210
    Likes Received:
    2
    GPU:
    MSI GTX 1070 Gaming X 8G
    If you're talking about me, I specifically said in my post that I KNOW that these are removed by NVidia, so I HOPE this problem won't occur in the real ones.

    I'm not assuming anything.
     
  9. Maced

    Maced Active Member

    Messages:
    62
    Likes Received:
    0
    GPU:
    Gigabyte GTX 460 1GB
    I have a problem believing that it matters much with a 280. What, did your framerate drop from 80 to 75?
     
    Last edited: Sep 9, 2010
  10. Sr7

    Sr7 Master Guru

    Messages:
    246
    Likes Received:
    0

    What are you basing this whole premise on that your GPU should be fully loaded all the time? This is totally wrong and bogus. If a game is GPU-bound (bound by the graphics workload), you will see 100% GPU usage. If a game is partially CPU-bound (i.e. limited by how fast the CPU can pump work to the GPU), you start to see GPU usage come down because the GPU cannot be fed enough work to keep its "usage" high... the CPU or disk is bottlenecking that process. Crank the settings to maximum and I bet your usage will go to 100%. If you run a single GTX 480, your usage will be higher than with SLI GTX 480s, simply because at some point you may hit a CPU bottleneck and the SLI horsepower is "overkill" for your CPU in that game at those settings.

    The same goes for CPUs. This is basic logic about bottlenecks, guys.

    The problem when running BC2 with previous drivers was that shader compiling was causing regular stalls. Shader compiling happens on the CPU, to prepare shaders for consumption by the GPU. Since the stalls were happening on the CPU while compiling shaders, the CPU was unable to feed the GPU work... hence GPU usage dropped when this happened (often). But as long as your usage is relatively flat (and not insanely spiky), then it's how it should be... you're probably CPU-bound at your current settings, or the game you're running just isn't that taxing on the GPU.

    There is nothing throttling the cards or "dropping usage". If the game is smooth and fps is high, then you know it's working correctly. Can we please move on from this bogus measure?
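    Sr7's bottleneck argument can be put into a toy model (the numbers below are made up for illustration, not measurements from any game): the frame rate is capped by the slower pipeline stage, and reported GPU "usage" is just the fraction of the GPU's render capacity that the CPU actually manages to feed.

    ```python
    # Toy model of the GPU-bound vs CPU-bound argument: reported GPU
    # "usage" falls out of whichever pipeline stage is slower, not from
    # any driver throttling. All fps figures here are illustrative.

    def achieved_fps(cpu_feed_fps, gpu_render_fps):
        """The frame rate is capped by the slower stage."""
        return min(cpu_feed_fps, gpu_render_fps)

    def gpu_utilization(cpu_feed_fps, gpu_render_fps):
        """Fraction of the GPU's render capacity actually consumed."""
        return achieved_fps(cpu_feed_fps, gpu_render_fps) / gpu_render_fps

    # GPU-bound: maxed-out settings, one card can only render 60 fps
    # while the CPU could feed 120 -> usage pegs at 100%.
    print(gpu_utilization(cpu_feed_fps=120, gpu_render_fps=60))   # 1.0

    # CPU-bound: add a second card (assume SLI roughly doubles render
    # capacity) but the CPU still feeds only 120 fps -> usage halves.
    print(gpu_utilization(cpu_feed_fps=120, gpu_render_fps=240))  # 0.5
    ```

    The same arithmetic also explains the shader-stall symptom: whenever compilation stalled the CPU feed rate, the GPU utilization number dropped with it.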
     
    Last edited: Sep 9, 2010

  11. MegaBear

    MegaBear Master Guru

    Messages:
    344
    Likes Received:
    2
    GPU:
    EVGA GTX 1080Ti SC2
    crashing all the time for me
     
  12. Sr7

    Sr7 Master Guru

    Messages:
    246
    Likes Received:
    0
    It's impossible to split a single-threaded application across multiple cores. The application has to be explicitly threaded so that work can happen in parallel across cores.

    Even multi-threaded applications won't get perfect scaling, as it depends on the workload each thread has to do, and then there are sync points where, even if one thread could theoretically continue running and processing work, it can't, because all the threads have to talk to each other... so that thread ends up stopping and waiting for the others (even if it's on a separate core).
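    The sync-point effect described above can be sketched as a small model (work units are arbitrary made-up numbers): when every thread must meet at a barrier each round, the round costs as much as the slowest thread, so uneven workloads waste the faster cores even though they are on separate cores.

    ```python
    # Toy model of barrier synchronization: with a sync point at the end
    # of every round, each round costs as much as its slowest thread.

    def round_time_with_barrier(per_thread_work):
        # Every thread waits at the barrier for the slowest one,
        # so the round's wall-clock cost is the maximum.
        return max(per_thread_work)

    def total_time(rounds):
        return sum(round_time_with_barrier(r) for r in rounds)

    # Perfectly balanced work across 4 threads: ideal scaling.
    balanced = [[10, 10, 10, 10]] * 3
    # One straggler thread per round: the other three cores sit idle.
    skewed = [[10, 10, 10, 25]] * 3

    print(total_time(balanced))  # 30
    print(total_time(skewed))    # 75
    ```

    Both workloads contain the same total of extra-free work per fast thread, but the straggler more than doubles the wall-clock time, which is why multi-threaded games rarely scale linearly with core count.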
     
  13. Sneakers

    Sneakers Guest

    Messages:
    2,716
    Likes Received:
    0
    GPU:
    Gigabyte 980Ti Windforce

    Since you did bother to read my post, you understand that I bring up the notion that the game could very well be extremely CPU bound. But that is not supported by the fact that performance remains the same (fps and GPU utilization) when CPU cores are removed from the application.

    If the game was CPU bound at 4 cores running at 3.8 GHz, then it would be even more CPU bound with 3 cores, no? But my and other people's observations contradict that somewhat.

    We ALL know BFBC2 is a very CPU-taxing game, but there has to be more to it when you see weird behaviour like I mentioned in the post above. Just to straighten this out once and for all: low GPU utilization is NOTHING good at all. It means the card isn't working as much as it could / has the resources to, and it is literally bad when you don't get high enough frames to begin with to stay at or above 60 min fps.
    If we were running 300 fps the card could idle at 2% utilization for all I cared, but the reality here is that we have cards running low on utilization with bad fps as a result. Bad GPU utilization is bad GPU utilization is bad... :)

    I have lower fps in BC2 with 465 SLI heavily OCed than I had with my 4870X2... that's not good enough. I have a lot lower GPU utilization on average on my 465 in single-GPU mode than I had with my 4870X2 in dual-GPU mode. I know they can't really be compared, apples and pears and all that, but still.

    If you had paid any attention to this issue with BC2, then you would also know numerous people with beefy i7s at 4 GHz+ see the same odd behaviour of low single- and multi-GPU utilization in BC2.

    As I already mentioned, I can artificially raise my GPU utilization a bit if I turn on a lot of eye candy, yes, but the fps remain the same. Which, as I already mentioned, supports the claim that the game is just completely limited by how much data the CPU can shuffle to the GPUs, hence why we (at least I) get roughly the same combined GPU utilization on two cards as with one (if you can add it up like that :) ). But then again, how do you explain the fact that you can run the game on 3 cores with no performance loss/gain and remain pretty much as you were with 4 cores? That at least suggests to me that the game isn't really utilizing the 4th core (if available) and that the only gain people get in BC2 from i7s is actually the HT technology and not the actual number of cores. This, however, is again contradicted by the fact that i3 dual cores with HT score badly in BC2 despite being OCed to 4.2 GHz. So that is basically a dead end.

    I'm still betting my money on this being mostly a driver issue together with a natively CPU-hungry game.


    /edit

    Just for fun I tried some Dirt 2 benching, and I get low GPU utilization in that game as well with SLI; with a single GPU it's 95-98% during the bench. Benched with SLI on, clocked @ 850/1030, and got 55 fps (44 min) with 34-45% GPU utilization.

    Then I benched a single GPU clocked at 607/800 and got 54 fps (44 min), but with 95-99% utilization.

    CPU usage with SLI turned on was lower than CPU utilization with a single GPU: SLI = 53-65% and single GPU = 60-72%.

    No doubt in my mind this whole mess with BC2 and other games is purely driver related and has little to do with CPU bottlenecks per se.
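    For what it's worth, the Dirt 2 numbers in this post can also be checked against a pure CPU-bottleneck model. This is only a back-of-envelope sketch: it assumes render capacity scales linearly with core clock and that SLI doubles it, both rough simplifications that real hardware won't follow exactly.

    ```python
    # Back-of-envelope check of the Dirt 2 figures from the post above
    # against a pure CPU-bottleneck model. Assumes fps scales linearly
    # with core clock and SLI doubles capacity (rough simplifications).

    single_fps, single_util, single_clock = 54, 0.97, 607  # measured
    sli_clock = 850                                        # measured
    cpu_capped_fps = 55                                    # SLI result

    # What one card could render if fully fed, at each clock.
    single_capacity = single_fps / single_util             # ~55.7 fps
    sli_capacity = 2 * single_capacity * sli_clock / single_clock

    # If the CPU caps the game near 55 fps, the model predicts this
    # per-setup SLI utilization:
    predicted_sli_util = cpu_capped_fps / sli_capacity
    print(round(predicted_sli_util, 2))  # 0.35
    ```

    The prediction lands inside the observed 34-45% range, so the low SLI utilization is at least consistent with a CPU (or driver-overhead-on-CPU) cap rather than the cards misbehaving on their own; the numbers alone can't distinguish those two causes.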
     
    Last edited: Sep 10, 2010
  14. brendanvista

    brendanvista Master Guru

    Messages:
    237
    Likes Received:
    0
    GPU:
    GTX 980 Ti
    I just installed these drivers and they were completely stable, and GPU usage was much higher in BC2. The game went from using 35% to 80% of my 260. That made the game run super smooth at 1920x1200, all settings maxed including AO, with no AA and 16xAF in multiplayer. I used to play the game all on low so it would run silky smooth and I'd be competitive, but not anymore :). Sadly, I was also getting some black screen flickers in game occasionally. My GPU was far from overheating, @ 54C. The control panel also crashed once on me, seemingly randomly. The new installer is sweet. The control panel is much less laggy now too. So I'd say this leaked driver is a keeper for me right now, but it is in no way "ready."
     
  15. TheHunter

    TheHunter Banned

    Messages:
    13,404
    Likes Received:
    1
    GPU:
    MSi N570GTX TFIII [OC|PE]
    i partially agree, but tell me this: why does it run at 100% in SP? well, at least Sneakers said so..

    and now what you just said doesn't make any sense..


    edit: lol i saw Sneakers already replied..
     

  16. Sneakers

    Sneakers Guest

    Messages:
    2,716
    Likes Received:
    0
    GPU:
    Gigabyte 980Ti Windforce
    Sounds good bro, hope they can figure out how to fix the 400 series' utilization as well for the WHQL/next beta driver. Makes me sad to see these babies idle at 20-25% with poor fps as a result.

    If I may ask, did your fps rise along with the higher GPU utilization, or did it remain the same?

    I'm asking b/c the people on this forum who seem to think it's all good with low utilization don't seem to understand that higher utilization always means higher fps... it also means more heat, but that's another story :)

    So just for the record, did you notice an increase in fps as well?
     
  17. avivoni

    avivoni Ancient Guru

    Messages:
    2,599
    Likes Received:
    1
    GPU:
    560ti 448 classified
    don't have a need to go any higher. At 3.2 my memory runs at 1600MHz; that's all I needed. I have the 4GHz profile saved in BIOS in case I ever need to test with it, but I'm no longer a benchmark whore. I just play some games from time to time, and it does the job as it is.
     
  18. Pedroshin

    Pedroshin Active Member

    Messages:
    94
    Likes Received:
    0
    GPU:
    EVGA GTX 680 SC
    Hmm never read that before, interesting. Thanks, I'll give it a try. Same method and settings for L4D2 I take it (5/10/128)?
     
  19. TheHunter

    TheHunter Banned

    Messages:
    13,404
    Likes Received:
    1
    GPU:
    MSi N570GTX TFIII [OC|PE]
    quoted from laptop2go..


    edit: yes pedroshin same settings :)
     
  20. Burnt_Ram

    Burnt_Ram Guest

    Messages:
    5,921
    Likes Received:
    0
    GPU:
    Zotac GTX 1050 Ti
