October 25th last 400 series drivers C'mon Nvidia

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by RagDoll_Effect, Dec 9, 2010.

  1. chispy

    chispy Ancient Guru

    Messages:
    9,979
    Likes Received:
    2,693
    GPU:
    RTX 4090
    Merry Christmas to you and your family too :).

    Thanks to the guys trying to help. As for the mini HDMI, it does not work in my configuration either.

    There's a big problem with both companies regarding drivers. Both Nvidia and ATI just want to make faster and faster cards, but when it comes to software support they both fail :/.
     
  2. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    Yeah, Nvidia has jumped on the hardware-before-software bandwagon recently.

    These companies need to realise that software is just as, if not more, important, and a lot of us are getting annoyed and quite cynical about them optimising their cards for the generic game benchmarks but forgetting about the rest.
     
  3. anthonyda

    anthonyda Member

    Messages:
    28
    Likes Received:
    0
    GPU:
    470 GTX SOX
    Isn't low GPU usage a feature? I mean, when my GPU usage is low, it means that my CPU is the bottleneck, right?
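
    For anyone who wants to sanity-check this on their own machine: if GPU utilization sits well below 100% while a game is running, the limit is usually upstream (CPU, or vsync). Here's a rough C sketch of a utilization probe using NVML; NVML itself and device index 0 are assumptions for illustration, not something anyone in this thread is necessarily using:

        /* Rough GPU-utilization probe via NVML (assumed available).
         * Build: gcc probe.c -lnvidia-ml -o probe
         */
        #include <stdio.h>
        #include <nvml.h>

        int main(void)
        {
            nvmlDevice_t dev;
            nvmlUtilization_t util;

            if (nvmlInit() != NVML_SUCCESS) {
                fprintf(stderr, "NVML init failed\n");
                return 1;
            }
            if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS &&
                nvmlDeviceGetUtilizationRates(dev, &util) == NVML_SUCCESS) {
                /* util.gpu = % of the sample period the GPU was busy.
                 * A low GPU % during gameplay points at a CPU or vsync
                 * limit rather than the card itself. */
                printf("GPU busy: %u%%, memory controller: %u%%\n",
                       util.gpu, util.memory);
            }
            nvmlShutdown();
            return 0;
        }

    Run it in a loop while the game is up; a pegged core plus a low GPU busy % is the classic CPU-bottleneck signature.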
     
  4. steelcowboy

    steelcowboy Member Guru

    Messages:
    132
    Likes Received:
    0
    GPU:
    EVGA GTX 570's SLI
    I can say drivers from ATI suck, having just come from a 5970 with nothing but trouble to two GTX 460's with no problems. Like I've said numerous times, normally neither the drivers nor the card is the problem. 90% of the time it's the person between the chair and keyboard. I went back to ATI after like 4 years thinking the driver team had their crap together after all that time, but it was still the same room full of monkeys as before.

    Merry Christmas All
     

  5. NightCrawler™

    NightCrawler™ Member Guru

    Messages:
    124
    Likes Received:
    0
    GPU:
    2 EVGA GTX460 1 8800GTSSC
    In addition to this... Do not install the latest Gothic 4 patch if you're on an SLI system..
    In most areas it will break SLI and totally destroy your FPS... Mine dropped some 15-20 frames..

    So after I finally figured this all out I finished the game an hour later... :bang: :bang:
     
  6. VenoMaizeR

    VenoMaizeR Member Guru

    Messages:
    123
    Likes Received:
    0
    GPU:
    Asus Strix RX480OC 8gb
    Not really. In BFBC2 my GTX 470 only uses 50% GPU, and in Metro 2033 it uses 98-100% on the same system, so draw your own conclusions...:bang:
     
  7. anthonyda

    anthonyda Member

    Messages:
    28
    Likes Received:
    0
    GPU:
    470 GTX SOX
    My conclusion is that both engines are totally different :) And BFBC2 is a console port, so it will mostly hammer the CPU.
     
  8. buddyfriendo

    buddyfriendo Guest

    Messages:
    3,404
    Likes Received:
    5
    GPU:
    2070 Super
    Turn off VSync and your GPU usage will go up, or don't if you can stand your solid 60FPS. :bang:
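
    For anyone wondering why that works: vsync is just a swap interval, and with it on, the buffer swap blocks until the next monitor refresh, so the GPU idles between frames and utilization drops. A minimal C sketch with GLFW (GLFW is an assumption for illustration; the games discussed in this thread obviously don't use it):

        /* Why vsync caps GPU usage: with swap interval 1, the swap
         * blocks until the next display refresh, so the GPU sits idle
         * between frames. Build: gcc vsync.c -lglfw -lGL -o vsync
         */
        #include <GLFW/glfw3.h>

        int main(void)
        {
            if (!glfwInit())
                return 1;

            GLFWwindow *win = glfwCreateWindow(640, 480, "vsync demo", NULL, NULL);
            if (!win) { glfwTerminate(); return 1; }

            glfwMakeContextCurrent(win);
            glfwSwapInterval(1);  /* 1 = vsync on (FPS capped at refresh),
                                     0 = off (GPU runs flat out) */

            while (!glfwWindowShouldClose(win)) {
                glClear(GL_COLOR_BUFFER_BIT);
                glfwSwapBuffers(win);  /* blocks here when vsync is on */
                glfwPollEvents();
            }
            glfwTerminate();
            return 0;
        }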
     
  9. chispy

    chispy Ancient Guru

    Messages:
    9,979
    Likes Received:
    2,693
    GPU:
    RTX 4090
    Correct Statement 100% :thumbup:
     
  10. phatbx133

    phatbx133 Master Guru

    Messages:
    850
    Likes Received:
    16
    GPU:
    MSI GTX 1050ti 4GB
    I hope Nvidia does better work on the new driver next week, as they said. We expect them to release it next week, this time with no B.S. delays like the last few times.:banana:
     
    Last edited: Dec 27, 2010

  11. Zer0K3wL

    Zer0K3wL Banned

    Messages:
    3,073
    Likes Received:
    0
    GPU:
    gtx 480 850/1700/2000 h2o
    I wonder if the poor GPU usage in World of Warcraft is possibly related to drivers :p
    Probably not >_>
     
  12. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    Nope....Blizzard is still attempting to keep the game CPU bound...
     
  13. Foonus

    Foonus Active Member

    Messages:
    63
    Likes Received:
    1
    GPU:
    Nvidia 2070SC
    Nvidia: 8+ years of driver superiority at its end.

    It used to be, a few years back... ok, many years back, that Nvidia was miles ahead of ATI when it came to driver releases and support.

    In fact, it's pretty much the only reason I stuck with Nvidia over ATI for the past 8 years. It seems Nvidia is now only focusing its driver team on the current flagship card and already forgetting about everything else.

    It will be a much harder choice to buy Nvidia next time, knowing that they do not fulfill their promise of monthly drivers. They are no better than ATI for drivers now, if not worse; at least ATI releases hotfixes for new games if its drivers aren't ready yet....
     
  14. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    It really depends on the application. If you play Blizzard games, you're better off with nVidia. If you're building a media center....it could easily go either way. Regardless of which manufacturer you go with, someone is going to bitch and whine about driver problems.

    Also, nVidia has never guaranteed monthly driver releases....only ATI does.
     
  15. slick3

    slick3 Guest

    Messages:
    1,867
    Likes Received:
    234
    GPU:
    RTX 2070 +85/1200
    It's the way the game uses your CPU that makes the difference. Metro 2033, Crysis and Lost Planet 2 are examples of GPU-based games. BC2 uses your CPU extremely poorly; I do believe that if the game was properly optimized, a dual core would run it with no problem at all. Crysis 2 hopefully will be GPU-based (and hopefully won't fry my 460) and not poorly optimized like BC2 ...

    What I don't get, however, is that COD:BO was just like BC2 when it was first released, getting 40%-70% GPU usage, but with the patch they released a couple of days later, 99% with ease. Why can't DICE do the same thing ?! :(
     
    Last edited: Dec 28, 2010

  16. Stevethegreat

    Stevethegreat Guest

    Messages:
    42
    Likes Received:
    2
    GPU:
    eVGA Geforce 8800 GTX (630/1000)

    It's not even their true flagship. My twin GTX 460 setup is faster than their "flagship" (GTX 580), which is only 3 months newer (it's pretty much the same tech), with much better $/perf. The GTX 460 is STILL nVidia's TRUE flagship, yet just because they decided to bump a meaningless number (the 5xx series IS the 4xx series) they drop support for MY product, and their best product yet.

    I mean, come on, a new series in less than 8 months? That's insanely demeaning to anyone's intelligence; they should be sued en masse. It's not as if their products are NOT luxury products either. Since we buy luxury products, we want luxury support. At least Apple supports its products for one year; nVidia seems to drop support after 3 effing months... I used to be an nVidia fan, great drivers, great chips, but now they don't seem to care for either. The 5xx naming scheme is a joke and a shame...

    I'm looking to sell my GTX 460s. After a decade of continuous support and THOUSANDS of dollars, I'm moving to the Red Team. Their drivers may suck, but at least they're honest s*ckers...
     
  17. arrrdawg

    arrrdawg Member

    Messages:
    28
    Likes Received:
    0
    GPU:
    NVIDIA GeForce GTX 460m
    I'm curious to know what GPU/CPU usage actually equates to. If both CPU and GPU show low usage and a game is running poorly, what exactly does that mean? I fear there is just too much misinformation out there from the less educated, further confusing us all. Some people use the wrong tool to gauge CPU usage: a game might only use two cores, so on an 8-threaded i7 this shows as 25% CPU usage when in fact two cores are at 100%, and that's just the way the game is programmed.

    I'm still not sure why people like to say "game is CPU bound" when pretty much all graphics work is done almost exclusively on the GPU; otherwise those with IGPs wouldn't be complaining. The CPU is used to run scripts, AI, physics, etc. My buddy has a very 'old' Core 2 Duo, which I think is an E6600 (2.4GHz, back from '06), and he has two 5870s. I realize Black Ops is based on an old Quake 3 engine, but it runs buttery smooth at 1920x1200 with all settings maxed out. Even so-called CPU-bound games like Metro 2033 and GTA IV run around 30-60fps with most everything maxed out. I seriously don't believe the CPU is as important as the "serious business" internet makes it out to be, because I've seen with my own eyes that the GPU is everything. Not saying a better CPU won't help, because it certainly will, but a better GPU helps considerably more, meaning "GPU bound" rather than "CPU bound". Having said that, I know Metro 2033 runs a lot smoother with a better CPU. But the CPU is only one part of the entire pie.
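
    To see the per-core picture he's describing, a per-core monitor is more honest than an aggregate percentage. A rough C sketch for Linux, reading /proc/stat over a one-second window (Windows users get the same view from Task Manager's per-core graphs):

        /* Per-core CPU usage from /proc/stat (Linux). Shows why a
         * 2-thread game reads as ~25% "total" CPU on an 8-thread i7
         * while two cores are actually pegged.
         * Build: gcc percore.c -o percore
         */
        #include <stdio.h>
        #include <string.h>
        #include <unistd.h>

        #define MAX_CPUS 64

        static int read_stat(unsigned long long busy[], unsigned long long total[])
        {
            FILE *f = fopen("/proc/stat", "r");
            char line[256];
            int n = 0;

            if (!f)
                return -1;
            while (fgets(line, sizeof line, f) && n < MAX_CPUS) {
                unsigned long long u, ni, s, idle, io, irq, sirq;
                /* Match "cpuN ..." lines only; skip the aggregate "cpu " line. */
                if (sscanf(line, "cpu%*d %llu %llu %llu %llu %llu %llu %llu",
                           &u, &ni, &s, &idle, &io, &irq, &sirq) == 7 &&
                    strncmp(line, "cpu", 3) == 0 && line[3] != ' ') {
                    busy[n]  = u + ni + s + irq + sirq;
                    total[n] = busy[n] + idle + io;
                    n++;
                }
            }
            fclose(f);
            return n;
        }

        int main(void)
        {
            unsigned long long b1[MAX_CPUS], t1[MAX_CPUS];
            unsigned long long b2[MAX_CPUS], t2[MAX_CPUS];
            int n = read_stat(b1, t1);

            sleep(1);  /* sample window */
            if (n <= 0 || read_stat(b2, t2) != n)
                return 1;
            for (int i = 0; i < n; i++)
                printf("core %d: %3.0f%%\n", i,
                       100.0 * (b2[i] - b1[i]) / (double)(t2[i] - t1[i]));
            return 0;
        }

    Two cores near 100% with the rest idle is exactly the "25% total" case he mentions.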
     
    Last edited: Dec 29, 2010
  18. Luumpy

    Luumpy Master Guru

    Messages:
    425
    Likes Received:
    0
    GPU:
    470gtx s/c

    It is just poor porting. Console games are CPU heavy. If they are not optimized for PCs, this is what happens. Try Aliens vs Predator: a DX11 game, a port, Unreal engine like "LagOps". 100% GPU, 50%-75% CPU, and great graphics and FPS. AvP is how a console port should be done.
     
  19. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    SSE instructions can be used for graphics....which allows the graphics processing to be handled by the CPU, independently of the GPU. Anything done using 3DNow, MMX, SSE and, in the case of Sandy Bridge, AVX is done by the processor, completely independent of the GPU. DirectX 11, in the case of a non-compatible graphics card, can also fall back to the CPU (the WARP software rasterizer). CPUs have been processing graphics for as long as video cards have been displaying images on a screen.
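
    To make that concrete, here's what CPU-side pixel work looks like with SSE2 intrinsics: a saturating brightness boost on 16 RGBA bytes per instruction. This is purely an illustrative sketch, not how any driver or game actually renders:

        /* CPU-side pixel work with SSE2 intrinsics: add a brightness
         * offset to 8-bit RGBA pixels, 16 bytes per instruction,
         * saturating at 255. Build: gcc -msse2 sse_pixels.c -o sse_pixels
         */
        #include <emmintrin.h>  /* SSE2 */
        #include <stdint.h>
        #include <stdio.h>

        static void brighten(uint8_t *pixels, size_t nbytes, uint8_t amount)
        {
            __m128i add = _mm_set1_epi8((char)amount);
            size_t i = 0;

            /* 16 bytes (4 RGBA pixels) per iteration. */
            for (; i + 16 <= nbytes; i += 16) {
                __m128i px = _mm_loadu_si128((__m128i *)(pixels + i));
                px = _mm_adds_epu8(px, add);  /* saturating unsigned add */
                _mm_storeu_si128((__m128i *)(pixels + i), px);
            }
            for (; i < nbytes; i++) {  /* scalar tail */
                int v = pixels[i] + amount;
                pixels[i] = (uint8_t)(v > 255 ? 255 : v);
            }
        }

        int main(void)
        {
            uint8_t img[16] = { 200, 100, 50, 255 };  /* rest zero */
            brighten(img, sizeof img, 80);
            printf("%u %u %u %u\n", img[0], img[1], img[2], img[3]);  /* 255 180 130 255 */
            return 0;
        }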
     
  20. dchalf10

    dchalf10 Banned

    Messages:
    4,032
    Likes Received:
    0
    GPU:
    GTX670 1293/6800
    So where the F is the new driver... the only 'new' drivers are glitchy leaked ones, which I needed a modded INF for, and only because they added AA for Dead Space. I lost 10% performance across the board with 266.44...

    $#!+!
     
