Nvidia is deliberately downgrading Kepler GPUs' performance in favor of Maxwell.

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Angrycrab, May 18, 2015.

Thread Status:
Not open for further replies.
  1. Turanis

    Turanis Guest

    Messages:
    1,779
    Likes Received:
    489
    GPU:
    Gigabyte RX500
    Let's be straight: year after year, chip after chip, Nvidia keeps increasing the level of tessellation.
    The latest game where we can see how they "improve" tessellation is The Witcher 3, where Kepler is doomed (Fermi doesn't even matter anymore).

    It's the usual pattern: new tech (Maxwell), a new DX11.1 game (TW3) and the new HairWrecks, same as GameWrecks, meant to bring the competition to its knees. But it also brings their own old cards (Fermi, Kepler) to their knees, forcing people to buy the "new & miraculous" product named Maxwell.

    Well, the Maxwell chip (except the GTX 980 Ti, which does compute well) is built on a single-threaded design that doesn't do well in compute and parallelism, which is essential in DX12 games.
    DX12 will be more about what the graphics card has inside than about how good the driver is.
    The future will prove that Maxwell is another gimped product from Nvidia (made to sell more and more for just a year and a half). After that, God have mercy on the users who bought them.

    No card is future-proof, but some of them will still hold up in the newest games, even in the coming DX12 ones.
    And don't tell me Pascal will be future-proof; it won't be. Nvidia just makes you believe that while picking your pocket.

    Motto: be informed, be educated, and your life will be brighter and have fewer worries.
     
  2. mbk1969

    mbk1969 Ancient Guru

    Messages:
    15,649
    Likes Received:
    13,657
    GPU:
    GF RTX 4070
    And don't forget that not all users care about cranking game settings up to the maximum. Smooth operation at an acceptable eye-candy level is what matters to many users.
     
  3. chr!s

    chr!s Master Guru

    Messages:
    229
    Likes Received:
    52
    GPU:
    RTX™ 3080 TI
    I suppose that's where consoles do things a bit better: instead of improving the hardware, effort is spent on improving the software (increasing the lifespan of the HW),
    but usually that effort is still passed on to the consumer in the form of higher prices for console games...

    Anyway, as a PC gamer I've accepted that my hardware will become outdated more quickly, and I've set myself a ~3-year HW refresh cycle (regardless of whether my hardware is purposely/intentionally broken in certain games). It just means I'll sometimes be slightly behind in playing the latest games at optimal settings, e.g. when I upgrade next year I'll go back and play The Witcher 3, which will probably be a year old by that stage...

    P.S. I sense your post is trying to touch on more of an ethical/moral topic with regards to Nvidia. My only hope is that AMD regains some market share to level the playing field, which should reduce any "shady/questionable" activities from our GFX manufacturers...
     
  4. SturmButcher

    SturmButcher Guest

    Messages:
    45
    Likes Received:
    0
    GPU:
    Zotac AMP GTX 1070
    Yeah... sure. I noticed this back when I had an 8800GT; I sold the card when I realized that with every driver my performance went down. I stopped updating my drivers, but then the new games that came out performed badly.

    Nvidia does this every time. When I switched to AMD, my HD4850 was crushing my old 8800GT, and even three years later I was still getting small boosts.

    Now I have a GTX 980, but I will sell it as soon as the new series comes out because of that.
     
    Last edited: Nov 3, 2015

  5. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    Do what exactly? Been with nvidia well over 12 years and did not notice anything out of the ordinary.
     
  6. GREGIX

    GREGIX Master Guru

    Messages:
    856
    Likes Received:
    222
    GPU:
    Inno3d 4090 X3
    @TK
    Because you are a well-trained puppy and you buy all the new stuff they throw on the market, then sell your old stuff... no offence :). In that case you can't see what others easily see.
    Like in my case: I had an 8800GT too, a great card, but driver after driver, after 2-3 years of use, performance got worse. Same game (no patches), same system, yet the newer drivers just kept getting bigger and crappier. I never saw that with AMD drivers when I had a 4850, then a 6850 (still working) and lately a 7870 XT or whatever it was called...
     
  7. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    Why don't you show some proof? How come nobody documented the driver-related demise of the 8800GT way back then? I had 8800GTXs in SLI and did not notice anything out of the ordinary at the time.
     
  8. SturmButcher

    SturmButcher Guest

    Messages:
    45
    Likes Received:
    0
    GPU:
    Zotac AMP GTX 1070
    Well, this time I am actually doing it: I am testing each driver in Fire Strike to see how much performance improves or degrades, at least until I get my new video card (next gen), because I won't stick with this old bone long enough to watch the performance keep going down.
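    For what it's worth, here's a rough sketch of how a log like that could be kept, assuming you just type in the graphics score 3DMark reports after each Fire Strike pass; the file name, driver strings and scores below are placeholders, not real results:

    import csv
    from datetime import date
    from pathlib import Path
    from statistics import mean

    # Plain CSV log so driver-to-driver drift is easy to spot later.
    LOG = Path("firestrike_log.csv")

    def log_run(driver, graphics_score):
        """Append one benchmark result (date, driver version, graphics score)."""
        new_file = not LOG.exists()
        with LOG.open("a", newline="") as f:
            writer = csv.writer(f)
            if new_file:
                writer.writerow(["date", "driver", "graphics_score"])
            writer.writerow([date.today().isoformat(), driver, graphics_score])

    def summary():
        """Print the average graphics score recorded for each driver version."""
        with LOG.open() as f:
            rows = list(csv.DictReader(f))
        for drv in sorted({r["driver"] for r in rows}):
            scores = [int(r["graphics_score"]) for r in rows if r["driver"] == drv]
            print("%s: avg %.0f over %d run(s)" % (drv, mean(scores), len(scores)))

    if __name__ == "__main__":
        log_run("358.91", 11234)   # placeholder driver/score, not a real result
        summary()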
     
  9. nhlkoho

    nhlkoho Guest

    Messages:
    7,754
    Likes Received:
    366
    GPU:
    RTX 2080ti FE
    I've used Nvidia exclusively for 15 years and I too have not seen anything like this. There are a lot of other factors in system performance other than just GPU drivers. Performance could be affected by OS updates, other hardware driver updates, new software, etc...
     
  10. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
    I'd like to see some before and after benchmarks on that. Should be really easy to do.
     

  11. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,946
    Likes Received:
    1,050
    GPU:
    RTX 4090
    [IMG]
     
  12. Pyrage

    Pyrage Master Guru

    Messages:
    414
    Likes Received:
    51
    GPU:
    580 CF@ 1470
    I still have my old 660, which I remember well would give me around 55 fps in Deus Ex: HR with things maxed out. I took out my 970, put the 660 back in, fired up Deus Ex and... it gave me 55 fps.

    I still believe that people are having performance issues because of bad antivirus software, bad OS installations (Win10?), CPU degradation from old overclocks, anything but "Nvidia is forcing us to upgrade!1!!!"

    I'm gonna need way more proof than simple talk to believe that. This is becoming just like the Quadro drivers, where people swear that they're getting better IQ from them.
     
  13. quickkill2021

    quickkill2021 Guest

    Messages:
    131
    Likes Received:
    3
    GPU:
    1080ti sli Poseidon
    I believe this issue is really about The Witcher 3. I would like to know what the issue actually was.

    It's not like The Evil Within. With that game, GPU usage prior to their performance patch was around 80 percent and you would only get 50 frames per second. After the patch, GPU usage went to 100 percent and I hit 60 to 70 fps, so obviously the game was not optimized.

    With The Witcher 3 before the Nvidia patch, my GPU usage was at 99 percent and I would struggle to get 60 fps on two 780 Tis. Then you got guys like dr_rus claiming there was no foul on Nvidia's part and being an ass about it, furthermore stating that the technology is simply inferior, blah blah.

    After Nvidia's patch, GPU usage stayed at 99 percent, but my frame rate went up by 10 to 20 frames per second. In some scenes I can hit 100 frames per second where previously I would only hit 60 fps.

    Clearly there was an issue where the GPU usage is at 99 percent but you still get a low frame rate. In my mind, at 99 percent usage there should be no bottleneck and no more performance to be had. Nvidia didn't need to issue a driver update, but because they did, I am now even more suspicious about poor performance in general. How did Nvidia give me an extra 10 to 20 frames per second when my GPU usage was already at 99 percent? That makes me question things.

    To me, if the visuals don't justify the poor performance, I feel it's bad drivers. The Witcher 3 had no business being as taxing as it was prior to the patch. We've had better-looking games that ran better.
     
    Last edited: Nov 4, 2015
  14. Clocknut

    Clocknut Guest

    Messages:
    20
    Likes Received:
    0
    GPU:
    Palit 750Ti StormX Dual
    The 750 Ti is a different Maxwell from the 900 series. Maxwell 1.0 is unaffected, right?
     
  15. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,495
    Likes Received:
    124
    GPU:
    GTX Titan Sli
    Not increased performance in general perhaps, but those "Game Ready" drivers are advertised as being the best-performing drivers for the specific game.
     

  16. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,946
    Likes Received:
    1,050
    GPU:
    RTX 4090
    There was an issue with the game's code not being optimal for the Kepler architecture. Go bug the developers for not optimizing their code for older video cards. Stop bugging NV to fix everything in the world through the drivers, as that is the wrong way to handle this from any point of view.
     
  17. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,495
    Likes Received:
    124
    GPU:
    GTX Titan Sli
    If the game's code wasn't optimal for Kepler, then Nvidia shouldn't have been able to get the performance increase they did via the drivers, should they?

    Studios code against DirectX or OpenGL, and it's up to Nvidia to have performant drivers that handle all the calls and/or bugs on the supported GPUs.
     
  18. GanjaStar

    GanjaStar Guest

    Messages:
    1,146
    Likes Received:
    2
    GPU:
    MSI 4G gtx970 1506/8000
    If you are going to dedicate time to this, at least test games.

    Synthetic benchmarks are useless both for overclock stability testing and for performance comparisons across driver sets, as they don't reflect game performance at all.

    Their only purpose is to compare your score with that of a similar system to see if there's an issue somewhere.
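    If anyone does end up timing game runs across driver sets, here's a minimal sketch for turning them into comparable numbers, assuming a plain text log with one frame time in milliseconds per line (the kind of per-frame data FRAPS-style capture tools can produce); the file names and the 1% low definition are my own choices:

    import sys

    def fps_stats(path):
        """Return (average FPS, 1% low FPS) for one logged benchmark run."""
        with open(path) as f:
            frame_ms = [float(line) for line in f if line.strip()]
        avg_fps = 1000.0 * len(frame_ms) / sum(frame_ms)
        # "1% low" here = average FPS over the slowest 1% of frames.
        slowest = sorted(frame_ms, reverse=True)[: max(1, len(frame_ms) // 100)]
        low_fps = 1000.0 / (sum(slowest) / len(slowest))
        return avg_fps, low_fps

    if __name__ == "__main__":
        # e.g. python fps_stats.py witcher3_355.98.txt witcher3_358.91.txt
        for path in sys.argv[1:]:
            avg, low = fps_stats(path)
            print("%s: avg %.1f fps, 1%% low %.1f fps" % (path, avg, low))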
     
  19. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,110
    Likes Received:
    2,611
    GPU:
    3080TI iChill Black
    http://www.hardwarecanucks.com/foru...ws/70125-gtx-780-ti-vs-r9-290x-rematch-7.html

    A slightly OC'ed GTX 780 at 1050-1100 MHz boost surpasses a stock 780 Ti, and you can see how a stock 780 Ti goes head to head with a GTX 980, so saying the GTX 970 is that much faster than the GTX 780 is just wrong...

    There are a few minor exceptions where the new Maxwell texture compression kicks in and gives a small leap, but not by much once you raise MSAA or go to a higher resolution.

    Just saying.
     
  20. GanjaStar

    GanjaStar Guest

    Messages:
    1,146
    Likes Received:
    2
    GPU:
    MSI 4G gtx970 1506/8000
    http://forums.guru3d.com/showpost.php?p=5179287&postcount=421

    I wouldn't call 20+ fps a small leap. The newer the game/tech, the bigger the gap, in my experience. Feel free to compare your 780 to my other recent benches in this post.

    http://forums.guru3d.com/showpost.php?p=5172925&postcount=59
     