Nvidia is deliberately downgrading Kepler GPUs' performance in favor of Maxwell.

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Angrycrab, May 18, 2015.

Thread Status:
Not open for further replies.
  1. stereoman

    stereoman Master Guru

    Messages:
    884
    Likes Received:
    181
    GPU:
    Palit RTX 3080 GPRO
    Nvidia should be supporting all of the products they currently sell, and that includes the 700 series; they should be optimizing for both the Maxwell and Kepler architectures. Favouring one over the other just because it's newer doesn't bode well for the consumer. Look at the life cycles of consoles: we're talking 7-8 years, and yet every year they continually work on optimizing and extracting more and more power. Sure, the GPU market moves a lot faster than that, but Kepler is still a fairly new architecture regardless. Quitting optimization on it in favour of newer products is just bad business, and it's really a waste considering the money and time that goes into the R&D for these products.
     
    Last edited: May 20, 2015
  2. gringopig

    gringopig Guest

    Messages:
    213
    Likes Received:
    5
    GPU:
    EVGA 980Ti SC
    Indeed, that's my opinion. It's all just opinion in the end.

    Interested in that car, though... I have not found one yet that speeds up over time. Do you have a green one?
    :)
     
  3. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    The 780 Ti was released in November 2013; it is well over a year old.
     
    Last edited: May 20, 2015
  4. gringopig

    gringopig Guest

    Messages:
    213
    Likes Received:
    5
    GPU:
    EVGA 980Ti SC
    Specifications are descriptions of the attributes, limitations and expected performance of a hardware design. Applying a specification, together with software design, to an end use results in measurable performance.

    You cannot attribute performance expectations to a non-existent application, i.e. a game title which was released years after the hardware was developed, so I would suggest that any limitation in performance is perhaps more to do with the end use. In this case, game developers produce that end use and the manufacturer of the hardware provides a software driver.
    Going back to an older driver revision may provide more information as to whether the driver is what is causing a perceived reduction in performance for any one end use (a rough way to compare the numbers is sketched below).

    Do the performance figures for older games still hold?
    Do the performance differences between one card and another still hold true for the games originally used as benchmarks, even on the most recent driver revisions?
    If they do, then nothing has changed except small shifts in relative performance in the most recent games, and since game developers do not focus on performance optimisations for video cards that are EOL, I'm not surprised that some anomalies appear in the occasional demanding title.
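
    For what it's worth, that comparison doesn't need anything fancy. A minimal sketch in Python, assuming you have per-frame frametime logs (milliseconds per frame, one value per line, e.g. from FRAPS or PresentMon) captured once on each driver revision; the filenames are just placeholders:

    # Rough sketch: compare average FPS between two driver revisions
    # using per-frame frametime logs (ms per frame, one value per line).
    # The filenames below are placeholders; substitute your own captures.

    def average_fps(path):
        """Read a frametime log (ms per frame) and return the average FPS."""
        with open(path) as f:
            frametimes_ms = [float(line) for line in f if line.strip()]
        if not frametimes_ms:
            raise ValueError("no frametime samples in " + path)
        avg_frametime_ms = sum(frametimes_ms) / len(frametimes_ms)
        return 1000.0 / avg_frametime_ms

    old_fps = average_fps("witcher3_780ti_driver_347.88.txt")
    new_fps = average_fps("witcher3_780ti_driver_352.86.txt")

    print("Old driver: %.1f FPS" % old_fps)
    print("New driver: %.1f FPS" % new_fps)
    print("Change: %+.1f%%" % ((new_fps - old_fps) / old_fps * 100))

    If the older driver produces the same numbers as the newer one for a given title, then the driver probably isn't the culprit there.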

    Unless this is all about a feeling of no longer being 'top dog'

    <<I think it is>>
     
    Last edited: May 20, 2015

  5. nhlkoho

    nhlkoho Guest

    Messages:
    7,755
    Likes Received:
    366
    GPU:
    RTX 2080ti FE
    The 700 series wasn't even a new architecture. Kepler started with the 600 series, which was released in 2012, so 3 years of performance updates is pretty good.

    Unless you bought a GPU directly from Nvidia, they have no obligation to you whatsoever. Going by your own example, you should be complaining to EVGA, PNY, ASUS, or whoever you bought your card from.
     
    Last edited: May 20, 2015
  6. eclap

    eclap Banned

    Messages:
    31,468
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
    Since when are there 18 months in a year?
     
  7. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    Don't mix up the mainstream-performance Kepler chip with the high-end Kepler chip.

    They are two totally different things; that's why Nvidia compared GK104 to GM204 in the first place.


    To me it looks like The Witcher is specifically optimized for GM204. That's why it got postponed numerous times in the last few months: to make these shiny new GM204 chips pull ahead of the rest. They obviously did something in the game code so that it acts the way it does and slows down on anything that isn't Maxwell.


    COD Ghosts has a proper HairWorks implementation (it looks like real hair too, if I can call that dog fur 'hair'), and there it magically runs better on, or the same as, GK110 versus the full GM204 chip (GTX 980).

    http://www.hardwarezone.com.sg/review-nvidias-geforce-gtx-980-bags-performance-boost-new-drivers

    http://ht4u.net/reviews/2015/nvidia_geforce_gtx_titan_x_im_test_-_gm200_im_vollausbau/index32.php


    Or MHFO's CryEngine HairWorks:
    http://forums.guru3d.com/showthread.php?t=396046

    Look specifically at the min and avg FPS.
     
  8. GPU

    GPU Guest

    Just to add:
    none of the games that have been tested lately were made after Maxwell was released.

    They were all being made while Nvidia was charging $1k for some Keplers, knowing those cards would not run well in those games (with GameWorks??). That is, if Nvidia isn't nerfing them in the drivers today. Either way it's no win in my books: they either knew, or they are causing the slowdown today, IMO.
     
  9. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    There is a huge difference in performance between GK104 and GK110; I had both. You will see the performance bump from a 980 to a 980 Ti fairly soon.
     
  10. j2.Nocturne

    j2.Nocturne Member

    Messages:
    27
    Likes Received:
    0
    GPU:
    eVGA 780TI Classified
    O.K....I NEVER post here....EVER.

    I lurk all the time...But I never post.....

    So for one of my only posts...


    My god, you are an ignorant prat... You honestly have absolutely no bloody idea what you are talking about, from either a low-level programmer's or a computer hardware engineer's point of view...

    Just stop. You are wrong, and you are already starting to look stupid; stop while you're still just at the line.



    Really eclap?

    You are going to argue over saying "a year" versus saying "a year and a half"?

    Don't you have better things to do with your life?


    Released to the public, however, =/= fully developed and shared with developers...

    However.....This is on nvidia....so I do agree that nvidia is in the wrong....but you are blaming them for the wrong reason.


    edit: oh god, I just noticed the computer I have next to my name... That was like 6+ full builds ago...
     

  11. GanjaStar

    GanjaStar Guest

    Messages:
    1,146
    Likes Received:
    2
    GPU:
    MSI 4G gtx970 1506/8000
    So does anyone have any game benchmark proof, or is this just internet hearsay?
     
  12. eclap

    eclap Banned

    Messages:
    31,468
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
    You do realize that 6 months is a long time in PC hardware? I bought my 970 less than 6 months ago and I'm already looking at what's coming out next.

    Also, the 700 series is hard to find now over here in the UK; most places will have a couple of 780s left at extortionate prices, and some won't even have them anymore. The 700 series is history. Yes, 18 months is a lot longer than you think.
     
  13. Shayne

    Shayne Master Guru

    Messages:
    353
    Likes Received:
    5
    GPU:
    MSI RTX 2070 Armor
    Here is a jacked-up 970 (no SLI), TK; nowhere near your beast. Progress in electronics is always about money, and if you need to have the best then you need to buy new. I want a 65" OLED UHD, but the price is not justified this year even if I can afford it no problem. My first CD burner was $5000. I find that drivers reach a plateau for each series, and I usually find myself using a favorite old driver at some point, as the new ones do not all perform as well. It is what it is, and I bet my 1070 will be closer to your setup right now.

    [IMG]

    It's all about what you can live with, and for me it has become going mid-range and upgrading more often. Maybe now I know how the SLI boys were gaming with their 680s; it just took a little more patience on my part.

    Regards
     
  14. gringopig

    gringopig Guest

    Messages:
    213
    Likes Received:
    5
    GPU:
    EVGA 980Ti SC
    Well, funnily enough I actually do know what I'm talking about, as I work in the electronics industry and have worked with engineering specifications for 25 years; but no matter. If you disagree, so be it, but be civil about it.
     
  15. morbias

    morbias Don TazeMeBro

    Messages:
    13,444
    Likes Received:
    37
    GPU:
    -
    As above; please play nice if you guys want this thread to remain open. Any further insults directed at another member will result in infractions being handed out.
     

  16. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    That particular bench loves VRAM bandwidth; I think that is a major factor in your score vs. mine. Was your VRAM speed at 8 GHz?
     
  17. SmashedBrain

    SmashedBrain Guest

    Messages:
    44
    Likes Received:
    0
    GPU:
    2x970@1500Mhz
    That would make sense, since the 780 Ti has a much wider memory bus.

    Comparing benchmark numbers without 8x MSAA could be interesting.

    Also, The Witcher 3 is much less dependent on memory bandwidth, since it has no MSAA and doesn't even use 3 GB at 4K resolution. But The Witcher 3 uses tessellation and GPU compute effects, in which the 970 is better than the 780 Ti.
     
  18. zoomer-fodder

    zoomer-fodder Guest

    Messages:
    82
    Likes Received:
    0
    GPU:
    ASUS ROG Poseidon 780 SLI
    Now I see the review is marked UPDATED, and the tester has adapted to the growing discontent among users and reworked the numbers in the tests, so that supposedly the 780 outperforms the 960. Does that mean a 780, without any optimisation in Project CARS and The Witcher 3, can beat a 960?
     
  19. psychok9

    psychok9 Master Guru

    Messages:
    319
    Likes Received:
    0
    GPU:
    MSI GTX 980TI Gaming 6G
    It could be that they don't patch games in the driver for Kepler the way they do for Maxwell and all the new games... (every game needs a specific profile and optimization to work at its best).

    If they abandon "optimization" for the previous architecture, that's very bad.
    Will the same happen with the Pascal generation and Maxwell?

    I'm very worried about it; I need to change my GPU now...

    If they do release a new driver with Kepler optimizations, I hope I will hear about it.
     
  20. zoomer-fodder

    zoomer-fodder Guest

    Messages:
    82
    Likes Received:
    0
    GPU:
    ASUS ROG Poseidon 780 SLI
    It's just the truth about Nvidia.
    Yes, some believe it...
    That should blow up the Nvidia forums, and Nvidia can't ignore this "no Kepler optimisation anymore".
     
    Last edited: May 21, 2015