Nvidia is deliberately downgrading Kepler GPUs' performance in favor of Maxwell.

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Angrycrab, May 18, 2015.

Thread Status:
Not open for further replies.
  1. zoomer-fodder

    zoomer-fodder Guest

    Messages:
    82
    Likes Received:
    0
    GPU:
    ASUS ROG Poseidon 780 SLI
    What a shame, Nvidia. I'm never buying their cards again. In the future I'm going only for the best technology from AMD, like HBM memory, liquid cooling, Mantle/Vulkan, etc. Nvidia makes crap.
     
  2. Prophet

    Prophet Master Guru

    Messages:
    865
    Likes Received:
    34
    GPU:
    Msi 680
    Would you kindly provide some evidence, preferably in the form of a benchmark? Compare against 350.05 if you will.
     
  3. Yopfraise

    Yopfraise Member Guru

    Messages:
    145
    Likes Received:
    27
    GPU:
    MSI Suprim X 4090
    A little comparison, maybe? :)
     
    Last edited: May 19, 2015
  4. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    No, it doesn't matter that Maxwell is vastly more efficient than Kepler, especially at tessellation, right? Just 'cuz reasons.

    (There is a lot of tessellation in TW3, by the way: water, hair, geometry in the environments.)

    It's amazing to see people get worked up when the inefficiency of their card starts to show once games use more intensive technologies.

    It will happen to Maxwell too, no doubt about it.



    [image: benchmark screenshot]

    I'm a tad CPU/PCI-E 2.0 bottlenecked.

    Also, I'm using 347.88, as if it makes a difference.
     
    Last edited: May 19, 2015
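    As a rough illustration of why heavy tessellation matters here (the numbers below are illustrative assumptions, not measured Witcher 3 data): the triangle output of the tessellator grows roughly with the square of the tessellation factor, so the geometry load climbs very steeply as the factor is pushed up. A minimal Python sketch:

    # Illustrative sketch: assumed patch count, not a measurement from TW3.
    base_patches = 10_000   # hypothetical patch count for hair/water in one frame

    for tess_factor in (1, 8, 16, 64):
        # Output triangles grow roughly with the square of the factor.
        tris_per_frame = base_patches * tess_factor ** 2
        print(f"factor x{tess_factor:<2}: ~{tris_per_frame / 1e6:7.2f} M triangles/frame")

    The only point of the sketch is that geometry throughput scales steeply with the factor, which is where Maxwell's stronger tessellation hardware pulls ahead of Kepler.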

  5. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,107
    Likes Received:
    2,611
    GPU:
    3080TI iChill Black
    ^
    Nope, you're bottlenecked by your 256-bit bus there. That new Maxwell compression doesn't help here.

    And this is the real scenario of how fast GM204 really is compared to GK110 when that compression doesn't count; same thing in Hitman: Absolution, COD: Ghosts, Dirt 4 and a few more.
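    For reference, the raw arithmetic behind the bus-width argument. A minimal sketch using the reference memory specs of each card; factory-overclocked models will differ slightly:

    # Theoretical memory bandwidth = effective memory clock * bus width / 8.
    # Reference-spec figures; compression effects are ignored here.
    cards = {
        "GTX 780 Ti (GK110)": (7_000, 384),  # effective MHz, bus width in bits
        "GTX 780    (GK110)": (6_000, 384),
        "GTX 980    (GM204)": (7_000, 256),
    }

    for name, (mem_mhz, bus_bits) in cards.items():
        gb_per_s = mem_mhz * 1e6 * bus_bits / 8 / 1e9
        print(f"{name}: ~{gb_per_s:.0f} GB/s")

    On paper the GK110 cards have roughly 30-50% more raw bandwidth; Maxwell's delta colour compression claws part of that back (Nvidia's own material quotes average savings in the region of 25%), which is the point fellix raises a couple of posts further down.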
     
  6. Prophet

    Prophet Master Guru

    Messages:
    865
    Likes Received:
    34
    GPU:
    Msi 680
    You are making it out like it's ok to gimp your products in order to sell newer ones.
     
  7. fellix

    fellix Master Guru

    Messages:
    252
    Likes Received:
    87
    GPU:
    MSI RTX 4080
    Actually, it does help; without it the difference would be even larger, since this particular benchmark is quite sensitive to memory bandwidth.
     
  8. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,495
    Likes Received:
    124
    GPU:
    GTX Titan Sli
    That attempt will backfire if customers come to believe Nvidia stops improving its older cards once the new ones are out.
     
  9. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    I don't think your card is PCI-E 2.0 bottlenecked at all. My single 780 Ti was at 2.0 16x. Maybe a tad in the CPU department, and Valley does love video RAM bandwidth as well.
     
  10. Sergio

    Sergio Guest

    Messages:
    254
    Likes Received:
    7
    GPU:
    Asus 760 DirectCU II OC
    http://us.download.nvidia.com/Windows/352.86/352.86-win8-win7-winvista-desktop-release-notes.pdf

     

  11. Yopfraise

    Yopfraise Member Guru

    Messages:
    145
    Likes Received:
    27
    GPU:
    MSI Suprim X 4090
    Okay, it seems that one of the problems is Nvidia GameWorks.
    Here is an interesting link: http://*************/nvidia-responds-witcher-3-gameworks-controversy/#ixzz3aWEA0v1z

    "It’s very worrisome when one hardware company begins to implement proprietary code into video games which are naturally expected to run on a variety of hardware from different vendors. Worse yet if the code can’t be optimized for other vendors as CD Projekt Red has stated. However one could and would rightfully argue that every company including Nvidia and AMD has the right to develop and optimize technologies for its hardware. No one should disagree with that, however issues arise from the corporate politics that ensue. And that’s when other hardware vendors are denied the opportunity to optimize for games as they normally would, or provide an alternative to code that performs poorly.

    It’s quite unfortunate that Nvidia has taken this recent turn with GameWorks towards locking code and limiting control. A future where the competitors’ only choice is to fill the game with even more proprietary code of their own to compete is not one that gamers or developers will want or appreciate.
    Ultimately the decision to partake in the GameWorks program is that of the game developer. The implications however are often seemingly overlooked or misunderstood. And that’s when gamers have to step in as they did on reddit to make their voices heard. Make yours heard below in the comments."
     
  12. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    They are not gimping their products.

    They are basing all optimizations on Maxwell now, and putting Kepler behind. For progress, you must discard the old. Nothing wrong with that considering Nvidia has a product to replace Kepler that is much more efficient.
     
    Last edited: May 19, 2015
  13. GuruKnight

    GuruKnight Guest

    Messages:
    869
    Likes Received:
    17
    GPU:
    2 x 980 Ti AMP! Ex
    These kinds of NVIDIA bashing threads are just getting OLD.
    And FYI, a single GTX 780 Ti is actually a tad faster than a single GTX 980 in Crysis DX9 with 8xSGSSAA forced using the "0x000002C1" AA flag, mainly due to the higher memory bandwidth of Kepler's 384-bit bus.

    Only the Titan X has recently surpassed Kepler in this respect.
    This is one of the main reasons I skipped the GTX 970/980 series and am still waiting to pick up a couple of 980 Tis or Titan Xs ;)

    But as MrBonk said, in more recent tessellation-intensive titles like The Witcher 3 the Maxwell cards will have the advantage.
    However, I'm fairly certain things will improve for Kepler users in TW3 with a couple of patches and more optimized drivers.
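    To put the SGSSAA example in perspective (a back-of-the-envelope sketch, not a measurement): sparse-grid supersampling shades every sample, so shading and framebuffer load scale roughly with the sample count, which is where raw memory bandwidth starts to dominate.

    import math

    # Rough equivalence: with SGSSAA every sample is fully shaded, so 8x at
    # 1080p is loosely comparable to rendering at ~8x the pixel count.
    width, height = 1920, 1080  # assumed output resolution

    for samples in (2, 4, 8):
        scale = math.sqrt(samples)
        eq_w, eq_h = int(width * scale), int(height * scale)
        print(f"{samples}xSGSSAA @ {width}x{height} ~ shading {eq_w}x{eq_h} "
              f"({samples}x the pixels)")

    At those effective pixel counts the extra raw bandwidth of the 384-bit GK110 cards can outweigh Maxwell's per-clock advantages, which fits the behaviour described above.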
     
  14. PhazeDelta1

    PhazeDelta1 Guest

    Messages:
    15,607
    Likes Received:
    14
    GPU:
    EVGA 1080 FTW
  15. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296

  16. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,495
    Likes Received:
    124
    GPU:
    GTX Titan Sli
    Aye. No matter how upset one may be over a possible lack of Kepler optimizations, one still can't deny that Nvidia supports its older graphics cards for much longer than AMD and Intel do.

    AMD's last driver for its DX9 GPUs was released in 2010 and had no official Windows 7 support. Likewise, AMD's DX10 cards never got official Windows 8.1 support.
     
    Last edited: May 19, 2015
  17. johnathonm

    johnathonm Member Guru

    Messages:
    127
    Likes Received:
    3
    GPU:
    Nvidia 2080 ti 12GB
    I honestly don't know. Could someone post a list of which cards fall into the 600 and 700 series? I've been trying to figure out whether my GTX 680M has been abandoned, and whether I need to start counting pennies to save up for a new rig. Additionally, if my card has been abandoned, is there a best driver for it or some such?

    Thanks.
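    For what it's worth, the desktop and mobile GTX 600/700 series are almost all Kepler, with a few exceptions (the GTX 750/750 Ti are first-generation Maxwell, and some low-end 600-series parts are Fermi rebrands); the GTX 680M is a Kepler GK104 mobile chip. A tiny lookup sketch, with the mappings filled in from memory rather than from this thread, so double-check any specific SKU against Nvidia's spec pages:

    # Quick architecture lookup for the cards mentioned in this thread.
    # Mappings are from memory, not from the thread itself.
    ARCH = {
        "GTX 680":     "Kepler (GK104)",
        "GTX 680M":    "Kepler (GK104, mobile)",
        "GTX 780":     "Kepler (GK110)",
        "GTX 780 Ti":  "Kepler (GK110)",
        "GTX TITAN":   "Kepler (GK110)",
        "GTX 750 Ti":  "Maxwell gen 1 (GM107)",  # 700-series name, Maxwell silicon
        "GTX 970":     "Maxwell gen 2 (GM204)",
        "GTX 980":     "Maxwell gen 2 (GM204)",
        "GTX TITAN X": "Maxwell gen 2 (GM200)",
    }

    def architecture(card: str) -> str:
        """Return the GPU architecture for a known card name."""
        return ARCH.get(card, "unknown - check Nvidia's spec sheet")

    print(architecture("GTX 680M"))  # -> Kepler (GK104, mobile)

    So the 680M falls in the same Kepler bucket as the desktop 600/700-series cards discussed here.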
     
  18. Prophet

    Prophet Master Guru

    Messages:
    865
    Likes Received:
    34
    GPU:
    Msi 680
    It would take a lot for casual users to even find this out.
     
  19. atimaniac

    atimaniac Master Guru

    Messages:
    265
    Likes Received:
    7
    GPU:
    MSI EVOKE RX 5700 X
    Check The Witcher on an older driver and you'll see what I mean. Post a benchmark... Are you playing games, or just trying to score more points in benchmarks?
     
  20. Sergio

    Sergio Guest

    Messages:
    254
    Likes Received:
    7
    GPU:
    Asus 760 DirectCU II OC
    How rude :)))) I will go to the next room and stop using my computer for a while after this mistake... Farewell :)) Can't stop laughing. Embarrassing...
     