Five generations of Nvidia cards compared

Discussion in 'Videocards - NVIDIA GeForce' started by alanm, Dec 23, 2014.

  1. alanm

    alanm Ancient Guru

    Messages:
    12,273
    Likes Received:
    4,477
    GPU:
    RTX 4080
    Five gens of Nvidia cards compared. 480, 580, 680, 780, 780ti, 980.

    http://www.techspot.com/article/928-five-generations-nvidia-geforce-graphics-compared/

    Biggest performance jump in percentage terms was the 580 to the 680, which was 41% over the 580 @ 1920x1080 and 56% @ 2560x1600. So, once again, a good reminder that lower bit-bus specs are meaningless when it comes to newer-gen cards vs old. Something many of us have known since the midrange 6600GT pwned the preceding flagship cards, which had twice the bus width, a decade ago.
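
    The bus-width point is easy to put in numbers: peak bandwidth is bus width times effective memory clock, so a narrower bus on faster memory can match a wider bus on slower memory. A quick sketch in Python (the 580/680 specs below are the commonly published ones, assumed here purely for illustration):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
# Card specs are the usual published numbers, assumed here for illustration.

def bandwidth_gbs(bus_width_bits, effective_mtps):
    """Peak bandwidth in GB/s: bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * effective_mtps * 1e6 / 1e9

gtx_580 = bandwidth_gbs(384, 4008)  # wide bus, slower GDDR5
gtx_680 = bandwidth_gbs(256, 6008)  # narrower bus, faster GDDR5

print(f"GTX 580: {gtx_580:.1f} GB/s")  # ~192.4 GB/s
print(f"GTX 680: {gtx_680:.1f} GB/s")  # ~192.3 GB/s
```

    Despite the 680's "lower" 256-bit spec, both cards end up with roughly the same peak bandwidth, which is why the bus number alone tells you little.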
     
  2. kislotikas

    kislotikas Guest

    Messages:
    42
    Likes Received:
    0
    GPU:
    8800GT Zilent OC 512MB
    That's what I like to read ;)

    Thanks for the link, I really like old-vs-new CPU/GPU reviews. I'm still on a 2008 rig, an E8400 @ 3.7GHz with an 8800GT. Hope to finally change my PC in 2015, so I like to check on the progress. :pc1:
     
  3. Kaapstad

    Kaapstad Guest

    Messages:
    146
    Likes Received:
    0
    GPU:
    4 x Titan X
    The only problem with your theory is that the GTX 780 and 680 are part of the same generation (Kepler), just released at different times, and the 780 packs a 384-bit bus.

    @4K, comparing 256-bit and 384-bit Kepler cards really shows up the differences between the buses.
     
  4. alanm

    alanm Ancient Guru

    Not my theory, just using the words in the article. "Generation" as used there obviously wasn't strictly meant to mean the same arch. You also overlooked Fermi (480 and 580), which were closer to one another than the 680 vs the 780.

    The bit bus isn't necessarily the sole determining factor either - the 780 also has 50% more VRAM capacity and more grunt to churn out the fps, so 'your theory' that the bit bus is responsible falls flat here.
     

  5. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    43% sounds like a lot, but when you look at the numbers it's mostly ~15-17 fps, 580 GTX vs 680 GTX - nothing really groundbreaking.

    Full Kepler to Maxwell is even less, 5-12 fps at best at 1080p.


    At least there is a proper 100% jump from last-gen high-end (580 GTX) to the current high-end (780 Ti); hopefully the next-gen high-end Maxwell, GM200, will make the same jump vs GK110.
     
    Last edited: Dec 23, 2014
  6. alanm

    alanm Ancient Guru

    15-17 fps is meaningless on its own. It means nothing in a high-fps game, but it's a world of difference in something like Crysis or Metro. Something you oddly never seem to grasp.

    But if we were to look at it on your terms - that 15-17 fps is not "really groundbreaking" - then you're probably peeved that the 780 is only around 9 fps faster than the 680 in Crysis 3 @ 1920x1200, and only 6 fps faster at 2560x1600.

    So stop quoting fps deltas without context; it doesn't help your argument.
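
    To make the "fps without context" point concrete: the same absolute fps delta is a big relative gain at a low baseline and noise at a high one. A quick illustration (fps figures are made up, not from the article):

```python
def pct_gain(new_fps, old_fps):
    """Relative improvement, in percent, of new_fps over old_fps."""
    return (new_fps - old_fps) / old_fps * 100

# The same ~15 fps delta at two different baselines:
print(pct_gain(45, 30))    # +50% in a heavy game - transformative
print(pct_gain(215, 200))  # +7.5% in a high-fps game - barely noticeable
```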
     
  7. Kaapstad

    Kaapstad Guest

    Not really. To test what I said earlier, you use games or benches that don't use more VRAM than the 256-bit card has, so that you're testing the bus itself.
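
    For what it's worth, that methodology amounts to comparing the relative drop each card takes when stepping up in resolution, rather than the absolute fps. A rough sketch (fps figures invented for illustration, not Kaapstad's data):

```python
def pct_drop(low_res_fps, high_res_fps):
    """Performance lost, in percent, when stepping up in resolution."""
    return (low_res_fps - high_res_fps) / low_res_fps * 100

# Hypothetical numbers: if a 384-bit card drops 40% going 1440p -> 4K
# while a 256-bit card drops 50%, the extra drop hints at a bandwidth
# bottleneck - provided neither card overflows its VRAM.
print(f"384-bit card: {pct_drop(60.0, 36.0):.0f}% drop")  # 40%
print(f"256-bit card: {pct_drop(60.0, 30.0):.0f}% drop")  # 50%
```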
     
  8. alanm

    alanm Ancient Guru

    And when "you test" what I said about 'GPU grunt', you can see that the bit bus *in and of itself* (which is basically what you implied in your earlier statement) is not as crippling a factor in GPU performance as you make it out to be - even in 4K benches.

    http://www.guru3d.com/articles_pages/gtx_780_ti_sli_geforce_review,22.html

    Look at the 760 SLI (256-bit/2GB VRAM Kepler) vs the 780 Ti in the 4K benches above.

    So getting back to your original point: that the 780 beats the 680 at 4K because of the bit bus is not entirely correct - it mainly beats it because of more GPU power. A wider bus can help push more bandwidth through when called for, but it is not the sole determining factor. If you meant otherwise, then maybe you should re-word what you said a bit more carefully.
     
  9. transdogmifier

    transdogmifier Guest

    Messages:
    5
    Likes Received:
    0
    GPU:
    GTX 680
    alanm, thanks for the link... makes it easier to decide about updating to a 970. Yeah, yeah, I'm still tossing it around... don't really want to spend $300+, but I want a new card ;)
     
  10. -Tj-

    -Tj- Ancient Guru

    Or your fanboyism. Oof, feel much better now :p

    Btw, if you want to nitpick, both the 580 and the 680 are far from playable @ 31 or 45 fps. Those are just average fps, never mind the lows, and TechSpot has been so-so lately when it comes to benches - too biased.

    Perfect example, and we all know the game doesn't act like that, just saying. No offense ;)
    http://www.techspot.com/review/827-watch-dogs-benchmarks/page5.html
     
    Last edited: Dec 23, 2014

  11. VultureX

    VultureX Banned

    Messages:
    2,577
    Likes Received:
    0
    GPU:
    MSI GTX970 SLI
    Have you even PS3'ed, lol? What's up with all these people saying that anything between 30 and 60 fps average is not playable?
    The older-gen consoles can't even keep a steady 30 fps most of the time, especially in newer titles. Also, have a look at G-Sync and tell me again that anything above 30 fps average is not playable. You'll find it more playable than a v-synced game running an average of 70 fps on a 120Hz monitor.
     
  12. -Tj-

    -Tj- Ancient Guru

    Yes, I played on the PS3, but not FPS games meant for PC.

    Btw, then sell your 2x GTX 970s and buy a GTX 580 to play Crysis 3 - I bet you will love it ;D
     
    Last edited: Dec 23, 2014
  13. VultureX

    VultureX Banned

    I played that game on a single 8800GTX at 1200p and had a blast. Seriously now.

    Oh wait Crysis 3 :D Never mind xD I played that on a single 670GTX, though. It was smooth.
     
  14. -Tj-

    -Tj- Ancient Guru

    So was the GTX 580, kinda. With shaders on high, avg 50-ish fps.


    Anyway, the chart is interesting, but only up to a point - small leaps gen to gen, especially because the article makes it seem like the mid-range performance chips, aka GK104 and GM204, are real generational step-ups, while in fact it's just a GK104-to-GM204 step-up. Yes, those are OK speedups, just like high-end GF110 to GK110 to the future GM200 :nerd:
     
  15. VultureX

    VultureX Banned

    https://www.youtube.com/watch?v=WRBzifzqNHM&list=PLBB9260DBCB42ED40

    Tell me that's not smooth to watch. I could keep 30 fps most of the time, and that was 1.5 years ago. I played this game v-synced as well and it still felt smooth to me. That engine is really great; I would say the same of Crysis 1.
    How a game feels really depends on the engine - if the fps is not all over the place, you can have a really smooth experience even at a low average fps.
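
    The "fps not all over the place" point is really about frame pacing: two runs with similar averages can feel completely different if one has frame-time spikes. A small illustration (frame times invented for the example):

```python
import statistics

# Two runs with similar average fps but very different pacing
# (frame times in milliseconds):
steady = [25, 25, 25, 25, 25, 25, 25, 25]  # locked ~40 fps
spiky  = [15, 15, 60, 15, 15, 60, 15, 15]  # similar average, visible stutter

for name, frames in (("steady", steady), ("spiky", spiky)):
    avg_fps = 1000 / statistics.mean(frames)
    print(f"{name}: avg {avg_fps:.1f} fps, worst frame {max(frames)} ms")
```

    Both runs average close to 40 fps, but the spiky one dips to 60 ms frames (momentarily ~17 fps), which is what reads as stutter.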
     

  16. -Tj-

    -Tj- Ancient Guru

    So why are we debating this again? It was the OP who made out that 30 fps is not playable while the 680 GTX at 45 fps is; I just said neither is, if you want to look at it that way.

    Anyway, I stand by what I said before - for me personally, neither is playable anymore; I'm a 60fps+ whore.
    Not in the mood to discuss further. I'm out of this thread, cya. :)
     
  17. moab600

    moab600 Ancient Guru

    Messages:
    6,660
    Likes Received:
    557
    GPU:
    PNY 4090 XLR8 24GB
    NVIDIA has been pretty amazing since the GTX 480 - they halved the TDP while providing better performance.

    While the difference between the 680/770 and the 780 is not that great without an OC, once you OC the 780 it is way, way better than the 770 - like 50-70%, depending on the game (and your OC).
     
  18. GPU

    GPU Guest

    So a card (the 680) with boost is xxx faster than a 580 with no boost - shocked.
    Same BS as when the 680 vs 580 comparisons were first published; it should be max OC for that set of cards.


    With no boost clocks given for any of the cards, this should go in the trash bin imo.
     
    Last edited by a moderator: Dec 24, 2014
  19. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
    I remember that lame driver that came out with the 680 limited my 580s to 999MHz - I couldn't go over 1GHz even though the Lightnings had more in them :bang:
     
  20. Kaapstad

    Kaapstad Guest

    Well, actually, no.

    When I tested, I looked at the % drop in performance between the cards at 4K, not the GPU grunt.

    I do have Kepler 256-bit and 384-bit cards, and I use 4K. :)

    I also have 512-bit Hawaii and 256-bit Maxwell to compare with. :)
     
