NVIDIA preparing Maxwell GM204 and Kepler GK210 GPUs.

Discussion in 'Videocards - NVIDIA GeForce' started by TheDeeGee, Apr 19, 2014.

  1. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    Huh, what are you talking about? My 680 SLI overclocked to 1280 was 40% faster than 580 SLI at 900. Where are you getting 30 fps to 40? Stock 680 vs stock 580 is about a 30% difference. You can't compare shaders across different architectures; Fermi, Kepler, and Maxwell are not the same.
     
  2. Strikerx80

    Strikerx80 Ancient Guru

    Messages:
    5,347
    Likes Received:
    0
    GPU:
    eVGA 980 Ti SC +120/+300
    You said it was only a 30% increase in performance, so plus 10 fps, from 30 to 40, is a frame-rate increase of 33%.
     
    Last edited: Apr 21, 2014
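The percentage back-and-forth here is just the relative-gain formula; a minimal sketch, using the hypothetical 30-to-40 fps jump from this exchange (and the same gain applied at 60 fps):

```python
def pct_increase(old_fps: float, new_fps: float) -> float:
    """Relative frame-rate gain as a percentage."""
    return (new_fps - old_fps) / old_fps * 100.0

# The jump debated above: 30 -> 40 fps
print(round(pct_increase(30, 40), 1))  # 33.3
# The same ~33% gain at 60 fps is a 20 fps jump: 60 -> 80
print(round(pct_increase(60, 80), 1))  # 33.3
```

The absolute jump doubles at the higher frame rate even though the relative gain is identical, which is the point both sides are circling.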
  3. eclap

    eclap Banned

    Messages:
    31,468
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
    Yes, and what's wrong with that? When you're getting low fps, 10 extra fps is a godsend. At 60 fps it would be a 20 fps jump, which again is more than noticeable.
     
  4. Strikerx80

    Strikerx80 Ancient Guru

    Messages:
    5,347
    Likes Received:
    0
    GPU:
    eVGA 980 Ti SC +120/+300
    Nothing is wrong with it; it just seems you should see closer to a 66% increase given the difference in the specs. Anyway, if the 680 only increased 30% over the 580, then there's no way this rumored Maxwell spec would be an increase at all over the 780 Ti.
     

  5. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    30% is around the average; some games will be less, some more. That's a massive difference when you're getting lower fps in a game; it could go from unplayable to playable. These jumps in performance are massive compared to Intel CPUs, where they seem more interested in lowering power consumption, imo. I estimate a 50% increase over 680 SLI, both overclocked.
     
  6. Xenotone

    Xenotone Guest

    Messages:
    681
    Likes Received:
    0
    GPU:
    780 Ti SLI H2O 1230mhz
    It seemed to me like the 580 / 670 / 760 all have similar horsepower really, depending on the situation of course. I went from 670 SLI to a 780 Ti and it's got very nearly the same grunt when overclocked, a little bit less maybe.

    Can't wait to get another 780 Ti, OH YEAH :D
     
  7. SLI-756

    SLI-756 Guest

    Messages:
    7,604
    Likes Received:
    0
    GPU:
    760 SLI 4gb 1215/ 6800
    If you remember back to the initial release of the 670s, they flew like almost no card ever did before them; some handy work was put into the 670s, imo.
    I see the 760 as basically the same (a better overall card design though, plus Boost 2.0), and on minimum frames they both kill the 580.
    I remember trying BF3 on a single 580, ouch.
    With a 760, BF4 was delicious.
    All I've been doing for a week is reading GPU reviews and the like. :(
     
    Last edited: Apr 21, 2014
  8. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    You're a masochist, you know that, right?
     
  9. fry178

    fry178 Ancient Guru

    Messages:
    2,078
    Likes Received:
    379
    GPU:
    Aorus 2080S WB
    @pbvider
    Well, if you feel enthusiastic about 2-year-old technology...

    Being able to sell "the same" product for 2 years (on the PC market) is a pretty long time and shows how "good" it is.

    Have you ever seen the (open) market or manufacturers hold on to a crappy product? I haven't.

    Just because something is newer doesn't automatically make it better: how many people own a Galaxy S4 and will get the S5?
    So it's better to put out a new product that "no one" buys, but at least it's new?! ;-)
     
  10. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Are you implying that the S5, with a superior processor, battery life, and camera, is somehow inferior to the S4? Also, the S5 was reported to be selling nearly double what the S4 did in its first week on sale.

    I mean, I guess I'm not sure what people expect. It costs a lot of money to develop a GPU architecture, literally close to $1B now. Nvidia isn't going to recoup that cost unless they stagger their product launches over 2 years and milk every penny out of it. It also doesn't help that 28nm had horrible yields at launch, and TSMC is now charging per wafer rather than per working chip. 20nm is looking even worse.
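    The per-wafer vs. per-working-chip pricing point can be sketched with toy numbers (the wafer price, die count, and yields below are illustrative assumptions, not actual TSMC figures):

    ```python
    def cost_per_good_die(wafer_price: float, dies_per_wafer: int, yield_frac: float) -> float:
        """Effective cost of one working chip when the foundry bills per wafer."""
        return wafer_price / (dies_per_wafer * yield_frac)

    WAFER = 5000.0  # hypothetical wafer price
    DIES = 100      # hypothetical candidate dies per wafer

    # Under per-wafer billing the customer pays for the defective dies too,
    # so dropping from 90% to 50% yield nearly doubles the effective chip cost.
    print(round(cost_per_good_die(WAFER, DIES, 0.9), 1))  # 55.6
    print(round(cost_per_good_die(WAFER, DIES, 0.5), 1))  # 100.0
    ```

    Per-working-chip billing would leave that yield risk with the foundry, which is why the switch to per-wafer pricing stings on a low-yield node.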
     

  11. pbvider

    pbvider Guest

    Messages:
    989
    Likes Received:
    0
    GPU:
    GTX
    Judging by GPU history, I'll say that newer is always better, but if you like 28nm so much, you're free to stay with it; for me, 20nm is what I want/prefer. You can't compare a phone to a GPU, that's just silly.
     
  12. GPU

    GPU Guest

    What is a 20nm Maxwell? Just a dream, imo.

    I agree [on 20nm], but when is the big Maxwell on 20nm [14-billion-ish transistors] coming?
    Late 2015, or later? A year after small Maxwell on 20nm (Q4 2014 or Q1 2015, plus a year, = H1 2016)?
    From what we know now, small Maxwell vs big Maxwell would be pissing away money, nor should NV be rewarded for their marketing by people buying a $600.00 small Maxwell.
     
  13. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    I mean, I don't know about Maxwell, the pricing isn't really set yet, but for Kepler I don't see how 680 users "pissed away money" or got screwed in any way. If you bought a 680, you could have easily picked up a second when the 780 Ti launched, and for $350 more you could have had the 680's performance for the initial year and slightly better performance for the rest of the 780 Ti's life.

    I play games with a friend who has a 780 Ti, and my 690 often outperforms it. There are very few games where SLI scaling is an issue, and even in those games it's pointless to worry because I'm already getting way over 140 fps. Dual 680s would be even faster than my 690. And if you don't mind buying used, there were 680s going for $350 when the Ti launched; you can pick one up now for $275.

    Yeah, I guess that's at best a sidegrade, or even a slight downgrade if you couldn't buy used, but it's not terrible at all. It depends on how much you value that initial year of extra performance.

    Edit: Of course, if you're just going to SLI every series you get, that's a different story. But yeah.
     
    Last edited: Apr 22, 2014
  14. GPU

    GPU Guest

    I understand what you're saying, but you have a 690, and had next-gen games been released in 2014, most cards would be playing on low to medium.
    - When you paid $1k for that card, NV was stockpiling GK110s for the next release, for when sales slowed down.
    - They had GK110 at the same time as the 680s but held it back, then released $1k Titans. [Sales slowed down.] Then $650 GTX 780s; AMD released the 290X, then the 780 dropped to $500, but they had the [$700] 780 Ti PLUS all the SKUs after: 6GB 780s, Titan Blacks.
    Same GK110 chip, folks. What do you think will happen with Maxwell?
     
    Last edited by a moderator: Apr 22, 2014
  15. Blackops_2

    Blackops_2 Guest

    Messages:
    319
    Likes Received:
    0
    GPU:
    EVGA 780 Classified
    Truth be told, unless you're running 1440p or 1080p @ 120Hz+, a 7970/680/770 and the like are still really, really viable. It would be a lie for me to sit there and act like I needed a 780. My 7970 @ 1125/1575 would max any game I've bought at 60+ fps, with the exception of Metro LL and Crysis 3. I just picked up a 780, but I got a steal on a lightly used one, and I've yet to surpass my $400 mark for a GPU. That's the only part that has bothered me about this whole ordeal of Hawaii and GK110: we've effectively established $700 flagship cards. That blows.
     

  16. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    Do you have proof of this, or are you presenting your opinion as fact?

    AFAIK, GK100 (as it was known then) was not even stable enough to fill orders placed by the government, and none of those were full 15-SMX units.

    So again, proof or GTFO!
     
  17. GPU

    GPU Guest

    Mine's at $500.00 x2.
    - And I have no GPU budget limit if required; I just don't like NV's marketing of holding back SKUs, e.g. 2GB VRAM, then 3 months later a 4GB version for +$100. But when AMD has something, they'll release a 770 2GB and a 4GB [+$50] at the same time.
    - NV yields on GK110: cry me a river, a $58 chip after yield losses on a $1k card [per the net].
     
  18. GPU

    GPU Guest

    Hey, if they had Titans, they had cut-down 780s, or can't you count cores?
    How many $1k Titans would have sold if the 780s had been released at the same time? That is my point.
    They can do a big 780 Ti fanfare with no cards in stock when AMD has something faster; they have no problem with that, do they?
     
    Last edited by a moderator: Apr 22, 2014
  19. Blackops_2

    Blackops_2 Guest

    Messages:
    319
    Likes Received:
    0
    GPU:
    EVGA 780 Classified
    I get that as well. Hell, I've historically had nothing but Nvidia, but they do some things that bother me, like holding back SKUs as you mention. Mainly I hate that we got a part destined for the mid range sold as high end, but it makes perfect sense on Nvidia's end. And there's no real way of saying AMD wouldn't do the same given the position. I see the same likely happening with Maxwell, though I highly doubt we see 20nm Maxwell until 2015. I can't see a 28nm refresh either. I think we're stuck with current 28nm tech until TSMC gets to Nvidia/AMD and 20nm yields increase.
     
  20. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Actually, 20nm is ready and rolling at TSMC and GloFo.
    Late 2015 we'll have 14nm FinFET already, and maybe risk production of 14nm FinFET+.

    And as for news, Samsung doesn't want to wait until late 2015, but as said, that's just low-power mobile devices.

    What matters for a GPU:
    Does it have lower leakage?
    Can it clock higher?
    Does it have a low enough defect rate?

    The 1st and 2nd can be answered with: "A bit, yes."
    The 3rd is yes for small enough chips, like 6x8mm, but not for something like 22x25mm, where you may trash 85%.
    For some time to come, even GPUs will have to go the split route, where they make smaller pieces, test them, and put the working ones together, to avoid one defect per 1500mm^2 killing 1/3 of the GPUs produced.
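    The die-size/yield trade-off above follows the standard Poisson zero-defect yield model; a sketch using the post's own figures (one defect per 1500mm^2, and the 6x8mm vs 22x25mm die sizes):

    ```python
    import math

    def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
        """Fraction of dies expected to have zero defects (Poisson model)."""
        return math.exp(-die_area_mm2 * defects_per_mm2)

    D = 1 / 1500  # one defect per 1500 mm^2, as stated above

    small = poisson_yield(6 * 8, D)    # ~48 mm^2 die: ~97% survive
    big = poisson_yield(22 * 25, D)    # ~550 mm^2 die: ~69% survive
    print(f"small die yield {small:.0%}, big die yield {big:.0%}")
    ```

    At this defect density roughly a third of the big dies are lost, matching the "kill 1/3" figure; the 85% trash rate quoted would imply a considerably higher defect density than one per 1500mm^2.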
     
