The true high end comes in August 2012: GK110, probably 2304 (1D) shader units, 192 TMUs, 48 ROPs, a 384-bit memory interface, gaming power consumption probably ~250 watts, performance estimated at 390-450%. Beast ^ :biggun::biggun::biggun: http://www.3dcenter.org/news/nvidia-kepler-dualchip-loesung-auf-gk104-basis-im-mai
I am a bit disappointed Nvidia decided to release their mid-range card as high end... but it means we can anticipate a true high-end release later this year. It's not clear under what name, though: GTX 685? Of course the GTX 680 will cost quite a bit, but I am still aiming for two of these little beasties in SLI.
I am very disappointed with this outcome. I will not pay $500 for a GK104 card, no matter what it does compared to AMD's flagship.
So GK104 = 680, then GK110 = 690, dual GK110 = 695? Soon enough we'll have gfx cards named in line with their prices: Radeon HD 8999.99, GTX 799.99.
Don't know if there is going to be a dual GK110 card; I think the dual is supposed to be dual GK104. I was going to say an $800-900 GK110 card won't sell, as in my opinion no videogame ever has been or will be worth spending that sort of money on, but I'm sure the 8800 GTX Ultra did OK. I'm not paying $550 for a GK104 card either; just like with the 7970, I doubt the performance is there to justify the price.
Most likely. That would mean they go back to the 200-series numbering, where the GTX 285 was the high-end single GPU. The dual-GPU card (if they decide to bother with one) would then be called the GTX 695.
Apple drops Kepler, according to Charlie: http://semiaccurate.com/2012/03/13/apple-drops-nvidia-kepler-from-large-numbers-of-laptops/
I meant Nvidia's TDP for high-end GPUs... anyway, moving on: I see that posts about the "REAL" Kepler, the one that will be unbeatable, are starting to appear out of the blue. People should stop spreading rumored bull****.
If this is a mid-range GPU being sold as high end, how long would it take Nvidia to change their design and give tri-SLI support to this GPU, since Nvidia does not provide tri-SLI on their mid-range GPUs?
Until Fermi, AMD was the one with that problem, though, even at the high end: compare the 4870 X2 to the GTX 295. What do you mean by "real" Kepler? Do you mean GK110, or just people who actually have GK104?
Yes, the GK110. I don't think there will be another single-GPU Kepler; it'll either be a dual-GPU card, or Nvidia's next series, the GTX 780, released to fight the so-called Tenerife.
Probably not; after GK110 I think it's Maxwell, which isn't due until at least next year. Can't say I'm overly happy or impressed by the state of things when it comes to GPUs, as AMD and Nvidia seem happy enough to just give us small improvements but charge us large amounts for them. GTX 580 -> GK104 isn't likely to be a huge upgrade, and neither is GK104 -> GK110, but I can imagine there will be a good number of people who go that route, so you can see why Nvidia/AMD do it.
Yes, I agree. Both of them release GPUs too fast, and neither of them manages big performance leaps over the last gen. Well, ATI did with the HD 7970 vs. the HD 6970, but compared with the GTX 580, which was the best single GPU, the difference is small. And now, if the GK104 is as fast as the HD 7970, or 10% faster, again we see just a small increase. Actually, Nvidia has become a bit lazy tbh: GTX 480 -> GTX 580 -> GTX 680 is barely worth the upgrade.
680 specs: http://diy.pconline.com.cn/graphics/news/1203/2702126.html?ad=6431 The core clock is supposedly 1006MHz (not 704MHz), and it only boosts to 1058MHz (only 52MHz more), which is a heap of crap tbh, as an extra 52MHz will make no difference, will it?
That sounds nice and plausible, considering it's only 256-bit. Edit: wow... no, 52MHz won't do crap. But what is that, the max OC or some kind of turbo boost? If it's the max OC, that's bad.
52MHz would make a noticeable difference with my card, and the 570 got roughly a 15% increase in BFBC2 from 68MHz according to the Guru3D review, but Kepler is new, so who knows at this point.
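For scale, the relative clock bumps being argued about in the last few posts can be sketched with some quick arithmetic. The 680 figures are from the rumored specs above; the GTX 570's 732MHz stock clock is my assumption (it is not stated in this thread), so treat that number as illustrative.

```python
def pct_increase(base_mhz, boosted_mhz):
    """Relative clock increase in percent."""
    return (boosted_mhz - base_mhz) / base_mhz * 100

# Rumored GTX 680 base -> boost clock (1006MHz -> 1058MHz, +52MHz)
print(round(pct_increase(1006, 1058), 1))  # -> 5.2

# GTX 570 with the +68MHz OC discussed above
# (732MHz stock clock is an assumption, not from the thread)
print(round(pct_increase(732, 800), 1))  # -> 9.3
```

So the 680's turbo is only a ~5% clock bump, which is why some posters expect it to make little difference, while the 570 OC discussed above was closer to a ~9% bump.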