I'm taking my words back. It does shine. But seeing GK110 now would be better, because I see some marketing strategy here =(
Of course you can, with a little effort. All you can do for now is somehow bypass the crappy VRM and the power limit.
We'll have a dedicated overclocking article as well. BTW, SLI review: http://www.guru3d.com/article/geforce-gtx-680-sli-review/
Excellent, looking forward to the OC'ing review. I've been reading a few more reviews this morning, specifically looking at both competing cards overclocked. Xbitlabs have done a great job here, I must say, and Bit-tech have compared the cards overclocked in BF3. Interesting times :banana:
As I expected, the GK104 performs well. I don't see NVIDIA being in any particular hurry to release their GK110 chip, especially when they can milk this card. TSMC's 28nm process isn't exactly the most mature of processes; the yields aren't that great, from what most news sites are saying. Plus, NVIDIA and AMD aren't the only hands dipping into the 28nm pool: Qualcomm and Apple are in it too. Apple's chips go into iPhones, which sell far better than enthusiast graphics cards ever will, which makes them a higher priority for TSMC, which in turn means NVIDIA and AMD won't get as many wafers as they would like.

Now factor in the rumours suggesting that GK110 will use a ~530mm2 die as opposed to GK104's ~294mm2 die. That means for every GK110 chip they could make, they could make nearly two GK104s instead. And if you assume the manufacturing defects per wafer are roughly constant, then yields would be lower on a GK110 wafer than on a GK104 wafer, since a larger die is more likely to contain a defect. So per wafer, they can make more money selling GK104 as the high end rather than releasing GK110 as the high end.
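A quick back-of-the-envelope sketch of that die-size argument, using the ~294mm2 and ~530mm2 figures from the post. The defect density is a made-up illustrative number, and the classic Poisson yield model (Y = exp(-A·D0)) plus the naive area-only die count are simplifications (real counts account for scribe lines and edge losses), but it shows why a big die hurts twice: fewer candidate dies per wafer, and a lower fraction of them good.

```python
import math

WAFER_DIAMETER_MM = 300.0  # standard 300mm wafer
DEFECT_DENSITY = 0.4       # defects per cm^2 -- assumed value, purely for illustration

def dies_per_wafer(die_area_mm2):
    """Rough gross die count: wafer area / die area.
    Ignores edge losses and scribe lines, so it overestimates slightly."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2)

def poisson_yield(die_area_mm2, d0=DEFECT_DENSITY):
    """Classic Poisson yield model: Y = exp(-A * D0), with A in cm^2."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-area_cm2 * d0)

for name, area in [("GK104", 294), ("GK110", 530)]:
    gross = dies_per_wafer(area)
    good = gross * poisson_yield(area)
    print(f"{name}: {gross} gross dies/wafer, ~{good:.0f} good dies/wafer")
```

With these (assumed) numbers the gap in good dies per wafer ends up far larger than the ~1.8x raw area ratio, because the larger die also yields worse on the same defect density.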
^^ Yeah, sadly I think you're right. Unless AMD have got something up their sleeve, why shouldn't NVIDIA milk the current situation? I would if it were my business.
Because the need for GK110 doesn't stem from the gaming market at all, but from the computing/workstation market. Certain people with certain supercomputers were promised a power-efficient GPU, and GK104, with its focus on graphics workloads, is simply not suitable for that.
Has anyone seen 3D Vision benchmarks? The only one I've seen is this, and it doesn't tell much: http://www.hard*wareheaven.com/revi...80-kepler-graphics-card-review-3d-vision.html

Also, an adaptive vsync test. Very nice if it performs like this in most, if not all, games. I'll get a Kepler card solely based on this, unless AMD implements it too, if that's even possible through driver updates: http://www.pcper.com/reviews/Graphi...hics-Card-Review-Kepler-Motion/Adaptive-VSync

P.S. Why are links to some websites blocked? If I paste one normally, without the *, it shows up like this: http://www.*******************/revi...80-kepler-graphics-card-review-3d-vision.html
I don't think there's any marketing strategy or conspiracy here. Dunno, maybe I just don't see one because I'm not an NVIDIA fan (I'm not an AMD fan either; I buy the best bang for the buck. I currently own a 6950, but previously owned a 6800gts and other NVIDIA cards). Why would NVIDIA not release the best card they can right now? Yes, the 680 is the better card (less expensive, better performance overall, better architecture), but the difference is marginal in most games, and the 7970 even manages to beat the 680 occasionally.

IMO the performance difference isn't big enough to steal the AMD fanboy market if both cards are sold at the same price. And it's definitely not enough to convince 7970 owners to trade their card for a 680 unless they're NVIDIA fans, or have money and time to waste. So why would NVIDIA hold back? Why not release that awesome-sauce unbeatable card right now, beat AMD by 30-40%, and never get beaten in any game, not even Metro? I just don't understand the supposed strategy at all.

IMO the 680 is the best NVIDIA could do right now at this power usage, this temperature, this noise level, this profit margin, and this price. I think saying otherwise is being a little bit delusional. I'm not saying NVIDIA won't release a better card later this year (AMD can release one too, 3-4 months later); what I'm saying is that they simply can't right now while meeting their own targets for power usage, temperature, noise, profit margin, and price.
Adaptive V-Sync is backwards compatible with older cards through drivers; the 300.xx driver released the other day has it, and it works on older cards.
Because GK104 is cheaper to produce, there's more volume, and the profit is far bigger than if they released GK110 instead and had to sell GK104 for $100 less than it sells for now. I don't think NVIDIA intends to "steal" 7970 users at all. Even if they did release GK110, why would someone who just bought a 7970 for, say, $550 sell it at a loss and buy another card? There's nothing wrong with the 7970. If I'd bought one, I wouldn't regret it, and neither should anyone in their right mind, right?
Great, I missed that one, but I'll wait for official support, since NVIDIA themselves confirmed it: "Adaptive VSync will be rolled out to all GeForce 8-series and later users in the near future." So I guess AMD will also offer something similar sooner or later.
Well, I personally am impressed, and at this point in time I'm going to be getting one. I just moved to Poland yesterday, but I should be home again in a couple of weeks, so I'll order one and pick it up when I visit the lovely UK (my PC is currently still back home).
Ah good, because I'm curious about that. I'm about to fit my own GTX 680 and love the idea of automatic overclocking, as I'm a little apprehensive about raising voltages and clocks in case I damage the hardware. If the voltages are all handled by the card depending on what limits you set with EVGA Precision X, then that's just what I've always wanted!

One thing puzzles me, though, which I hope the article covers. From what I've read, the card will automatically overclock itself, and Precision X lets you tune it further by setting limits for the power target, memory, and GPU clocks. But can I run Precision X (for the overclocking) and MSI Afterburner (which I use for monitoring the card in games via the OSD, and on the desktop via a sidebar gadget) together? Or does MSI Afterburner now have the same functionality as Precision X, so I can just use that?
I just read on the Dutch site Tweakers.net that Zotac is working on a 2 GHz version of the 680. Here is the source: http://digi.tech.qq.com/a/20120322/002197.htm