Even if there isn't, the GTX 680 allows you to push higher resolutions and settings and gives higher minimum framerates. I can finally run Aliens vs. Predator without stuttering and Alan Wake at 60 fps for example.
Woohoo, finally shipped! :banana: Is a GTX 580 a ridiculous choice for PhysX in Batman? Of course it is! :nerd:
There is something I don't understand. I've just received the new Asus GTX 680. I uninstalled the old GTX 580, put the new card in the case, booted the PC, installed the 301.10 drivers and... everything seems fine... BUT, MSI Afterburner displays a 705 MHz core clock and a 3004 MHz memory clock :nerd: NvidiaInspector shows me the same. The memory clock seems OK, but the core clock should be 1006 MHz, no? Am I missing something or what?
If you're shooting for 120 fps @ 120 Hz minimum, then the 680 is still way under that mark (@ high settings). -scheherazade

You're talking about MHz under load, or idle? -scheherazade
My benchies - GTX 680 on stock. I don't know why my 3DMark11 score is so low, but I'm having great difficulty disabling Vsync. The Global Settings seem to get overridden by the Program Settings (which I wish I could completely disable), and other running programs also seem to override the current Program Settings. Gah, well, I haven't been on the green team for some time now, so maybe I'm just not used to it? Could anyone analyze my scores and tell me what I can do to improve them?
Same issue here with an EVGA card. Brand shouldn't matter because they're all Nvidia reference designs built by them. Anyways, my card won't clock past 705 MHz core and 1125 MHz memory no matter what I throw at it. Something isn't right.
[QUOTE]You're talking about mhz under load, or idle? -scheherazade[/QUOTE] Well, idle is 324 MHz and under load 706 MHz. If I look at Nvidia Inspector at idle, it shows: Current clock: 324 MHz, GPU clock: 706 MHz, Default clock: 706 MHz. If I benchmark it with the exact same settings as Netherwin (just above), I score 1158 in the Heaven benchmark, which seems OK, but I don't understand the clock speeds...
Under load I get 705 MHz; idle is 324 MHz. I thought these cards were supposed to have a base clock of 1006 MHz under load. EDIT: strange, I rebooted and it is working fine now... 1019 MHz core.
Testing +600 on the memory at the moment and it appears stable. On the core I'm not so lucky: I can only get to around +130 (stress-test stable), taking me to about 1215-1230 MHz on stock volts. Could do with pushing the voltage a tiny bit more... Seeing about 10405 in 3DMark11, and in Heaven maxed I get about 49.6 FPS. Loving this card though.
So I've been testing my 680 for the past 2 or so hours. I have to say, I'm a little disappointed and at the same time absolutely amazed. In terms of the performance numbers, the 680 is maybe about 5% faster in the stuff I've been testing. So I'm a bit disappointed the numbers aren't higher compared to my old setup, but since I was running highly overclocked tri-SLI GTX 280s, I'm honestly not surprised. They might be old, but nothing to sneeze at.

I'm only running stock on the 680 so far, but the smoothness, temps, and quietness of this setup cannot be overstated. I can't honestly tell you how good it feels to be able to play with Vsync off and not see my screen tearing constantly and endlessly. It is nice to look at a stable image without Vsync. Oh yes it is. I've also finally been able to experience tessellation, and holy what the what, more games need this.

The highest I've seen the card run so far has been about 72C with 49% fan speed. A big difference from 3 cards blazing at 85C with 100% fan. I'm sure my power bill is going to reflect that change too. I'm really excited to start testing SGSSAA performance on this thing in some of my games, because that is one area where the GTX 280s did not hold up at all. 2x SGSSAA on GTX 280s was lucky to hit acceptable frame rates, if at all.

EDIT: oh, and I can now monitor the GPU stats on my G15 without the sensor polling rate causing stutter while gaming (from trying to poll 3 cards). BIG WIN
Hi guys, has anyone gone from SLI 580s to SLI 680s? Just wondering if it's really worth the upgrade, or should I just wait for the next GTX version...
If this is the flagship single GPU of Kepler, then this sucks... No offence to the people who bought 680s, but it was supposed to be 2x the performance of Fermi, and it only looks like 1.3x to 1.5x, even with ironed-out drivers. Maxwell is supposed to be the 4x step from Kepler, bigger than this. I'll wait for that. Anyways, congrats to all you guys who love your new cards. I was the same way when I got my 480 (then I watercooled that sum***** because the fan was too loud) :flip < LOOK EVERYBAWDY! It's Mr. Kepler!
Sounds normal if you look at the graphics tests only in 3DMark Vantage and 11, because the graphics scores matter much more than the P scores. Your graphics scores in 3DMV and 3DM11 look fine to me. Heaven is still normal too.
We are limited by how big our resolution is. We need a 50" 4K monitor (which is actually becoming a reality for TVs, but not consumer-priced yet). Eyefinity and 3D Vision Surround are "lay-overs" for this problem; if those technologies hadn't been created, graphics card manufacturers wouldn't need to produce higher-performing GPUs.
nVidia's promise was that Kepler would be "3x the performance per watt of Fermi". They never claimed 2x graphics performance; that was all rumors/speculation. nVidia did claim Kepler would have better GPGPU performance than Fermi, though... which, according to every GPGPU benchmark, they lied about.