GeForce GTX 680 review [Guru3D.com]

Discussion in 'Frontpage news' started by Guru3D News, Mar 22, 2012.

  1. NAMEk

    NAMEk Guest

    Messages:
    658
    Likes Received:
    5
    GPU:
    Gainward/GTX560/2GB
I'm taking back my words. It does shine. But seeing GK110 now would be better, because I see some marketing strategy here. =(
     
  2. burebista

    burebista Ancient Guru

    Messages:
    1,740
    Likes Received:
    36
    GPU:
    MSI GTX1060GAMING X
    Of course you can. With a little effort though. :D
    All you can do now is somehow bypass the crappy VRM and power limit.
     
  3. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,392
    Likes Received:
    18,564
    GPU:
    AMD | NVIDIA
  4. slickric21

    slickric21 Guest

    Messages:
    2,458
    Likes Received:
    4
    GPU:
    eVGA 1080ti SC / Gsync
    Excellent, looking forward to the OC'ing review.

    Been reading a few more reviews this morning, specifically looking at both competing cards OC'd.

    Xbitlabs have done a great job here, I must say

    And Bit-tech have compared the cards overclocked in BF3 here

    Interesting times :banana:
     

  5. CronoGraal

    CronoGraal Ancient Guru

    Messages:
    4,194
    Likes Received:
    20
    GPU:
    XFX 6900XT Merc 319
    So where's the game changer?
     
  6. k1net1cs

    k1net1cs Ancient Guru

    Messages:
    3,783
    Likes Received:
    0
    GPU:
    Radeon HD 5650m (550/800)
    Sold out. =b
     
  7. Sever

    Sever Ancient Guru

    Messages:
    4,825
    Likes Received:
    0
    GPU:
    Galaxy 3GB 660TI
    As I expected, the GK104 performs well.

    I don't see Nvidia being in any particular hurry to release their GK110 chip, especially when they can milk this card.

    TSMC's 28nm process isn't exactly the most mature of processes. The yields aren't that great from what most news sites are saying, plus Nvidia and AMD aren't the only hands dipping in the 28nm pool - Qualcomm and Apple are in it too. Apple's chips go into iPhones, which sell far better than enthusiast graphics cards ever will, which makes them a higher priority for TSMC, which means Nvidia and AMD won't get as many chips as they would like.

    Now factor in the rumours suggesting that the GK110 will use a ~530mm2 die as opposed to the GK104's ~294mm2 die. That means that for every GK110 chip they could make, they could make nearly two GK104s instead. And if you assume the manufacturing defects per wafer are consistent, a bigger die is more likely to catch a defect, so yields on a GK110 wafer would be lower than on a GK104 wafer. Per wafer, then, they can make more money selling GK104 as high end rather than releasing GK110 as high end.
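
    The per-wafer arithmetic above can be sketched with a standard dies-per-wafer estimate and a Poisson yield model. This is a rough illustration, not real TSMC data - the 300mm wafer and the defect density D0 below are assumptions:

    ```python
    import math

    def dies_per_wafer(die_area_mm2, wafer_diam_mm=300):
        """Rough candidate-die count: wafer area / die area, minus edge loss."""
        r = wafer_diam_mm / 2
        return int(math.pi * r * r / die_area_mm2
                   - math.pi * wafer_diam_mm / math.sqrt(2 * die_area_mm2))

    def poisson_yield(die_area_mm2, d0_per_cm2):
        """Poisson yield model: Y = exp(-A * D0), larger dies yield worse."""
        return math.exp(-(die_area_mm2 / 100) * d0_per_cm2)

    for name, area in [("GK104", 294), ("GK110", 530)]:
        n = dies_per_wafer(area)
        y = poisson_yield(area, d0_per_cm2=0.25)  # D0 is a made-up guess
        print(f"{name}: {n} candidates, {y:.0%} yield, ~{int(n * y)} good dies")
    # GK104: 201 candidates, 48% yield, ~96 good dies
    # GK110: 104 candidates, 27% yield, ~27 good dies
    ```

    With these made-up numbers the larger die loses twice - fewer candidates per wafer and a lower yield on each of them - which is exactly the point above.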
     
  8. slickric21

    slickric21 Guest

    Messages:
    2,458
    Likes Received:
    4
    GPU:
    eVGA 1080ti SC / Gsync
    ^^ Yeah, sadly I think you're right.

    Unless AMD have got something up their sleeve, why should nVidia not milk the current situation?
    I would if it was my business.
     
  9. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    Because the need for GK110 does not stem from the gaming market at all, but from computing/workstation.

    Certain people with certain supercomputers were promised a power-efficient GPU, for which GK104, with its graphics-workload focus, is simply not suitable.
     
  10. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    9,634
    Likes Received:
    3,413
    GPU:
    NVIDIA RTX 4070 Ti
    Currently €443,- here in Holland.
     

  11. eddman

    eddman Guest

    Messages:
    166
    Likes Received:
    0
    GPU:
    EVGA GTX 1060 6G SC
    Has anyone seen 3D vision benchmarks? The only one I've seen is this and it doesn't tell much.

    http://www.hard*wareheaven.com/revi...80-kepler-graphics-card-review-3d-vision.html

    Also, an adaptive vsync test. Very nice if it performs like this in most if not all games. I'll get a Kepler card solely based on this, unless AMD implements it too, if that is possible through driver updates.

    http://www.pcper.com/reviews/Graphi...hics-Card-Review-Kepler-Motion/Adaptive-VSync
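
    For context on what adaptive vsync is solving (a simplified model of my own, not from the linked review): with classic vsync a frame can only flip on a vblank boundary, so a frame that takes just over 16.7 ms at 60 Hz waits for the next refresh and the rate falls straight to 30 fps. Adaptive vsync just disables sync below the refresh rate, accepting some tearing instead of that cliff:

    ```python
    import math

    REFRESH_HZ = 60
    VBLANK_MS = 1000 / REFRESH_HZ  # ~16.7 ms per refresh interval at 60 Hz

    def effective_fps_vsync(frame_ms):
        """Classic vsync: a frame occupies a whole number of refresh
        intervals, so rendering drops to refresh / ceil(frame_ms / interval)."""
        return REFRESH_HZ / math.ceil(frame_ms / VBLANK_MS)

    def effective_fps_adaptive(frame_ms):
        """Adaptive vsync: synced (capped at refresh) when the GPU keeps up,
        unsynced at the GPU's own rate when it doesn't."""
        raw_fps = 1000 / frame_ms
        return min(raw_fps, REFRESH_HZ) if raw_fps >= REFRESH_HZ else raw_fps

    print(effective_fps_vsync(18))     # just misses 60 fps -> locked to 30.0
    print(effective_fps_adaptive(18))  # ~55.6 fps, no half-rate cliff
    ```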

    ------------------------------------------------------------------------
    P.S. Why are links to some websites blocked? If I put one in normally, without a *, it shows up like this:

    http://www.*******************/revi...80-kepler-graphics-card-review-3d-vision.html
     
    Last edited: Mar 23, 2012
  12. MonstroMart

    MonstroMart Maha Guru

    Messages:
    1,397
    Likes Received:
    878
    GPU:
    RX 6800 Red Dragon
    I don't think there's any marketing strategy or conspiracy here.

    Dunno, maybe I don't see one because I'm not an nVidia fan (I'm not an AMD fan either; I buy the best bang for the buck - I currently own a 6950 but previously owned a 6800gts and other nVidia cards). Why would nVidia not release the best card they can right now? Yes, the 680 is a better card (less expensive, better performance overall, better architecture), but the difference is marginal in most games, and the 7970 even manages to beat the 680 occasionally. Imo the performance difference is not big enough to steal the AMD fanboy market if both cards are sold at the same price. And it's definitely not enough to convince 7970 owners to trade their card for a 680 unless they are nVidia fans or have money and time to waste. So why would nVidia hold back? Why not release that awesome-sauce unbeatable card right now, beat AMD by 30-40% and never get beaten in any game, not even Metro? I just don't understand the strategy at all.

    Imo the 680 is the best nVidia could do right now, at this power usage, at this temp, at this noise level, at this profit margin and price. I think saying otherwise is being a little bit delusional. I'm not saying nVidia won't release a better card later this year (AMD can release one too, 3-4 months later); what I'm saying is they simply can't right now and still meet their own expectations for power usage, temp, noise level, profit margin and price.
     
  13. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    Adaptive V-Sync is backwards compatible with older cards through drivers; the 300.xx driver released the other day has it, and it works on older cards.
     
  14. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix

    Because GK104 is cheaper to produce, there is more volume, and the profit is way bigger than if they released GK110 instead and had to sell GK104 for $100 less than it sells for now.
    I don't think nVidia has any intention to "steal" 7970 users at all. Even if they did release GK110, why would someone who just recently bought a 7970 for, say, $550 sell it for less and buy another card? There's nothing wrong with the 7970. If I had bought it, I wouldn't regret it, nor should anyone in their right mind, right? :)
     
  15. eddman

    eddman Guest

    Messages:
    166
    Likes Received:
    0
    GPU:
    EVGA GTX 1060 6G SC
    Great, missed that one, but I'll wait for official support, since nvidia themselves confirmed it:

    "Adaptive VSync will be rolled out to all GeForce 8-series and later users in the near future."

    So I guess AMD will also offer something similar sooner or later.
     
    Last edited: Mar 23, 2012

  16. Silent_Takedown

    Silent_Takedown Master Guru

    Messages:
    459
    Likes Received:
    0
    GPU:
    Gigabyte GTX 770 WF3
    Well, I personally am impressed, and at this point in time I'm going to be getting one.

    I just moved to Poland yesterday, but I should be home again in a couple of weeks, so I will order one and pick it up when I visit the lovely UK :p (this is because my PC is currently still back home).
     
  17. Anarion

    Anarion Ancient Guru

    Messages:
    13,599
    Likes Received:
    386
    GPU:
    GeForce RTX 3060 Ti
    There are ****loads of Asus GTX 680's in stock here, but the price, 569€, is just too much.
     
  18. kapu

    kapu Ancient Guru

    Messages:
    5,418
    Likes Received:
    802
    GPU:
    Radeon 7800XT
    An overclocked HD 7970 beats an overclocked GTX 680 at 2560x1600.

    Says Xbitlabs.
     
  19. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,212
    Likes Received:
    1,536
    GPU:
    NVIDIA RTX 4080 FE
    Ah good because I'm curious about that.

    I'm about to fit my own GTX 680 and love the idea of automatic overclocking, as I'm a little apprehensive about raising voltages and clocks in case I damage the hardware. If the voltages are all handled by the card depending on what levels you set with EVGA Precision X, then that is just what I've always wanted! :D

    One thing puzzles me though, which I hope the article covers. From what I've read, the card will automatically overclock itself, and Precision X lets you tune it further by setting limits for the power, memory and GPU clocks. But can I run both Precision X (for the overclocking) and MSI Afterburner (which I use for monitoring the card in games via the OSD and on the desktop via a sidebar gadget) together? Or does MSI Afterburner now have the same functionality as Precision X, so I can just use that?
     
  20. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    9,634
    Likes Received:
    3,413
    GPU:
    NVIDIA RTX 4070 Ti
