55nm GTX 260 uses MORE POWER & puts out MORE HEAT.

Discussion in 'Videocards - NVIDIA GeForce' started by biggerx, Dec 27, 2008.

  1. biggerx

    biggerx Guest

    Messages:
    3,587
    Likes Received:
    11
    GPU:
    EVGA RTX 3070 FTW
    http://en.expreview.com/2008/12/27/the-first-review-of-55nm-geforce-gtx260.html

    Please read the article entirely before you post.

I want to discuss this because, if these results are correct, the 55nm GTX 260 is a very pointless card. WTF???

Now, I have read elsewhere that the 55nm part will use almost 75W less power, but those figures weren't from real-world benchmarks either.

    Please discuss.

    Thanks to DeadlyDevil666 for pointing this out.
     
    Last edited: Dec 27, 2008
  2. DeadlyDevil666

    DeadlyDevil666 Member Guru

    Messages:
    170
    Likes Received:
    0
    GPU:
    9800 GTX / Palit / 512mb
Yes, I am very upset. I waited to upgrade until the 55nm came out, as I was sure it would be better; I didn't wait for the 55nm 9800 GTX, and it owned the 65nm one.. heh

It's all good, I guess... I'm just going to get the XFX Black Edition :)
     
  3. LedHed

    LedHed Banned

    Messages:
    6,826
    Likes Received:
    1
    GPU:
    BenQ FP241W
A 55nm chip, simply by virtue of the smaller process, shouldn't use more power than its 65nm brother. I don't think you should be basing everything off that one site. Wait until we see some reviews from AnandTech/X-bit Labs/bit-tech.

Also, heat output is directly related to power usage.
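To illustrate the point about heat vs. power (my own made-up numbers, not measurements from the review): steady-state core temperature roughly tracks dissipated power times the cooler's thermal resistance, so a card with a smaller heatsink can still report higher temps even if the chip itself draws less power.

```python
# Rough steady-state thermal model: T_core = T_ambient + P * R_th.
# All numbers below are illustrative guesses, not measured values.

def core_temp(power_w, r_th_c_per_w, ambient_c=25.0):
    """Estimate steady-state core temperature in degrees C."""
    return ambient_c + power_w * r_th_c_per_w

# 65nm card: more power, but a larger heatsink (lower thermal resistance).
temp_65nm = core_temp(power_w=180.0, r_th_c_per_w=0.30)  # 25 + 54.0 = 79.0 C

# 55nm card: less power, but a smaller heatsink (higher thermal resistance).
temp_55nm = core_temp(power_w=160.0, r_th_c_per_w=0.36)  # 25 + 57.6 = 82.6 C

print(temp_65nm, temp_55nm)  # the 55nm card reads hotter despite drawing less power
```

So higher measured temps alone don't prove higher power draw; a cheaper cooler can account for it.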
     
  4. biggerx

    biggerx Guest

    Messages:
    3,587
    Likes Received:
    11
    GPU:
    EVGA RTX 3070 FTW
You didn't read the article.

They are assuming the extra heat is because the heatsink on the 55nm card is smaller than the one on the 65nm card.

I agree that we need to wait for more tests, but it looks like they did a decent test on the thing.
     

  5. BlackZero

    BlackZero Guest

According to that review it uses a smaller heatsink than the 65nm version, so it is possible, but the best thing to do is wait and see.
     
  6. LedHed

    LedHed Banned

    Messages:
    6,826
    Likes Received:
    1
    GPU:
    BenQ FP241W
The 55nm 9800GTX+'s only benefit was higher out-of-the-box stock clocks. If you volt-mod or use aftermarket cooling, you won't see much gain from the 55nm G92 compared to the 65nm one. I mean, hell, I have my 65nm G92 @ 870/2106/2000; now show me a 9800GTX+ running those speeds without the same volt-mod.
     
  7. biggerx

    biggerx Guest

    Messages:
    3,587
    Likes Received:
    11
    GPU:
    EVGA RTX 3070 FTW
    Hmmm the "benefit" of 55nm is starting to seem like a pipe dream.
     
  8. DeadlyDevil666

    DeadlyDevil666 Member Guru

    Messages:
    170
    Likes Received:
    0
    GPU:
    9800 GTX / Palit / 512mb
Yeah, I'm strongly considering just going with that XFX card... lol :banana:
     
  9. LedHed

    LedHed Banned

    Messages:
    6,826
    Likes Received:
    1
    GPU:
    BenQ FP241W
It depends on the architecture and the power usage. If you use a smaller manufacturing process at the same power, you should be able to hit higher clock speeds.

Compare the 90nm G80 to the 65nm G92: the clock-speed jump on the G92 was huge, while the move to 55nm produced a much smaller jump over the already popular-to-overclock 65nm G92. You can see how much a 25nm decrease helps compared to a 10nm one.
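The "same power, higher clocks" argument can be sketched with the textbook CMOS dynamic-power relation, P ≈ C·V²·f. The capacitance and voltage figures below are placeholder assumptions, not real GT200 numbers; real GPUs also have leakage power, which this ignores.

```python
# Textbook CMOS dynamic-power relation: P ~ C * V^2 * f.
# Illustrative numbers only; real chips add leakage and other terms.

def dynamic_power(cap_rel, volts, freq_mhz):
    """Relative dynamic power (arbitrary units)."""
    return cap_rel * volts**2 * freq_mhz

def max_freq_at_power(power_budget, cap_rel, volts):
    """Highest frequency that fits the budget, from f = P / (C * V^2)."""
    return power_budget / (cap_rel * volts**2)

# Suppose the die shrink cuts switched capacitance by 15% at the same voltage.
p_65nm = dynamic_power(cap_rel=1.00, volts=1.15, freq_mhz=576)

# At the same power budget, the shrunk part can clock ~18% higher.
f_55nm = max_freq_at_power(p_65nm, cap_rel=0.85, volts=1.15)
print(round(f_55nm))  # 678 (MHz)
```

Note the shrink only buys headroom if capacitance or voltage actually drops; if the vendor ships the new part at a higher voltage (as speculated below), the advantage evaporates.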
     
  10. R0achTheWarHero

    R0achTheWarHero Banned

    Messages:
    171
    Likes Received:
    0
    GPU:
    55nm 260gtx 750/1511/2406
One possibility: EVGA released three cards very quickly, a stock 576MHz card, a 626MHz card, and a 670MHz card. Some of the overclocked cards use more GPU and RAM voltage; it might be that they rushed the cards and used the 670MHz (above-stock) voltage on all three.
     

  11. EddieG

    EddieG Master Guru

    Messages:
    593
    Likes Received:
    0
    GPU:
    A few
My two-month-old 260_216 is sitting at 733/1500/2200. The temps never go higher than 62C, e.g. Fallout 3: 84 to 112 fps. I have no artifacts in games. Stock fan @ 72%. I game at 1680x1050, 4xAA, everything high.
I would like a good reason to step up....:)

I've temporarily removed the water block from the card to "step up".
     
    Last edited: Dec 28, 2008
  12. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
But this doesn't explain the supposed increase in power usage. Higher power usage would mean more heat, which is the only way the article's result makes sense. Even with a smaller heatsink, who cares? It should still produce less heat than its 65nm counterpart, as I do not and would not believe the 55nm uses more power.
     
  13. DSK

    DSK Banned

    Messages:
    17,914
    Likes Received:
    1
    GPU:
    HD5770/BenQ G2220HD
    no mate you have got that wrong.
     
  14. unbreakable

    unbreakable Member

    Messages:
    38
    Likes Received:
    0
    GPU:
    8800GTS 512 800/1905/2200
Have you even read the article??

I mean, yes, in theory it is, but this article needs to be confirmed by other hardware sites.
     
  15. MM10X

    MM10X Guest

    Messages:
    4,240
    Likes Received:
    1
    GPU:
    3080 FE
I read the article. Maybe more heat because of the smaller heatsink, but not more power.
Then again, that heatsink is barely smaller at all.

I call bull****.

Waiting for the Guru3D review.
     

  16. cowie

    cowie Ancient Guru

    Messages:
    13,276
    Likes Received:
    357
    GPU:
    GTX
Well, the card was run at 675MHz and the card they were comparing it to was run at 575MHz; the higher clocks alone would have the card running hotter.
The heatsink does not look too much smaller....
Like others said, we need more info....
     
  17. Burnt_Ram

    Burnt_Ram Guest

    Messages:
    5,921
    Likes Received:
    0
    GPU:
    Zotac GTX 1050 Ti
Interesting! I'm looking forward to more reviews, and some "benchmarks".
     
  18. Sneakers

    Sneakers Guest

    Messages:
    2,716
    Likes Received:
    0
    GPU:
    Gigabyte 980Ti Windforce
Seems pointless to conduct a test with different parameters for the two cards. Mind you, the heat output from raising clocks is not linear, so as others have said I wouldn't put much faith in this test.

What would be interesting is a 65nm 260 at 675 clocks versus the measured temps and wattage of the 55nm one. Their whole test seems a fail.


/edit,

After reading the thread a bit more carefully, it seems both cards were tested at stock (i.e. the same) clocks. The only advantage I can read out of the 55nm process is higher overall maximum clocks, which you pay for in more heat due to an in fact smaller heatsink with fewer fins. (Why didn't they just put the heatsinks on a scale to prove there is less metal to conduct the heat, instead of that screenshot :x)

I'm sure they will be able to sell this card to people sitting on older cards, but it is not a great upgrade; then again, neither was the 9800 GTX+ over the 9800 GTX. A good/lucky example of a 65nm 260 can still be clocked over 700, where a bad 55nm one might stay at 700, so not much of an upgrade, no :)
     
    Last edited: Dec 28, 2008
  19. eynmyn

    eynmyn Master Guru

    Messages:
    368
    Likes Received:
    0
    lol

It does run hotter. Expreview or somewhere took it apart and compared the heatsinks with the 216-core card, and the 216 core had a more efficient heatsink.
     
  20. biggerx

    biggerx Guest

    Messages:
    3,587
    Likes Received:
    11
    GPU:
    EVGA RTX 3070 FTW
    I don't know why there isn't one yet.
     
