GeForce 8800 GTS a waste of energy at idle

Discussion in 'Videocards - NVIDIA GeForce' started by fabie, Feb 20, 2015.

  1. sykozis

    sykozis Ancient Guru

    Messages:
    21,783
    Likes Received:
    1,047
    GPU:
    MSI RX5700
    Conroe was also the name of the micro-architecture used to create the entire Core 2 product line. Calling an E7400 a Conroe, while technically accurate, does nothing but cause confusion.
     
    Last edited: Feb 21, 2015
  2. Dillinger

    Dillinger Master Guru

    Messages:
    654
    Likes Received:
    0
    GPU:
    GTX 950
    Are you on your computer 24/7?

    Is your electric bill significantly higher when you're on your computer more often?

    If you're so concerned about power consumption & heat at idle, why not just use your onboard video?

    Then when you want to play some games just plug your 8800GTS back in.

    Buying a different video card seems more ridiculous to me. You'd be wasting your precious pesos...
     
  3. nexxusty

    nexxusty Banned

    Messages:
    84
    Likes Received:
    0
    GPU:
    MSI GTX 980 Gaming 4G
    It's actually not a mistake at all, but my bad, though.

    Definitely a slight misunderstanding because of the way I worded my post. I should have been clearer.

    Thanks for the correction. Unlike a lot of people, I actually appreciate it, if only to inform others; I did know it was a Wolfdale, I just wasn't thinking at the time.

    I appreciate that as well. I knew this, but there's no point in me getting into an argument about it. My side of this topic would be a bit iffy to "fight" from, at best.

    Most people just assume that because Conroe was also the codename for the first Core 2 CPUs, Wolfdale would be its successor. That's true, but like you mentioned, it's also true that when they took the Pentium M architecture and built a new one from it, the result was dubbed Conroe.

    Nehalem & Bloomfield, for example: two different names, same chip. Core 2, when first released, was Conroe & Conroe.

    :)
     
    Last edited: Feb 21, 2015
  4. LiquidFrost

    LiquidFrost Member Guru

    Messages:
    146
    Likes Received:
    1
    GPU:
    EVGA GTX 970 FTW
    Overall scores will be lower with a venerable E7400 and 2GB of RAM, but the 8800 GTS is archaic at best. It was released 9 years ago; that's several lifetimes as far as GPUs are concerned.

    For raw floating point operations, the 750 Ti is roughly 4 times as fast as the 8800 GTS.
    For raw texture fill rate, the 750 Ti is roughly twice as fast as the 8800 GTS.
    For raw pixel fill rate, the 750 Ti is roughly 60% faster than the 8800 GTS.
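    Those figures are straightforward units × clock arithmetic. Here's a back-of-envelope sketch in Python; the spec values are commonly quoted reference figures for the G80-based 8800 GTS 640MB and a stock 750 Ti, so treat them as assumptions (ratios shift with the card variant, boost clocks, and whether the G80's co-issued MUL is counted):

    Code:
    # Back-of-envelope throughput comparison: functional units * clock.
    # Spec values are commonly quoted reference figures (assumptions),
    # not measurements; real-world ratios will differ.
    cards = {
        # name: (shaders, shader_clock_MHz, flops_per_clock, TMUs, ROPs, core_clock_MHz)
        "8800 GTS 640MB (G80)": (96, 1188, 3, 24, 20, 500),    # 3 = MAD + co-issued MUL
        "GTX 750 Ti (GM107)":   (640, 1020, 2, 40, 16, 1020),  # 2 = FMA
    }

    def rates(shaders, sclk_mhz, flops_per_clk, tmus, rops, cclk_mhz):
        gflops = shaders * sclk_mhz * flops_per_clk / 1000  # GFLOPS
        gtexels = tmus * cclk_mhz / 1000                    # GTexel/s
        gpixels = rops * cclk_mhz / 1000                    # GPixel/s
        return gflops, gtexels, gpixels

    for name, spec in cards.items():
        g, t, p = rates(*spec)
        print(f"{name}: {g:.0f} GFLOPS, {t:.1f} GTexel/s, {p:.1f} GPixel/s")

    That lands around 342 vs 1306 GFLOPS (~3.8x) and 10 vs 16.3 GPixel/s (~60% faster), close to the figures above; the texture ratio comes out higher than 2x with these numbers, since G80 texture figures vary with how its split address/filter units are counted.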
     
    Last edited: Feb 21, 2015

  5. nexxusty

    nexxusty Banned

    Messages:
    84
    Likes Received:
    0
    GPU:
    MSI GTX 980 Gaming 4G
    Oh, I think you must be wrong, sir!

    The GPU of which you speak is a 60W GPU. There's no way it beats a 135W GPU, even if it is 9 years old.
     
  6. cowie

    cowie Ancient Guru

    Messages:
    13,249
    Likes Received:
    327
    GPU:
    GTX
    My brother's friend has a 750 in his father's computer.
    Not too sure about 3DMark06, but it scores 23,879 in 3DMark Vantage, and that's better because it's longer and has bouncing boobs, but I hate it when his mom covers my eyes at that part :wanker:
     
  7. stevevnicks

    stevevnicks Maha Guru

    Messages:
    1,440
    Likes Received:
    11
    GPU:
    Don't need one
    It probably could, or does, due to the 750 Ti having a far more efficient architecture.

    Example (I know I'm using CPUs as the example, but it's an easy way to understand it):

    My A10-6800K APU has a 100W TDP and runs at 4.1-4.4GHz.

    My i7-4770 CPU has an 84W TDP, runs at 3.4-3.9GHz, and wipes the floor with the A10-6800K in any CPU task.

    I would have thought the same holds true for GPUs, but I may be wrong.

    This is just an example.

    Due to its more modern and efficient architecture, I would have thought the 750 Ti would laugh at an 8800 GTS.
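    To put the watts-versus-speed point in numbers, here's a minimal performance-per-watt sketch, using the TDPs above with hypothetical benchmark scores (the scores are made-up placeholders purely for illustration, not real results):

    Code:
    # Toy performance-per-watt comparison: more watts does not mean faster.
    # TDPs are the official figures; scores are HYPOTHETICAL placeholders,
    # not real benchmark results.
    chips = {
        # name: (tdp_watts, cpu_score)
        "A10-6800K": (100, 350),  # placeholder score
        "i7-4770":   (84, 980),   # placeholder score
    }

    for name, (tdp, score) in chips.items():
        print(f"{name}: {score} pts at {tdp} W -> {score / tdp:.1f} pts/W")

    The lower-TDP chip wins on both raw score and points per watt, which is the whole point.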
     
    Last edited: Feb 21, 2015
  8. Cartman372

    Cartman372 Maha Guru

    Messages:
    1,469
    Likes Received:
    0
    GPU:
    980 FTW

    That was a horrible comparison. You're comparing an APU to a CPU. If you compare the A10-6800K directly to the i7-4770K on iGPU performance, the A10 is way faster.

    Here are two benchmarks from bit-tech.

    http://www.bit-tech.net/hardware/cpus/2013/07/02/haswell-and-richland-gpu-testing/3
     
  9. stevevnicks

    stevevnicks Maha Guru

    Messages:
    1,440
    Likes Received:
    11
    GPU:
    Don't need one
    Yeah, I know, but I said at any CPU task. It's just an example showing that more watts does not always mean better; again, I will repeat, I said CPU tasks.

    If you want to nitpick, that's up to you; you're just a troll in my books, looking to stir stuff. Try reading: I put ANY CPU TASKS, just in case you're blind or forgot what I put. CPU TASKS, ok?

    Funny, afterwards I knew someone would bring up the GPU part, lol. Oh well, if you've got fook all to do with your time, that's fine by me :)
     
  10. sykozis

    sykozis Ancient Guru

    Messages:
    21,783
    Likes Received:
    1,047
    GPU:
    MSI RX5700
    Technically, they're both APUs, since they both contain a CPU core and a GPU core, which are the prerequisites for being an APU.
     

  11. CalculuS

    CalculuS Ancient Guru

    Messages:
    3,039
    Likes Received:
    255
    GPU:
    GTX 1660Ti
    Come on, people, sarcasm isn't that hard, right?
     
  12. ---TK---

    ---TK--- Ancient Guru

    Messages:
    22,111
    Likes Received:
    2
    GPU:
    2x 980Ti Gaming 1430/7296
    People calling other people retards has nothing to do with sarcasm. It comes from not being able to express an opinion in an educated manner, and quite frankly that seems to be the norm here.
     
  13. nexxusty

    nexxusty Banned

    Messages:
    84
    Likes Received:
    0
    GPU:
    MSI GTX 980 Gaming 4G
    Lol. He got it.

    Maybe it is hard.

    Yeah, TK, I'm not sure who he thought he was, but he proved himself to be an ass. He definitely acts like one.
     
    Last edited: Feb 22, 2015
  14. p0ppa

    p0ppa Master Guru

    Messages:
    306
    Likes Received:
    0
    GPU:
    GTX970 G1 SLI
    Wow, my 100-watt light bulb sure is brighter than my 60-watt one.
    But holy ****! My 30-watt compact fluorescent light is brighter than my 60-watt bulb! How's that even possible? That's not credible for a 30-watt bulb! :stewpid:
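    The analogy holds up in numbers: brightness is watts × luminous efficacy (lumens per watt), and a CFL's efficacy is several times an incandescent's. A quick sketch with typical ballpark efficacy values (assumed, not measured):

    Code:
    # Brightness = watts * efficacy (lumens per watt).
    # Efficacy values are typical ballpark figures (assumptions).
    bulbs = {
        # name: (watts, lumens_per_watt)
        "60 W incandescent": (60, 15),  # ~15 lm/W typical
        "30 W CFL":          (30, 60),  # ~60 lm/W typical
    }

    for name, (watts, lm_per_w) in bulbs.items():
        print(f"{name}: {watts * lm_per_w} lumens")

    Half the watts, roughly twice the light (~1800 lm vs ~900 lm). Same story as an efficient GPU beating an old power hog.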
     
    Last edited: Feb 22, 2015
