GeForce GTX 470 and 480 review [Guru3D.com]

Discussion in 'Frontpage news' started by Guru3D News, Mar 27, 2010.

  1. Alex Vojacek

    Alex Vojacek Member Guru

    Messages:
    160
    Likes Received:
    0
    GPU:
    XFX GTX295
I agree with Hilbert, I've never seen any point in just bashing the other card/manufacturer/whatever.

    Don't you know by now that if the "opposing" guy isn't there, the market won't be healthy anymore?

    Geez, this is something people just don't get.
     
  2. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    42,488
    Likes Received:
    10,288
    GPU:
    AMD | NVIDIA

So I was able to confirm that there are indeed higher clocks when multiple screens are used. Nathan's results do differ from our findings, though. Once two monitors are installed, the clock speed will indeed act weird for the first few minutes, jumping from 700 MHz to lower, then up again, then lower, and so on.

    After a couple of minutes the clock settles, but still too high. It remains a constant 400 MHz on the core (on our card) with two monitors in use. As a result, the power consumption WILL be higher and the idle temps will go upwards to 60-70 degrees C.

    So that's confirmed... have a look at the screenshot. That's the system at idle / GTX 480 / dual screens.

    Not good.
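
    For anyone who wants to check this on their own card, below is a minimal monitoring sketch, not part of Hilbert's test setup: it assumes an NVIDIA GPU with the NVML Python bindings (the pynvml package) available, and it simply polls the core/memory clocks, temperature and fan speed every few seconds so you can watch whether the clocks settle or keep jumping after a second monitor is plugged in.

```python
# Minimal clock/temperature logger -- a sketch, assuming the pynvml
# (NVIDIA Management Library) bindings are installed and the driver
# exposes these readings; adjust the GPU index for multi-card setups.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        mem = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        fan = pynvml.nvmlDeviceGetFanSpeed(handle)  # percent of max speed
        print(f"core {core} MHz | mem {mem} MHz | {temp} C | fan {fan}%")
        time.sleep(5)  # log every 5 seconds to catch the clock jumping
finally:
    pynvml.nvmlShutdown()
```

    Leave it running while connecting and disconnecting the second monitor to see how the idle clocks and temperature react.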
     
  3. Exodite

    Exodite Ancient Guru

    Messages:
    2,048
    Likes Received:
    224
    GPU:
    Sapphire Vega 56
    Hmm.

    Well, I suppose the jumping clocks could be fixed with a driver update. Hopefully.

    In general this is the same as the 5k-series though, is it not? IIRC a lot of those cards idle at 400/900 as opposed to 157/300 clocks when multiple monitors are connected.

Obviously it's not quite as noticeable, since the 5k series doesn't draw quite as much juice or run quite as hot, but it seems the 'solution' to screen flickering is the same in both camps.

    At least NVIDIA seem aware of the issue and we can hope they'll resolve the random clock jumping and get the cards to idle at somewhat lower temps.
     
  4. GC_PaNzerFIN

    GC_PaNzerFIN Maha Guru

    Messages:
    1,045
    Likes Received:
    0
    GPU:
    EVGA GTX 580 + AC Xtreme
That's only 7°C hotter at idle than my HD 5870 with a 120Hz monitor.
     

  5. Exodite

    Exodite Ancient Guru

    Messages:
    2,048
    Likes Received:
    224
    GPU:
    Sapphire Vega 56
    Is it running at 68% fan speed though?

    Looking at Hilbert's screenshot of AB it seems the 480 is.
     
  6. GC_PaNzerFIN

    GC_PaNzerFIN Maha Guru

    Messages:
    1,045
    Likes Received:
    0
    GPU:
    EVGA GTX 580 + AC Xtreme
Can't compare before I know what RPM 68% means on the 480.
     
  7. IPlayNaked

    IPlayNaked Banned

    Messages:
    6,559
    Likes Received:
    0
    GPU:
    XFire 7950 1200/1850
Yeah, isn't it weird that both camps have the 120Hz bug?

    Windows issue maybe?
     
  8. Thug

    Thug Ancient Guru

    Messages:
    2,200
    Likes Received:
    9
    GPU:
    RTX 3080
    I was a little worried about the temps, with dual monitors too.
    I run a 28" and 17" monitor.
    Hilbert, did you test the temps on the 470 or just the 480?
    I was probably gonna buy the 470 due to pricing in the UK.
     
  9. Exodite

    Exodite Ancient Guru

    Messages:
    2,048
    Likes Received:
    224
    GPU:
    Sapphire Vega 56
    IMO the 470 is definitely the better deal, especially if NVIDIA can squeeze in some more optimizations in upcoming drivers.

The fact that the 470 doesn't rely on a heat-pipe cooling solution tells me it might be amenable to some overclocking with good third-party cooling too. At least I suppose better cooling should be able to do more for the 470 than for the 480.
     
  10. Alex Vojacek

    Alex Vojacek Member Guru

    Messages:
    160
    Likes Received:
    0
    GPU:
    XFX GTX295
No, not a Windows issue. Basically it is how two monitors work: for the card to maintain a healthy sync between both monitors, you have to force a lock on the memory clock.

    Have a look at this test.

    Go to Catalyst Control Center or your NVIDIA overclocking util of choice, change the GPU clock and watch the screen: no change... Then change the memory clock and watch the screen; at the moment you change the clock you will see an instant flash and flickering.

    This flickering happens whenever you change the memory clock; it cannot be undone, cannot be masked. You don't see it on a single monitor because it happens at the precise moment you open up a game, so it gets masked by the instant black screen before the game starts, and it gets masked again when you exit the game (the only two moments the memory clock changes from 2D to 3D and from 3D back to 2D).

    But imagine the same situation with two monitors: the second monitor is idling on the Windows desktop, and when the memory clock changes you could SEE the flicker on the second monitor and maybe lose sync with it, because one monitor is doing something in 3D and the other is in 2D.

    That problem is cured by "fixing" the memory clock on the GPU, a solution both AMD/ATI and NVIDIA pursue. The catch is that ATI has a wonderful downclocking routine on their GPUs, so the memory lock is not so bad, because the GPU is already downclocking aggressively with clock gating, as the 5000 series does.

    The memory lock does increase power consumption and heat, all right, even on the 5000 series.

    The problem with the GTX 480 is that Fermi is not "low-power friendly": Fermi bases almost all of its power saving on aggressively downclocking its GPU, contrary to what AMD does, so basically the 5000 series is far superior at retaining the power savings.

    To add insult to injury, as the phrase goes, NVIDIA also seems to need to increase the GPU clock to maintain sync with the increased memory clock, something AMD/ATI does not need to do (maybe better memory/GPU bus control). So Fermi NEEDS to raise its GPU clock to maintain sync with its increased memory clock (remember, it had to increase the memory clock to maintain sync with the second monitor).
    NO, it does not seem possible to solve this, not even with driver updates, because that is simply how NVIDIA manages this situation; they would need to change Fermi's behavior to allow the GPU to maintain a healthy low clock without the increased power output.

    And since memory is so important on Fermi, with such a big bus width and so much memory there (1.5 GB), it only complicates matters more, because more memory at full clock means more heat and higher power requirements.

    It's a very troublesome problem, really.
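
    To put rough numbers on why that locked clock matters for idle power, here is a back-of-envelope sketch using the usual dynamic-power approximation (power scales roughly with frequency times voltage squared). The clock figures are the ones quoted earlier in this thread; the voltages are purely illustrative assumptions, not measured values.

```python
# Back-of-envelope comparison of idle power: normal single-monitor idle
# vs. the dual-monitor "locked" state. Dynamic power scales roughly with
# f * V^2; the voltages here are illustrative assumptions, not measurements.
def relative_power(freq_mhz, volts):
    return freq_mhz * volts ** 2

single_idle = relative_power(157, 0.95)  # 157 MHz core, the low idle clock quoted above
dual_locked = relative_power(400, 1.00)  # 400 MHz core, the dual-monitor lock

ratio = dual_locked / single_idle
print(f"core dynamic power with the dual-monitor lock: ~{ratio:.1f}x normal idle")
# ~2.8x on these assumed numbers -- and that is before counting the memory,
# which jumps from its low idle clock to the full 3D clock and stays there.
```

    The exact ratio depends on the real voltages, but it illustrates why a locked 400 MHz core plus full-speed memory pushes idle temperatures into the 60-70 degree range.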
     

  11. Corrupt^

    Corrupt^ Ancient Guru

    Messages:
    7,081
    Likes Received:
    373
    GPU:
    Geforce RTX 3090 FE
Anyone care to explain this bug? I've got a Samsung 2233RZ.
     
  12. Alex Vojacek

    Alex Vojacek Member Guru

    Messages:
    160
    Likes Received:
    0
    GPU:
    XFX GTX295
I already explained it in detail, right before you posted your question :)
     
  13. GC_PaNzerFIN

    GC_PaNzerFIN Maha Guru

    Messages:
    1,045
    Likes Received:
    0
    GPU:
    EVGA GTX 580 + AC Xtreme
    FAIL.

The HD 5870 ups the core to 400 MHz with dual screens and with a 120Hz single monitor.

    Oh, and the memory is at its full 1200 MHz clock too.
     
    Last edited: Mar 28, 2010
  14. Alex Vojacek

    Alex Vojacek Member Guru

    Messages:
    160
    Likes Received:
    0
    GPU:
    XFX GTX295
My 4870 setup does not increase the GPU clock when connecting two monitors on my platform, so I assumed the 5870 didn't either.

    Maybe because the 4800 series has a 2D clock and the 5800 series has an ultra-low 2D clock.

    Maybe you could point out my mistake, but see, this only confirms what I just said about the power-efficient 5800 series.

    So both GPUs up the 2D clock and the memory; all the more evidence, then, that the 5800 is doing something extra with its gating to reduce heat that Fermi is not.

     
  15. GC_PaNzerFIN

    GC_PaNzerFIN Maha Guru

    Messages:
    1,045
    Likes Received:
    0
    GPU:
    EVGA GTX 580 + AC Xtreme
Power efficiency my ass. It runs full memory clocks and half GPU clocks, and idles over 60°C. Not to mention PowerPlay screws up once in a while, resulting in flickering, or that overclocking drops the idle clocks so low it's unusable.
     

  16. Alex Vojacek

    Alex Vojacek Member Guru

    Messages:
    160
    Likes Received:
    0
    GPU:
    XFX GTX295
One thing is for certain, and I need someone to refute this for me.

    All cards WILL up the memory clock to its full 3D clock and put a LOCK on that clock once you plug in a second monitor; even my old 295 and 4870 do this.
     
  17. Alex Vojacek

    Alex Vojacek Member Guru

    Messages:
    160
    Likes Received:
    0
    GPU:
    XFX GTX295
I don't like your attitude. I'm just trying to bring what I know of the subject in order to see if there can be a possible solution. Maybe you know more that you would like to share with us, instead of talking to me like this?
     
  18. morbias

    morbias Don TazeMeBro Staff Member

    Messages:
    13,445
    Likes Received:
    37
    GPU:
    -
You're right, the ATI 5xxx series is supposed to do that, but I had major problems with PowerPlay and multiple monitors on a 5870, even without running the card overclocked; the card just would not detect that I had two monitors plugged in and would stay at 2D clocks, causing flickering and tearing.

    The higher clocks needed for multiple monitors on the GF100 cards apparently cause a much bigger energy drain because of the increased transistor count over RV870. It seems to me Nvidia need to come up with a better low-power 3D state for this situation, which looks like it would take a new firmware version.
     
    Last edited: Mar 28, 2010
  19. GC_PaNzerFIN

    GC_PaNzerFIN Maha Guru

    Messages:
    1,045
    Likes Received:
    0
    GPU:
    EVGA GTX 580 + AC Xtreme
    http://www.youtube.com/watch?v=7aUzZYM59C8

[IMG]

    and last but not least,

[IMG]
     
  20. Alex Vojacek

    Alex Vojacek Member Guru

    Messages:
    160
    Likes Received:
    0
    GPU:
    XFX GTX295
So basically you're saying that your card at 66 degrees, with the fan staying at 27% speed, is the SAME as the Fermi card Hilbert has at 68 degrees with the fan at almost 70%, right?

    You really have to take THAT into account when you say "power efficiency my ass".

    It seems much more power efficient to me from the FAN figures alone (and I'm not even counting the fact that the cooler and fan on the GTX 480 are much more powerful than the ones on the 5870).
     
