Plugged my HDTV via HDMI into my 6950 and straight away the card's voltage increased from 0.9V to 1.0V and temps rocketed from around 40C to 60C while idle. Is this a software glitch?
I hate that about recent ATI cards: multi-monitor setups and HD playback both force the clocks up (from 250/150 MHz core/memory to 500/1375 MHz), hence the higher temps and power consumption. I like my card, but because of this I'm thinking of switching to a GTX 580, since I leave my PC idle most of the time.
As noted, the increased voltage and temperatures are due to the increased clocks when running a multi-monitor setup. It's a bother, but there's not much you can do about it, save possibly forcing a lower-clocked profile. Presumably the increased clocks are there to combat flickering, but I've never noticed any. *shrug*
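If you're on Linux with the open-source radeon KMS driver, forcing a lower-clocked profile can be sketched like this. This is only a rough illustration, not a guaranteed fix: the `card0` index is an assumption (check `/sys/class/drm` for your actual card), you'll need root, and whether the driver honors the `low` profile depends on your kernel version.

```shell
# Hedged sketch: force the radeon driver's low-power profile.
# Assumptions: open-source "radeon" KMS driver, card0 is your 6950.
PROFILE=/sys/class/drm/card0/device/power_profile
if [ -w "$PROFILE" ]; then
  # Accepted values for this sysfs file: default, auto, low, mid, high
  echo low > "$PROFILE"
  STATUS="profile set to low"
else
  STATUS="power_profile not writable (need root, or driver lacks support)"
fi
echo "$STATUS"
```

You can then check the resulting engine/memory clocks in `/sys/kernel/debug/dri/0/radeon_pm_info` (debugfs must be mounted). On Windows, the closest equivalent would be a third-party clock profile tool rather than anything built into the driver.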
That's a shame... so I'm looking at much higher power usage just because an HDTV is plugged in. Let's hope it gets fixed soon; I'll just keep the second display disabled when not in use.
The most annoying part is that the clocks don't go down after you disable the second monitor. Only a restart in a single-monitor configuration will ensure lower clocks at idle.
Mine seems to go back to default clocks/voltage on disabling the second display; otherwise I believe you can use Overdrive to set it back to default. Yeah, I really like the card so far.
Mine don't go back down to 250/150; they get stuck at 500/1375, and Overdrive won't let me set them as low as 250/150 either.
Having the same issue with my card as well. Really annoying, because if I forget to disable my TV display my temps can creep into the 100s; this is dependent on fan speed, of course.
That doesn't sound right regardless; you should never be hitting those temperatures! I idle around 40C with just one display enabled, and while adding one or two more does increase idle temperature and ramp fan speed somewhat, I don't hit 100C even under full load, or anything close to it.
I agree with the above poster, there's definitely something wrong there: either your case needs better cooling or you have a bad card.