Hi everyone. This is not an idle vs. load comparison; it's an idle vs. idle comparison. I'm not actively using the computer, but it's running in the background: for example, I use my sound card as a mixer for my speakers, I'm downloading something, or I've left it on overnight. I run my CPU with all power-saving options disabled, so my 4790K is currently running at 4.6 GHz @ 1.22 V, and my GTX 1070 is running at 2076 MHz core and 9332 MHz memory, at LOCKED frequencies. Assuming CPU usage is at 1% and GPU usage is at 1%, how much electricity am I wasting per month compared to having the power-saving options enabled at idle? My CPU at idle @ 4.6 GHz draws 28 W to 32 W, and at 100% load it draws 104 W maximum. What would the GTX 1070 draw at idle with locked voltage and locked high frequencies? And what's the difference between 0% and 100% load? I used Open Hardware Monitor to check the CPU, but I can't check the GPU with it.
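For a rough estimate of the monthly waste, you can just multiply the extra idle draw by the hours the machine is on. As a sketch, suppose the extra draw with power saving off is 15 W and electricity costs $0.15/kWh; both numbers are assumptions, so plug in your own:

```python
# Rough monthly cost of extra idle power draw.
# extra_watts and price_per_kwh are assumed values, not measurements.
extra_watts = 15            # assumed extra idle draw with power saving off (W)
hours_per_month = 24 * 30   # machine left on around the clock
price_per_kwh = 0.15        # assumed electricity rate (USD per kWh)

extra_kwh = extra_watts * hours_per_month / 1000  # convert Wh to kWh
cost = extra_kwh * price_per_kwh
print(f"{extra_kwh:.1f} kWh extra, ~${cost:.2f}/month")  # 10.8 kWh, ~$1.62/month
```

So even a fairly pessimistic delta works out to a dollar or two per month; the GPU's idle delta would add on top of that.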
There's a Power Consumption sensor in GPU-Z; you should check it there. Mine rose by about 2% when I applied an overclock at idle.
Only 2%? Almost as if downclocking would be a little... useless. Just buy the more efficient card and be done with it.
To get a real idea instead of guesswork, the best approach would be to get an external power meter and measure the actual draw at the wall.