Hey guys, I did a quick search but didn't find what I needed to know, so here goes ...

I have SLI Gigabyte GTX 460 OC 1GB cards in my PC, and the temps from one GPU to the next are vastly different when running anything that operates in SLI. I used EVGA Precision to monitor the results over a 10 min duration each. Room temp is 22C.

@idle:
GPU1 runs 40-45C .. Fan @40%
GPU2 runs 32-35C .. Fan @40%

@100% load:
GPU1 runs 95-99C .. Fan @100%
GPU2 runs 60-65C .. Fan @60%

Edit: Individually, the cards run 70-72C with fans @65%.

My second test was to see how the temps go at idle with fans @100%. The result, I must say, was more than strange ...

@idle and fans @100% (10 min run):
GPU1 .. decreased to 39C, then went up to 48C after 5 mins
GPU2 .. decreased to 29C, then went up to 37C after 5 mins

Umm .. WTF? During this test, the PC / GPUs remained 100% idle.

Edit: Setting the fans back to 40% returned the GPUs to normal idle temps.

Anyone care to explain this strange outcome? Now I do understand that GPU1 will be hotter due to drawing heat from GPU2, restricted airflow between cards, etc. Swapping the cards around gave the same result. But should it really be 30-40C higher @100% load than GPU2?
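In case anyone wants to reproduce the comparison, here's a rough sketch of how I'd summarize per-GPU temperature samples from a 10-minute run. This is just an illustration, not what EVGA Precision does internally; the idea is you'd feed it `(gpu_index, temp_c)` readings however you log them (e.g. from `nvidia-smi --query-gpu=index,temperature.gpu --format=csv,noheader` on platforms/drivers that support it, or a Precision log export):

```python
def summarize(samples):
    """samples: list of (gpu_index, temp_c) tuples.
    Returns {gpu_index: (min, max, avg)} so the two cards can be compared."""
    per_gpu = {}
    for gpu, temp in samples:
        per_gpu.setdefault(gpu, []).append(temp)
    return {g: (min(ts), max(ts), sum(ts) / len(ts)) for g, ts in per_gpu.items()}

# Example readings mirroring the idle numbers above (hypothetical sample points):
readings = [(0, 40), (0, 43), (0, 45), (1, 32), (1, 34), (1, 35)]
for gpu, (lo, hi, avg) in sorted(summarize(readings).items()):
    print(f"GPU{gpu + 1}: min {lo}C, max {hi}C, avg {avg:.1f}C")
```

Running something like this over both the idle and load logs makes the GPU1/GPU2 gap obvious at a glance instead of eyeballing the Precision graph.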