Discussion in 'Folding@Home - Join Team Guru3D !' started by yoyo69, Apr 15, 2009.
If a 275 is basically half a 295, what PPD does one produce?? :cyclone:
Yeah, it is one of the two cards in a GTX 295. The GTX 295 basically has two GTX 275s that are clocked at GTX 260 speeds.
I have two GTX 295 cards, currently operating at default clock speeds, which I allow to fold in the background while working. At night I kick in the SLI for a little gaming, but I switch back the next morning to continue folding.
Each GPU is showing 6800 to 6900 PPD in FahMon. That should give you some idea of what the 275 will be able to provide.
I should receive the GTX 275 within the week, so I will find out exactly what the PPD is then.
(Currently getting 2500-3000 PPD with an HD 4870 1024 MB.)
Congrats on the order yoyo69! Please post what PPD you get with the GTX 275. It would be interesting to know. I have a friend in the office who may be in the market for a new video card... the 275 looks like it could be a winner!
Received the GTX 275 today.
It is quite a bit bigger than my ATI cards; I had to move my water pump a little to get it to fit. (Might have to find a different one some time in the future.)
First run of Folding is showing 7500 PPD, so all is looking good. (It might mean you don't pass me quite as quickly in the team rankings now, J_J_B.) :eatme2:
Thanks for the info yoyo69! The GTX 275 sounds nice.
A little competition is a good thing, particularly for this cause!
I won't be passing as quickly as I might, though... my fiancée abhors the "waste" of electricity and won't let me leave my system turned on at night... she also says my computer is too loud. I'll need to settle for about 10 hours of folding during the day (plus the nights that she spends at her sister's house).
Got to admit, that must raise the electric bill quite a bit if 1 kWh is ~25 cents and your PSU is running at full load all night. If you just left it running all day it could cost ~$7.20! =D Just thought you should know! ^
Nah - I did the calculations. It would cost me about $1.25 to run for 12 hrs at my electric rate. My system consumes 650 watts (±10%) with all four GPUs folding.
Skip Dunkin' Donuts in the morning and I'd have it covered!
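For anyone who wants to check the math for their own rig, here is a minimal sketch of that cost estimate. The 650 W draw comes from the post above; the $0.16/kWh rate is an assumption back-solved from the quoted $1.25 for 12 hours, so plug in your own local rate.

```python
def folding_cost(watts, hours, rate_per_kwh):
    """Return the electricity cost of running a constant load."""
    kwh = watts / 1000 * hours  # convert watts to kilowatt-hours
    return kwh * rate_per_kwh

# 650 W system, 12 hours, assumed ~$0.16/kWh (back-solved, not quoted)
cost = folding_cost(650, 12, 0.16)
print(f"${cost:.2f}")  # roughly $1.25 per 12-hour folding session
```

At 25 cents/kWh (the rate guessed earlier in the thread) the same 12-hour session would come to about $1.95 instead, which is why the two estimates in the thread differ.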
OK, the GTX 275 is running sweet (approx. 8000 PPD). I would like to add another card and fold on that too.
What I would like to know is does the second card have to be the same as the first?
I have an 8800GT that I would like to use.
(I am not trying to SLI).
Your 8800GT *should* work fine as an additional folding client in your system. I still consider myself a newbie at this, but give it a try! I read online about someone's experience with folding on an 8800GTX. He said it was running at a rate of 5000 PPD and 87 degrees C. So, keep the additional generation of heat in mind... case temperatures might climb a bit depending on your layout.
Perhaps you could use the 8800GT for dedicated PhysX processing in games that support PhysX too. It could be a win-win!
My 8800GT outputs around 3500-4000 PPD.
The 8800GT is in, but at the moment it will not run work units; it keeps downloading a new WU, runs for a few seconds, then stops and sends results :bang:. Will have to have a play with it tomorrow.
Are you using the proper command line arguments for both your first GPU client as well as this second one? Here are mine... these are the "target" fields in the properties of the shortcuts I use to launch each client.
"C:\Program Files (x86)\Folding@home\Folding@home-gpu\Folding@home.exe" -gpu 0 -forcegpu nvidia_g80
"C:\Program Files (x86)\Folding@home\Folding@home-gpu\Folding@home.exe" -gpu 1 -forcegpu nvidia_g80
and these are the "Start in" fields of those shortcuts on my computer:
Also be sure to assign a different machine ID to each folding client in the F@H configuration screens.
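Putting those pieces together, here is a hypothetical helper that mirrors the shortcut "target" fields above. The executable path and flags are taken from the post; the function name and the example working directories are illustrative, since the original "Start in" paths were not shown.

```python
import subprocess  # for actually launching the clients; see the note below

# Path from the shortcut "target" fields quoted above
FAH_EXE = r"C:\Program Files (x86)\Folding@home\Folding@home-gpu\Folding@home.exe"

def client_command(gpu_index, exe=FAH_EXE):
    """Build the command line for one GPU folding client."""
    return [exe, "-gpu", str(gpu_index), "-forcegpu", "nvidia_g80"]

# To launch for real, run each client from its own working directory
# (the shortcut's "Start in" field), one per GPU, e.g.:
#   subprocess.Popen(client_command(0), cwd=r"C:\FAH\gpu0")
#   subprocess.Popen(client_command(1), cwd=r"C:\FAH\gpu1")
# Each client must also have a unique machine ID set in its own config.
for gpu in (0, 1):
    print(" ".join(client_command(gpu)))
```

The key points the script encodes: a distinct `-gpu` index per card, a separate working directory per client, and (set separately in the F@H configuration) a unique machine ID for each.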
My targets do seem to be correct. The only part I do not have is the "-forcegpu nvidia_g80" - does this have to be on both clients? And why g80, when the GPU for the 8800 is a G92 and the 275 is a GT200?
Below is from my log file.
[06:20:31] Folding@Home GPU Core - Beta
[06:20:31] Version 1.19 (Mon Nov 3 09:34:13 PST 2008)
[06:20:31] Compiler : Microsoft (R) 32-bit C/C++ Optimizing Compiler Version 14.00.50727.762 for 80x86
[06:20:31] Build host: amoeba
[06:20:31] Board Type: Nvidia
[06:20:31] Core :
[06:20:31] Preparing to commence simulation
[06:20:31] - Looking at optimizations...
[06:20:31] - Created dyn
[06:20:31] - Files status OK
[06:20:31] - Expanded 46710 -> 252912 (decompressed 541.4 percent)
[06:20:31] Called DecompressByteArray: compressed_data_size=46710 data_size=252912, decompressed_data_size=252912 diff=0
[06:20:31] - Digital signature verified
[06:20:31] Project: 5766 (Run 5, Clone 403, Gen 301)
[06:20:31] Assembly optimizations on if available.
[06:20:31] Entering M.D.
[06:20:37] Working on Protein
[06:20:38] Run: exception thrown during GuardedRun
[06:20:38] Run: exception thrown in GuardedRun -- Gromacs cannot continue further.
[06:20:38] Going to send back what have done -- stepsTotalG=0
[06:20:38] Work fraction=0.0000 steps=0.
[06:20:42] logfile size=0 infoLength=0 edr=0 trr=23
[06:20:42] - Writing 635 bytes of core data to disk...
[06:20:42] Done: 123 -> 124 (compressed to 100.8 percent)
[06:20:42] ... Done.
[06:20:42] Folding@home Core Shutdown: UNSTABLE_MACHINE
[06:20:45] CoreStatus = 7A (122)
[06:20:45] Sending work to server
[06:20:45] Project: 5766 (Run 5, Clone 403, Gen 301)
[06:20:45] - Read packet limit of 540015616... Set to 524286976.
[06:20:45] + Attempting to send results [May 9 06:20:45 UTC]
[06:20:46] + Results successfully sent
The "-forcegpu nvidia_g80" part only forces the use of the nVidia-specific F@H core rather than the ATI core. It doesn't actually have anything to do with the nVidia card generation/family.
Ensure you have a monitor connected to the 8800GT output and your Windows desktop extended to that monitor. Alternatively, if you go into the nVidia control panel, enable the use of a GPU for dedicated PhysX processing, and select the 8800GT as the PhysX processor (I have heard there might be a drop-down box that lets you choose which card is used), then you might be able to get away without connecting a real or dummy monitor to the 8800GT.
Below was taken from the Folding forum (folding@home)
Re: Fold on GT200 and G80 same computer
by RBIEZE on Sat Apr 11, 2009 10:53 am
The GTX 295 has 240 stream processors and the 8800GTX has 128 SPs.
They won't fold together properly; you'll only be able to fold on the GTX 295.
If you had a GTX 275, 280, or 285 (240 stream processors), they should work no problem.
You should be able to leave in the 8800 for PhysX, but you won't be able to fold on it reliably.
Even if you get it running, the points output will be lower than just running the 295 by itself.
How right it is I do not know, but I will continue to look into this.
In the meantime the 8800GT is being used as a PhysX card.
Each GPU has its own client and does not share anything with the other GPUs, so that doesn't make much sense.
Did you connect a monitor to the 8800GT as J_J_B suggested?
I have a VGA dummy plug on the 8800GT and the desktop extended onto it.
Everything I have done seems to be correct; it's just that the second client will not work correctly, even when run on its own.
I will keep trying with this, as I am sure that it should work.
By accident I now have the 8800GT folding.
After yet another clean install of the Folding client and running through the multi-GPU configuration process (still without any joy on the second client),
I launched the normal client from the programs list by mistake, but this now seems to be working. I will let you know what PPD I get with the 8800GT and GTX 275 over the next few days.
I have been reading on some of the other forums that you can get slowdowns when folding on GPUs with different numbers of shaders. We will see how it goes.