Discussion in 'Folding@Home - Join Team Guru3D !' started by pint, Jul 4, 2008.
Yeah I know, but I can't afford Vista atm, so it's not really easier or more profitable for me to upgrade.
I see... But if you have the money, don't hesitate - I think the time is just right to get it, with SP1 and all. Maybe you could get yourself an Upgrade or System Builder version (much cheaper)?
Thing is, if I bought Vista I'd need to buy 4GB of ram and new hard drives too, I'm still using the same IDE drives I had 5 years ago lol
Then again when Stalker Clear Sky comes out I'm gonna want to play it in DX10 I think.
For now though I'm hoping a new client might reduce CPU overhead a little.
What kind of ppd are the new Radeons going to put out with an updated client I wonder..
Yeah, I also heard that with the 4800 series, not all of the 800 SPs are fully utilized at the moment; perhaps in the future Folding@home will use all of the SPs. They're also still working on a multi-GPU client. Two 4870 X2s would be a funkin' killer if all the SPs were utilized. That would be something.
Yup, they're using 320 out of the 800 SP's 4800 series cards have.
My 4850 did nearly 2000 PPD with some WU I don't remember right now, but it hovers around 1600-2000 PPD.
If they're doing that with 320 SP, with 800... a peak of 5000+ would be possible. I hope they get that code working soon...
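As a rough sanity check on that estimate, here is the back-of-the-envelope arithmetic spelled out (a sketch only - it assumes perfectly linear scaling with SP count, which a later post rightly points out won't hold in practice):

```python
# Back-of-the-envelope PPD projection, assuming PPD scales
# linearly with the number of shader processors in use.
# (Real scaling will be worse than linear.)

def scaled_ppd(current_ppd, sps_used, sps_total):
    """Project PPD if all shader processors were utilized."""
    return current_ppd * sps_total / sps_used

# HD 4850 figures quoted above: ~1600-2000 PPD using 320 of 800 SPs.
low = scaled_ppd(1600, 320, 800)   # 4000.0
high = scaled_ppd(2000, 320, 800)  # 5000.0
print(f"Projected PPD range: {low:.0f}-{high:.0f}")
```

That's where the "peak of 5000+" figure comes from: 800/320 is a 2.5x increase over the current numbers.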
Long time no see, Dr.
Yeah, these are old WUs from the 2xxx/3xxx line, and don't forget it doesn't scale perfectly, but 4-5k PPD is a reasonable estimate. Currently the NVIDIA PPD is way off anyway - yet to see them fix this.
I'm getting 1098 PPD on my 2600XT, so I would think 5000 PPD on the 4800s is pretty likely... maybe more.
Why the heck can't we control how many shader units to use? I mean, don't they want more calculated data flowing in? 320 shaders is lame considering we have almost 3 times more in HD4800 cards...
You can't, because the current core and work units only 'know' about, or are designed to use, up to 320 units. That's why they're working on the new cores and work units.
Also, the drivers play a large part with the folding cores too, and they have to be updated - and since it's ATI, that's monthly (usually).
Some reading material:
8800GT vs HD4870
4870 GPU2 PPD
ATI vs Nvidia PPD vs architecture
No, really? I was wondering why they can't just make the thing use all the shader units available. 320 shader units is just a current limitation that could be eliminated by updated folding apps...
Well they can, but their development team is rather small.
How do I see how fast I'm folding?
Welcome to the forums.
To see how fast you are folding, download FahMon - it will tell you how many PPD you're getting on all your clients.
Not technically true. ATI use 5 worker threads per shader processor, whereas nVidia use one worker thread per shader processor...
So technically, if you divide 800 by 5, you get the true number of shader processors compared with nVidia, which is only 160.
I don't know why ATI multiply the actual number by the number of worker threads per processor, but they do.
If there was software capable of properly taking advantage of this, then they'd probably steam into an even bigger lead. I suppose it would be up to the software designers to program the folding clients to take advantage of all the extra processing headroom.
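To put numbers on that worker-thread point, here is the division spelled out (a sketch of the arithmetic as described in the post above, assuming ATI's marketed count bundles 5 units per processor while nVidia counts each processor individually):

```python
# Marketed vs "independent" shader processor count, per the
# 5-threads-per-processor description above (an assumption
# from this thread, not an official spec comparison).

ATI_MARKETED_SPS = 800       # HD 4870 marketed figure
THREADS_PER_PROCESSOR = 5    # worker threads per ATI shader processor

independent_units = ATI_MARKETED_SPS // THREADS_PER_PROCESSOR
print(f"Comparable independent units: {independent_units}")  # 160
```

So on that accounting, the HD 4870's 800 marketed SPs come out to 160 units when counted the way nVidia counts theirs.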