Discussion in 'General Hardware' started by Brian the King, Sep 5, 2006.
Also notice that you did not pick the same frame to capture FPS on. Look at the soldiers: one screen has them shadowed and the other doesn't. The only real way you can do this is to run a benchmark. It will be too hard to capture the same frame and get results just screen-capturing random scenes.
I'll have to see a better test of bottlenecking before I'll OC any more.
It's the same frame, but when I pressed Num Lock to capture the frame it paused for a sec then resumed. Look at the position of the helmets on the guards; they are the same.
It's not the same frame; look at the shadows. No one is going to be able to capture the exact same frame in two different runs. You need a better benchmark than that.
I'm gonna post my 2 cents then leave.
I've got a novel idea that both of you can ponder for a second. Neither the CPU nor the GPU permanently bottlenecks the other. Look at it like this...
Both have separate jobs, but both are trying to reach the same goal (higher FPS). Or look at it as a two-person team race. At lower res the GPU reaches the goal first, because fewer pixels don't make the card sweat. So the CPU is playing catch-up: drawing wireframes, executing AI routines, physics, geometry, etc. The CPU is holding the team back at lower res. Hence speeding it up helps the team reach the goal faster.
Now up the res to 16x12. Now the GPU is starting to sweat, huffing and puffing: too many pixels to render. But guess what? The CPU's job is still exactly the same at any res. Upping the res does NOT affect the CPU at all. So the CPU is still chugging along at the exact same speed it was at 10x7. The difference now is that the GPU is playing catch-up. Upping the res just threw a 100lb sack on the GPU's back! Now upping CPU speed does jack, because the GPU is holding the team back, while upping GPU speed will really pay off by helping the team reach the goal faster.
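Pappi's race analogy can be sketched as a toy model (every number below is invented for illustration, not measured): per-frame time is set by whichever of the CPU or GPU takes longer, CPU work is fixed per frame, and GPU work scales with pixel count.

```python
# Toy bottleneck model -- all costs here are made-up illustrative numbers.
def frame_ms(cpu_ms, pixels, gpu_ns_per_pixel):
    gpu_ms = pixels * gpu_ns_per_pixel / 1e6  # GPU cost scales with resolution
    return max(cpu_ms, gpu_ms)                # the slower component sets the pace

def fps(cpu_ms, pixels, gpu_ns_per_pixel=10):
    return 1000.0 / frame_ms(cpu_ms, pixels, gpu_ns_per_pixel)

low  = 1024 * 768    # "10x7"
high = 1600 * 1200   # "16x12"

# At low res the CPU (say 12 ms of AI/physics per frame) is the limit,
# so a faster CPU (10 ms per frame) raises FPS:
print(fps(12, low), fps(10, low))
# At high res the GPU dominates, so the same CPU speed-up changes nothing:
print(fps(12, high), fps(10, high))
```

The `max()` is the whole argument in one line: speeding up the component that is already waiting does nothing for the frame rate.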
After going through that ridiculous little story I'm going to have to say Led-head is WAY closer to the truth. Spectacle, you need to read that link he gave nice and slow and let all that info sink in. No offence, Spectacle, but you have no idea how the CPU and GPU affect FPS.
You are ridiculous... It's the same damn part, and the FPS do not vary by more than 3-5 FPS. Excuse the hell out of me for not having a cybernetic eye to capture the exact same damn frame...
You are wrong, you were wrong the whole damn time....just get over it and move on. It happens to everyone sooner or later.
Wow, you only managed to cuss 4 times in that post. No one is going to take what you just did as a benchmark. You could run FRAPS (or whatever it is) to get an avg FPS reading on that video; now that would work, but what you are doing is just screenshots. Also read what Pappi just wrote above you.
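The point about averaging over a whole run instead of grabbing two screenshots can be shown with a sketch (no real FRAPS output involved; the frame times are hypothetical):

```python
# Sketch: why an average over a whole run beats two random screenshots.
# frame_times holds hypothetical per-frame render times in milliseconds.
frame_times = [14, 15, 40, 13, 16, 35, 14, 15]  # spikes in heavy scenes

avg_fps = 1000.0 * len(frame_times) / sum(frame_times)
print(round(avg_fps, 1))  # one number for the whole run

# A single "screenshot" frame can read anywhere between these two extremes,
# depending on which frame you happened to catch:
print(round(1000.0 / max(frame_times), 1), round(1000.0 / min(frame_times), 1))
```

With spiky frame times, two single-frame captures can land 50 FPS apart while the run average barely moves, which is exactly why screenshot comparisons prove nothing.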
Thank God I'm not going crazy in here. This guy was so set on his theory of computer hardware that it made me wonder about my own. Interesting read on the rest of your post.
I read it, and just because he is as clueless as you does not make you right...
Don't you wonder why we are all wrong (including an article on bottlenecking) but you alone are right?
From what I can tell, Spectacle was saying that his 7900GT was being bottlenecked by his Opty even running at 2.8. The second set of screenies he took proved it, because according to Ledhead he would not have seen any improvement by overclocking it to 3.0, and there was a nice lil boost in FPS.
WR3CK, don't even try to come into this. If you go back and read, the very first thing I said is that it very well may go up. The arguments came over Spectacle not wanting to run the bench because, in his eyes, he didn't need to. He created a skewed benchmark and then proved just how ignorant his computer knowledge is to everyone reading this forum. Then he ran a video in FEAR and tried to screen-capture scenes close to each other? That isn't a benchmark; the guy proved nothing and made a complete fool of himself on here. If you go back and read, you will see that in his eyes if you OC the CPU and the FPS go up, then you must have a bottleneck between the CPU and the GPU. My argument the whole time was trying to tell him this isn't true.
Pappi posted the most intelligent thing in this thread in the last 3 or 4 pages.
I really am in just complete and utter awe. Are you really this dense? I showed you the proof, I even did it the way you wanted. Yet still you refuse to believe the simple and obvious facts.
And let me let you in on a little secret... just because it's on the internet does not make it correct. I gave you the hard facts, yet you still refute them. Do you think the FPS went up by magic? No, they went up in both instances because the CPU bottlenecked the 7900GT! Whether by physics calculations or by some divine light, the CPU bottlenecks the GPU at 2.8GHz.
I simply cannot believe you are this dense...are you just f$#cking with me? No one can be this stupid........can they?
The FPS went up when going to 3GHz. I don't care if all it was was the AI and physics calculations being relieved of some pressure. Guess what? The CPU was ultimately able to feed the GPU with more data when clocked higher, thus we saw an increase in FPS... which is known as a [size=+2]BOTTLENECK[/size]
That's what you said, not me. And last time I checked, there was a difference.
Spectacle did show that upping his CPU speed improved frames, true. That means the GPU is not stressed enough to bottleneck the FPS; the CPU is currently still bottlenecking the FPS. Moving to 16x12, I can almost guarantee that FPS will not go up from a CPU speed increase. But the funny thing about games is that some parts are more CPU-intensive or GPU-intensive than others. So in some spots, say in the middle of a firefight with multiple enemies and nades going off and crazy physics going on, the CPU will still increase FPS even at 16x12. But step outside by yourself, where the GPU is the only one working, and I don't care if you up the CPU to 50GHz, your FPS is still not going to increase. The GPU would be the one currently holding the FPS back.
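The scene-by-scene point above can be sketched the same way (all the per-scene costs are invented for illustration): at the same 16x12 GPU cost, a CPU-heavy firefight scene gains FPS from an overclock while a quiet GPU-bound scene does not.

```python
# Toy per-scene sketch (all numbers invented): the bottleneck can move
# from scene to scene within the same game at the same resolution.
def scene_fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)  # slower component limits the frame

gpu_ms_16x12 = 19.0  # hypothetical GPU cost at 16x12, same in both scenes

# Hypothetical CPU cost per frame at each clock speed (GHz):
firefight_cpu = {2.8: 25.0, 3.0: 23.3}  # AI/physics-heavy: CPU is the limit
outdoors_cpu  = {2.8: 6.0,  3.0: 5.6}   # quiet scene: GPU is the limit

for ghz in (2.8, 3.0):
    print(ghz,
          round(scene_fps(firefight_cpu[ghz], gpu_ms_16x12), 1),
          round(scene_fps(outdoors_cpu[ghz], gpu_ms_16x12), 1))
```

Under these assumed numbers the overclock raises FPS only in the firefight scene; outdoors, the GPU cost dominates both clock speeds, so FPS stays flat.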
Pappi, give it up on this kid; he's a lost cause. He knows everything already.
Get a CONROE!!!
God I love yelling that word.
That is what I posted. Also I posted this after he got all excited for no reason.
Dear god... I hope you two (you and Pappi) are not representative of what this forum has to offer... up until now, I actually thought people around here had a clue...
Spectacle, you are the guy saying
not us. The relationship between the CPU and the GPU isn't that simple. One day you'll understand.
No, what I'm saying is that at a given CPU clock (2.8GHz in this example) the CPU is not fast enough to feed the GPU with enough data to realize its full potential, which is why we see an increase in FPS with an increase in CPU clocks while the GPU remained at 600/1600.