Dreen, I'll play with VSYNC on, man. I love playing at a 30 FPS cap since I'm already used to it from consoles. It's easier on the eyes too. Am I wrong?
I'm used to 30 fps now in the game, and it resolves all of the performance issues I was having before: no more hitching, 2-second freezes, or drastic framerate drops when opening chests and digging (you reading this, Neo Cyrus?). So yes, it's fine for a game of this type, as it isn't fast-paced except for the combat. What isn't so great is the nasty motion blur, which is really hard on the eyes and can be seen in some of the screenshots I've taken. Both AA and motion blur are part of the Effects setting, though, which I prefer to leave on its max of High. I can't lose one without sacrificing the other, unfortunately, so I'll have to live with it for now.
Just do what I mentioned earlier: disable the in-game vsync, which caps it to 30fps, and force vsync through nVidia Inspector (3rd-party program, great stuff, use it). Also turn the shadows to minimum; they look fine and take a lot of stress off the GPU compared to max shadows. Really poorly done programming. I have a question for Bigtime as well: what framerate are you getting that seems smooth to you? Edit: Nevermind, you already said you capped it at 30fps. That would drive me nuts.
I've already tried what you suggested. I knew about it before I installed the game, so the first thing I did was disable in-game v-sync, then later on, after a brief play, forced it via D3DOverrider. Big mistake, as that led me to believe the game was buggy: I was getting massive framerate drops and 2-second hitching. If I re-enable v-sync in-game then the game runs perfectly. The only conclusion I can draw from that is that the engine isn't coded to run at 60 fps, which results in the issues that people are reporting unless you lower the shadow and draw distance settings. I can play the game with EVERYTHING maxed and the framerate stays at 30 fps even in packed towns. I'd rather have consistent performance than 60-to-20 fps dips that require me to press START twice to fix. Maybe Microsoft will fix the issue, but somehow I doubt it, as the engine seems designed to run at 30 fps. The 30 fps cap doesn't bother me; it's the horrid motion blur that is the big issue.
P.S. Does the temporal AA still work with v-sync disabled? I thought it needed a fixed 30 or 60 fps in order to work properly. I'm sure when I was playing it without v-sync there were loads of visible jaggies which aren't there when playing at a v-synced 30 fps.
Not being coded to run at 60fps is meaningless if you stop and think about it. The only thing it wasn't coded to be is optimized, which is why you get frame drops, unless something is really glitched with forcing vsync. They put in the 30fps cap for consistency, as you mentioned; most games have some sort of auto-smooth point. The average Joe won't notice or even care that a game is running at freakin' 20fps, but he will notice if it's constantly fluctuating from 40 to 20 and will complain that it's choppy. Anyway, I'll disable vsync at a point where I get under 60fps and let you know the difference. Maybe D3DOverrider just doesn't work properly with it or something; either way you should be using Inspector, it's a very handy program. I use RadeonPro for this card, which isn't half as useful as Inspector. On another note, does it have its own filtering which we can't disable? That would explain why it glitches if I try to force any filtering on it aside from MLAA.
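The "cap for consistency" idea is just a frame limiter: instead of letting frame times swing between fast and slow, you sleep each frame up to a fixed deadline so pacing stays even. Here's a minimal Python sketch of that concept (purely illustrative — this has nothing to do with Fable's actual engine code or the driver-level vsync being discussed; the function name and structure are made up for the example):

```python
import time

def run_capped(frame_work, target_fps=30, frames=10):
    """Run frame_work once per frame, sleeping so that frames never
    complete faster than 1/target_fps seconds (a simple fps cap)."""
    interval = 1.0 / target_fps
    timestamps = []
    next_deadline = time.perf_counter()
    for _ in range(frames):
        frame_work()  # simulate the rendering work for one frame
        next_deadline += interval
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)  # fast frame: wait out the slack
        else:
            # slow frame: reset the deadline instead of "catching up",
            # which would cause a burst of rushed frames afterwards
            next_deadline = time.perf_counter()
        timestamps.append(time.perf_counter())
    return timestamps
```

The point is that as long as the hardware can always hit the cap (here, 30fps), every frame lands about 33ms apart regardless of how cheap or expensive the work was, which is exactly why a steady 30 feels smoother than a 40-to-20 wobble.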
There is an AA compatibility flag for Fable III which someone posted in the NVIDIA driver section that allows forced AA. I'll add it later with Inspector, but otherwise why do I need it for Fable III, Neo Cyrus? The driver game profiles already contain an entry for Fable III, so I can force v-sync through that, if that is what you mean? There's no need to use Inspector for that unless you're creating a new profile or adding an SLI/AA flag.
Well, I didn't know how well the nVidia default stuff could force vsync; in the past, forcing settings wouldn't work half the time for me, so I got in the habit of forcing settings through 3rd-party programs. AMD has no profiles in their current drivers, so I don't have any option but to use a 3rd-party program if I want profiles. I tested it out, and the result: there's nothing wrong with forcing vsync for 60fps on it. I went to a spot where it got 20-22fps and the result was the same no matter what I did. In-game vsync, forced vsync, no vsync: they all got the same result in terms of frame rate... well, almost. The in-game vsync actually got 2fps less, it was getting 18-20fps, but I may have shifted over a step or something. The tearing without vsync is atrocious.
I get a constant 30 fps with in-game v-sync, Neo Cyrus, and haven't seen it dip below that. I only see drops to 20 fps when opening chests/digging if I disable in-game v-sync and force v-sync outside of the game (so it hits 60 fps). Weird bug, whatever.
The game is fairly easy until you get closer to the end. I beat it on the 360 without being knocked down once. Just make sure you map your health potion to a key/button. The balverines are a bitch to kill, especially in packs of 4 or 5. Learn to use your evade ability well.
Guys, when I disabled VSYNC I didn't see much screen tearing, freezing, or other stuff... hmm. Anyway, I'm at work right now; when I get home I'm gonna play this like there's no tomorrow... I hope.
Well, I obviously had to hunt for a spot to get frame rates that low. I forget the town names — the Bowerstone Industrial area, I believe, facing the water where it shows the great majority of the map, so it has to render pretty much everything. Even at 1GHz my 6970 might just be that much weaker than a GTX 580; the 6970 is about on par with a GTX 570 at its default 880MHz. That, and of course the nVidia drivers are at least somewhat optimized for the game, while the AMD ones aren't at all. Though I don't get drops like that when opening chests or forcing vsync. It's only if I get a good view of the majority of a map, so it has to render a ton of stuff. Throughout most of the game I get 60fps. Balverines are the only thing I encountered that are threatening at all, and that's only in large groups. Trying to beat them with a sword is just too annoying, so I tend to use magic to wipe them out easily. Charged Fire/Lightning is effective against pretty much anything.
I just turned everything to lowest, and surprise, the game looks the damn same minus the bullcrap effects like the bloom, blur, and some unknown form of filtering. In front of the thingy to check your stats, where I would get like 120fps on max, I now get 230+. Everything is clearly jaggier (meaning we were right, it is using some form of filtering) and the annoying blur and bloom are gone. Microsoft went from dicks to super dicks with this one; the settings are not at all what they say they are. When I increase the "texture" setting I expect the texture resolution to increase, not some BS effect to be enabled.
Edit: Nevermind, it turns out I'm blind. It clearly states under the effect setting that it is for AA, bloom and depth of field. Would it have killed them to have separate settings for those things? Bah.
I don't know about the PC version yet since I don't own it, but in the 360 version you can change the settings so that a little orb floats around the screen representing other players in the same area as you. You can interact with them at any time and request that they join your game, or try to join theirs. And co-op was done much better than it was in Fable 2.