Had stuttering for the first 30-40 minutes of playing, then switched "Lock Framerate" to off in the graphics settings. The stuttering went away, but I had horrible screen tearing. Changed vsync to Adaptive in the game's program settings in the NVIDIA Control Panel, and I've been playing for the past 7 hours with smooth fps and no screen tearing. Ultra preset, rig in profile. The in-game vsync (Lock Framerate) was definitely the culprit for my stuttering; I could feel the fps constantly spiking between 30 and 60. Weird.
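That 30/60 spiking is exactly what plain double-buffered vsync does: a frame that misses one refresh has to wait for the next, so the framerate snaps to the refresh rate divided by a whole number. A back-of-envelope sketch (my assumption about the mechanism, not anything from the game's code):

```python
# Sketch: with double-buffered vsync at 60 Hz, each frame waits for the
# next vblank, so the effective fps snaps to 60/1, 60/2, 60/3...
# Render times hovering near 16.7 ms therefore bounce between 60 and 30.
import math

REFRESH_HZ = 60.0

def vsynced_fps(render_ms: float) -> float:
    """Effective fps when each frame must land on a vblank boundary."""
    interval_ms = 1000.0 / REFRESH_HZ               # ~16.67 ms per refresh
    intervals = math.ceil(render_ms / interval_ms)  # whole refreshes consumed
    return REFRESH_HZ / intervals

for ms in (10.0, 17.0, 25.0, 34.0):
    print(ms, "ms ->", vsynced_fps(ms), "fps")
```

Adaptive vsync avoids the snap-down by simply turning vsync off whenever the frame rate falls below the refresh rate, which is why it kills the stutter without bringing the tearing back at 60 fps.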
Yes indeed.. I love BioShock.. I remember how much I loved it when the first one came out; I played it for hours. Such beautiful environments, and the way the game unfolds.. man!
I had none, even on PS3. I think your PC is just too weak; time to put some money into it. I thought the Dutch government fed poor PC owners.. no?
You got this game twice? Or just on PS3, which probably can't even render a framerate high enough to start tearing in the first place XD
Anyone else noticed fps drops and a minor freeze while entering new locations? It freezes and drops fps only for a second, then it's back up. I think this happens only when v-sync is enabled.
Here it is: in-game benchmark with my specs at 1680x1050, Ultra, DX11, DDOF:

Scene Name | Duration (s) | Avg FPS | Min FPS | Max FPS
Overall    | 81.64        | 52.81   | 11.75   | 120.72
Also, does anyone have an AA compatibility flag, or know how to force AA through the NVIDIA driver so it works in the game? I think it's Unreal Engine, so there should be an AA flag by now?
VRAM usage hovers well over 2GB. It peaked at 2749MB according to AfterBurner. This game uses more VRAM than Tomb Raider. Tomb Raider uses around 2.4~2.8GB. But that's with 4xSSAA while BioShock is using mere FXAA. With SSAA, BioShock definitely will go over 3GB.
Is that GreenManGaming? I may well switch to digital only and use them, particularly as I'm due to get a 100 Mbps connection at some point in the next 18 months which would mean a ten-fold increase in download speeds for me on average. At that point there would be little point in buying retail disc games plus I'm finding myself running out of space to store things as I live in a small two-bedroomed house.
Yes, twice lol. And you should see it first: the PS3 renders it nicely and at a constant FPS, and there's even AA.. looks good. Not as good as PC, of course, but it looks like the developer did a great job. 10/10, even on console.
It's an Unreal Engine 3 game that uses DX11 so the normal flags are not going to work. You'll either have to use an SMAA injector and/or downsampling from a higher resolution if you want better AA. It may be possible to force the game to run under DX9 with a compatibility flag but that will surely come with a downgrade to the visuals.
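For the downsampling route, the idea is just to render at an integer multiple of your display resolution in each axis and let the driver scale it back down, which works like supersampling without needing any compatibility flag. The resolutions below are plain arithmetic based on the 1680x1050 quoted in this thread, not settings the game itself exposes:

```python
# Sketch: custom resolutions for ordered-grid downsampling (a flag-free
# form of supersampling AA). Factor 2 means 2x2 = 4 samples per pixel.

def downsample_res(width: int, height: int, factor: int) -> tuple[int, int]:
    """Custom resolution for factor x factor downsampling."""
    return width * factor, height * factor

native = (1680, 1050)  # the resolution quoted earlier in the thread
for f in (2, 3):
    w, h = downsample_res(*native, f)
    print(f"{f}x{f} downsampling: render at {w}x{h}")
```

Whether your card can actually drive those custom resolutions (and the VRAM cost of the larger framebuffer) is another matter, especially given how much VRAM this game already uses.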
Very nice artistic direction. I'll give you that. But technically, this game can be very ugly. This is supposed to be water.
Yep, just realized. Well, no more AA for me; seems the game eats 1.9 GB of VRAM, holy sh-it. It's just like I'm playing Skyrim with 4 GB of texture mods...
1680x1050, everything on Ultra + AA: 2021 MB is the highest VRAM usage I've seen so far.. I think AMD did "something", because the graphics are clearly worse than Crysis 3's.
I suspect the game is pushing a lot of high-res textures and that is the reason for the high VRAM usage. Of course, the high-end AMD graphics cards have 3 GB, so it may well be that the developers were told to maximise the game's highest settings for 3 GB cards, thus leaving NVIDIA's 2 GB high-end ones looking comparatively weaker. That said, someone said earlier that the high VRAM usage isn't the reason for the stutters/framerate dips (it's probably the game streaming in those HD textures, actually).
According to the PCGamesHardware.de article I linked to, it's related to shadows or ambient occlusion. Specifically, Very High and Ultra are not available unless you use DX11, and then you get a filter method plus contact hardening, which softens the shadow edges further. You can also enable an alternate depth of field effect which is much less blurry (it doesn't blur the entire distant view), and ambient occlusion relies on HDAO and the higher-quality HDAO-HQ mode. (FXAA can use some shader model 5 support as well.) EDIT: The game uses about 8 GB of texture data in total, and it uses a rather large cache for the streaming system as well, so that could also be causing a bit more VRAM usage than what we normally see with the Unreal Engine. (As stated in the above post.)