Ridiculous, but it seems like my system is PCIe bus limited.. look at the PCIe traffic. Gonna try lowering texture settings, maybe it's run out of VRAM.. 1080p, Ultimate, FXAA, GTX 460 1GB @900/4200MHz, Q9650 @4.25GHz ------------ Yep, it was all about the VRAM: with texture quality @low, PCIe usage backed down to a normal level and VRAM usage was 660MB.. not to mention avg FPS just about doubled.. 1080p, Ultimate, FXAA, TextureQ Low, GTX 460 1GB @900/4200MHz, Q9650 @4.25GHz
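The fix above makes sense because dropping texture quality usually halves texture resolution in each dimension, which quarters per-texture memory. A rough back-of-the-envelope sketch (the DXT5-style 1 byte/texel compression rate and the texture sizes are my assumptions for illustration, not the game's actual assets):

```python
# Approximate VRAM for a square mipmapped texture.
# Assumes DXT5/BC3-style compression at ~1 byte per texel; an
# uncompressed RGBA8 texture would be 4 bytes per texel instead.
def texture_mb(side, bytes_per_texel=1.0):
    """Size in MB including a full mip chain (~4/3 of the base level)."""
    base = side * side * bytes_per_texel
    return base * 4 / 3 / (1024 ** 2)

print(f"2048x2048: ~{texture_mb(2048):.1f} MB")  # hypothetical 'high' texture
print(f"1024x1024: ~{texture_mb(1024):.1f} MB")  # same texture one step down
```

Quarter the memory per texture across hundreds of textures adds up fast, which is consistent with the 660MB reading fitting back inside a 1GB card and the PCIe traffic (texture swapping over the bus) going away.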
1080p, Ultimate, FXAA, TressFX Off, TextureQ High, GTX 460 1GB @900/4200MHz, Q9650 @4.25GHz, ForceWare 314.07 (HQ) 1080p, Ultimate, FXAA, TressFX Off, TextureQ Ultra, High Precision Off (this setting is a VRAM hog; without it, Ultra textures aren't a problem at all!), GTX 460 1GB @930/4600MHz, Q9650 @4.25GHz, ForceWare 314.07 (HQ)
1920x1200, Ultimate, TressFX On, FXAA. Min: 22.6 Max: 47.2 Average: 39.2. Is this normal for my setup? Seems really low lol.
It definitely does use a lot of VRAM. I'm showing well over 4GB being used on Ultimate setting with 4xSSAA on.
Win 7 x64 SP1 with i5-2500K @4.5GHz and GTX 480 @reference-stock - 313.96 beta (HQ) 1080p - Ultimate quality min: 15.1 fps max: 35.0 fps avg: 26.4 fps But a newer GeForce driver may improve this soon...
The game runs surprisingly well on an old video card that I'm using because it's faster in Maya (blame Nvidia/ATI for having bad drivers). Core i7-3770, GTX 275, 16 GB RAM. Settings: 1920x1080, Medium, AA turned off, everything else at normal. Min FPS: 30 Max: 77 Average: 62
Similar setup here (kind of): 1920x1080, Ultimate, TressFX On, and every setting at max. Min: 1fps (seems too low, not sure why...) Max: 42.7fps Average: 33.6 Other than my minimum, yours seems about OK. I am running Catalyst 13.1 WHQL. Overall, the game runs great with no issues at all. Not to mention the awesome gameplay, already 4 hours in!
Found this thread looking for TR benchmarks to compare with, and thought I'd contribute.

3930K @4.6GHz
(2) 7970's @1200/1500, +20% power limit (water cooled)

I ran the benchmark with everything at the max settings, then varied AA and TressFX. Posting a bunch of screenshots seems wasteful, so I'll just summarize as min/max/avg, with truncated decimals.

At my normal resolution of 2560x1600:
No AA / TressFX - 32/89/76
No AA / No TressFX - 52/114/98
FXAA / TressFX - 31/90/76
FXAA / No TressFX - 60/108/95
2xSSAA / TressFX - 18/58/48
2xSSAA / No TressFX - 35/75/64
4xSSAA / TressFX - 0/42/10 (slideshow whenever hair was visible)
4xSSAA / No TressFX - 30/56/48

So TressFX reduces the frame rate by about 25% all by itself. Combined with 4xSSAA, it turns the game into a slideshow, with fairly regular quarter-second-plus intervals of no movement at all. It's such a jarring result that I'm sure something other than extreme processing load is going on there. Graphics memory usage was at around 2.2GB out of 3GB, per Afterburner. I suppose it's possible more was actually in use, and the slideshow was due to extreme texture swapping. On the topic of memory usage, I found that TressFX by itself caused a delta of over 400MB. That can easily push cards with less memory from coping to crippled, even at lower resolutions.

For apples-to-apples, I also ran the benchmark at 1920x1080 with the top two presets plus 4xSSAA:
1920x1080 / Ultimate - 50/151/123
1920x1080 / Ultra - 86/190/163
1920x1080 / 4xSSAA / TressFX - 42/140/116

On a tangent, I want to congratulate the developers for putting a Triple Buffering option in there. But I can't, because it doesn't work properly. With vsync and triple buffering enabled, the frame rate should be continuously variable between 0 and 60. But that's not happening here. As soon as the frame rate dips below 60, it plummets immediately to 40. No, that's not a typo. It's not dropping to 30, which is what you'd expect without triple buffering.
It's dropping to 40. It's such a bizarre result that I can't figure out what's actually going on. On the plus side (for me at least), all I need to do for a steady 60fps is to disable TressFX.
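One arithmetic curiosity worth noting about that 40fps figure (my own speculation, not something the benchmark confirms): 40 is exactly the average you get if double-buffered vsynced frames alternate between taking one and two 60Hz refresh intervals. A minimal sketch of that arithmetic:

```python
# Double-buffered vsync on a 60 Hz display quantizes each frame to a
# whole number of ~16.67 ms refresh intervals. The alternating hit/miss
# pattern below is a hypothesis, not a measured fact.
REFRESH = 1.0 / 60.0  # seconds per refresh interval

def avg_fps(refreshes_per_frame):
    """Average FPS for a repeating pattern of per-frame refresh counts."""
    total_time = sum(n * REFRESH for n in refreshes_per_frame)
    return len(refreshes_per_frame) / total_time

print(round(avg_fps([1]), 1))     # every frame hits vsync -> 60.0
print(round(avg_fps([2]), 1))     # every frame takes 2 refreshes -> 30.0
print(round(avg_fps([1, 2]), 1))  # alternating hit/miss -> 40.0
```

If that's what's happening, the game would effectively be double buffering despite the option being enabled, with the counter averaging alternating 16.7ms and 33.3ms frames to 40fps instead of locking to 30.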
I'm using 13.2 Beta 7 drivers. Run the benchmark 3 times and you will see the min FPS change. The first run always seems to be in the single-digit range.
I get artifacts on Lara's face/skin on both of my PCs (GTX 680). Very noticeable at the start of the demo, running on Ultra settings. Anyone else?
Does the camera get stuck at the beginning of the benchmark, and then clip through Lara's body? Sometimes it happens to me too.
I decided to test the game with my old Radeon 5870, because I haven't seen it in any benchmarks yet. 1920x1080 Ultimate, TressFX On and FXAA.
GPU at 1.225v. VRAM voltage is stock. The card and BIOS: http://www.techpowerup.com/vgabios/62601/XFX.HD5870.1024.091126.html Other stuff: Arctic Cooling Accelero Xtreme 5870, Noctua NT-H1 and Arctic Silver adhesive.
I'm really enjoying this game. I've been running it at Ultimate with TressFX enabled and FXAA rather than SSAA. SSAA has a massive impact on performance, and I've been monitoring it in my screenshots below - with 4xSSAA it's showing VRAM usage as over 4000MB! Surely this is being doubled across the two GPUs and it's really only using 2GB... otherwise this game has one hell of a memory leak lol. CPU at 4.5GHz, GPUs at stock 1050/1500. Here's Ultimate settings with FXAA. Here's Ultimate with 2xSSAA. Here's Ultimate with 4xSSAA.
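Part of that SSAA appetite is easy to ballpark: NxSSAA renders the scene at N times the pixel count before downsampling, so every render target scales linearly with the sample count. A rough sketch (the bytes-per-pixel figure and target count are pure guesses for illustration, not the game's actual formats):

```python
# Rough render-target VRAM estimate at 1920x1080.
# Assumed: 8 bytes/pixel (color + depth) and 4 extra full-size
# G-buffer/post-processing targets -- both numbers are assumptions.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 8
EXTRA_TARGETS = 4

def render_target_mb(ssaa_factor):
    """Approximate render-target VRAM in MB at a given SSAA sample count."""
    pixels = WIDTH * HEIGHT * ssaa_factor
    return pixels * BYTES_PER_PIXEL * (1 + EXTRA_TARGETS) / (1024 ** 2)

for ssaa in (1, 2, 4):
    print(f"{ssaa}xSSAA: ~{render_target_mb(ssaa):.0f} MB of render targets")
```

Even under these made-up assumptions, 4xSSAA quadruples the render-target footprint on top of whatever the textures already use, so a big jump in reported VRAM is expected; a monitoring tool summing mirrored memory across both CrossFire GPUs would then inflate the reading further.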