I don't have the game yet, but here is a fine specimen: 3930K @ 4.5, 16GB RAM, 2x Gigabyte Titans, 2560x1600, Ultimate settings, 2xSSAA.
I think it's the supersampling AA's fault; that thing is a killer in any game at the moment. Check Sleeping Dogs, Sniper Elite V2, The Witcher 2, ...
It's clear that if you enable SSAA (2x or 4x) at 2560x1600 with all settings maxed, this will be normal (and we don't know yet if the SLI profile is ready). FXAA could be enabled anyway.
Surely the point of the benchmark is to run it at the same settings as everyone else? i.e., 1080p Ultimate. Here's mine: 33.6, up 1 fps from when I ran it 10 hours ago.
These are my benchmark results on Catalyst 13.2 beta 7:
1080p, Vsync on, Ultimate settings (TressFX on)
1080p, Vsync on, Ultra settings (TressFX off)
Note: the difference between Ultra and Ultimate is just TressFX on/off.

IMO, even though TressFX looks really cool and awesome, it comes with an unnecessary performance hit on mid-range graphics cards (unoptimized build/drivers?). The game also has a very unpleasant way of rendering the hair on Lara (and NPCs): it doesn't lie on the characters' shoulders, but floats over the invisible boundaries (3D mesh) of the body. Not cool, not cool at all. This one little flaw makes the feature kind of defective, because it's there and you can see it all the time while playing.

PS. If anyone has found out what the "high precision" option actually does, please let me know. It costs me 4-8 FPS even with TressFX off and I don't see any difference. Any help is much appreciated.
Ultimate settings, FXAA, triple buffering, TressFX on, 1920x1080
Ultimate settings, FXAA, triple buffering, TressFX off, 1920x1080
HD 6990, Ultra and Ultimate (TressFX off in Ultra, on in Ultimate), at 1920x1080 and at 5992x1080 Eyefinity with bezel compensation:
1920x1080 Ultra:
1920x1080 Ultimate:
5992x1080 Ultra:
5992x1080 Ultimate:
I think something is wrong on my end. Ultra with 4xSSAA vs. Ultra with AA off: I get 48 fps with 4xSSAA and 49 fps with no AA. I thought SSAA was like Ubersampling in The Witcher 2, or am I wrong?
Upgraded from 2x 4870 to 2x 7970 on an old system; works well. Settings on Ultimate, 1080p.
QX9650 @ 4.02 GHz w/ M80 Corsair cooler
Sapphire 7970s in CrossFire @ 1050/1500
8GB RAM @ 1600 (stock)
Hey jeffokada81, are your video cards both ATI GHz Edition cards, or are you just overclocking normal 7970s? I'm looking to replace my 5870 after this game crushed my video card, and your setup looks promising.
I flashed the GHz Edition firmware onto my reference cards. You may see artifacts in some games: I didn't have any problems in Crysis 3, had minor artifacts in BF3, and a lot in Dead Space 3. None in Tomb Raider. =) I just use CCC to turn the clocks back to 925/1050 when the video gets too annoying. BTW, I was able to find both cards for $350 each if you look hard enough.

Edit: With all this talk about overclocking, I went ahead and changed my clocks to 1100/1800 with +4% on the power limit, and I got a max of 129 fps in the bench. A 7 fps increase for a slight tweak ain't bad. =) (Using MSI Afterburner; Trixx is garbage.)
@r9800pro, please resize the images in your post to a max width of 1280px.

Edit: lol, morbias had already done that while I was typing! :infinity: