I have been browsing through the last few pages of this thread wondering what scores all the beastly systems these days are getting (I've been considering upgrading) and, um, well, I am surprised to say the least. Did anybody miss this back on page 15? Now, did I do something wrong, or is everybody getting some ****e scores?
Oddly, I get some very inconsistent results: they tend to get lower the more times I run the test, until I exit and re-enter the game. Vista, dual monitors, highest resolution, all options enabled.
Update the game to v1.08, or go and find the .cfg file that holds the settings and open it with Notepad. Search for the resolution entries and replace them with the resolution you want.
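If you'd rather script that edit than do it by hand in Notepad, something like the sketch below works. Note the key names (`ScreenWidth` / `ScreenHeight`) are placeholders, since the post doesn't say what FEAR's .cfg actually calls them; check your own file and adjust.

```python
import re

def set_resolution(cfg_text, width, height):
    """Rewrite resolution entries in a settings file's text.

    "ScreenWidth"/"ScreenHeight" are assumed key names; substitute
    whatever keys your .cfg really uses.
    """
    # Replace the numeric value after each key, keeping the key and
    # whitespace (captured in group 1) untouched.
    cfg_text = re.sub(r'(?m)^(\s*"?ScreenWidth"?\s+)\d+',
                      r'\g<1>%d' % width, cfg_text)
    cfg_text = re.sub(r'(?m)^(\s*"?ScreenHeight"?\s+)\d+',
                      r'\g<1>%d' % height, cfg_text)
    return cfg_text

sample = 'ScreenWidth 1024\nScreenHeight 768\n'
print(set_resolution(sample, 1600, 1200))
# ScreenWidth 1600
# ScreenHeight 1200
```

Back up the original .cfg before writing anything back, in case the game rejects the edited file.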
New card, new scores! Maxed @ 1024: 72 / 166 / 421. Then I went back in and stuck on super-sampling: 72 / 166 / 408. I always thought it was my CPU holding my minimum back; just goes to show how GPU-dependent FEAR is. Just for the hell of it, this was 1280x960 with 4xSSAA and 16xAF: 66 / 141 / 380.
I upgraded my PC. Make no mistake about it, no tricks here, just good honest benchmarks. Maximum / Maximum / super-sampling ON / High Quality in driver settings. Min: 92, Avg: 188, Max: 556. http://img210.imageshack.us/img210/893/fearpe0.jpg :bounce:
AMD X6 1055T @ stock, but I'll try testing it overclocked to 3.3GHz tomorrow with a newer GeForce driver. GTX 480 with 260.99 WHQL (the latest to date). I can't improve performance from the 9800 GT to the GTX 480 at 1024x768 because my current CPU is the bottleneck, but I did test at 2048x1536 with 4xAA/16xAF, running Windows XP x86 SP3: the old spec was a C2D E6600 with the old 9800 GT on the 180.48 WHQL driver at Quality (January 28, 2009); my current spec uses HQ in the GeForce driver (January 4, 2011). I think the AMD X6 1055T may be a bit slower than the C2D E6600.
..... Holy f*cking thread revival, Batman! Pretty sure it's safe to say that pretty much ANY relatively recent hardware would get over 30 fps minimum...
Sorry for the late reply. I got a crash to desktop while entering the game on FEAR 1.08, so I'm keeping 1.0 to avoid the crash. These results are with all options unchecked (sound, music...), running an AMD X6 1055T @ 3.28GHz on XP x86 SP3 with GeForce 285.58 WHQL, HQ: at 2048x1536 with 4xAA/16xAF, everything maxed; and at 2560x1920 with 4xAA/16xAF, everything maxed. Wow! I think my GTX 480 (CPU roughly an Intel 2500K/2600K "simulated") is almost 4x faster than my old 9800 GT at 2048x1536 with 4xAA/16xAF.
HA, did this just for sh*ts and giggles!!! Here is my test, everything maxed out @ 1080p, 4xAA, 16xAF. HAHAHA, made me laugh when I saw the results!!! I remember struggling to run this game at even medium settings on my VERY old single-core 2.66GHz Intel Celeron (yes, CELERON! lol), 1GB DDR400 RAM, and a 256MB Nvidia GeForce 6800 GT/Ultra (can't remember if it was the GT or the Ultra version; it was the Doom 3 edition, with the old CGI tiger on the box). Seriously underpowered CPU for that GPU at the time; I remember having to force pixel doubling to run the game maxed out at 1280x1024 LOL!!
But you didn't keep stock speeds on your GTX 480. I got 17 fps minimum at 1024x768, without AA or AF, on my old 6800 Ultra (which has since died) and my old Pentium D 820 back in 2006, but I can't remember the average and max. I've already tested at 2560x1920 (see my post #715).
So you want me to retest with stock GPU clocks? Or both CPU and GPU at stock? Plus, I can only run 1080p, as that's my monitor's max resolution.