Hi, I'm new to Guru3D; this is my first post. I recently upgraded my video card from an MSI GTX 580 OC to an HD 7990, and I was expecting a better experience in games like Crysis 3, which the old card couldn't fully handle. But surprisingly, my HD 7990 only averages around 40 fps on Very High + FXAA (1080p) in the grassy level ("Welcome to the Jungle"), and when the grass density increases the framerate drops dramatically to 22-30 fps. Is this normal for other 7990 owners? I seriously doubt it. I've tried different versions of the Catalyst driver (from 13.4 to 13.10 beta) with similar results. It's so frustrating! Other games like Sleeping Dogs and Tomb Raider (2013) run well, and my 3DMark 11 Extreme score is over 5760, so I believe the card itself is running normally.

My system specs:
i5 2500K @ 3.3 GHz (Turbo Boost up to 3.7)
Z68 board
16 GB DDR3-1600
OCZ 750 W PSU

I really hope someone can help me out! Thanks in advance!
Thank you for the reply! How do I enable CrossFire then? CCC shows that the primary and secondary adapters are already linked. Do I need to do anything in particular for this game, maybe some in-game settings?
Update, some findings: CrossFire is definitely enabled. However, when I run games like BF3 and Crysis 3, GPU usage on both cores sits around 50% with CrossFire on; when I manually disable CrossFire in CCC, the single GPU hits over 90%. That explains why it performs like an HD 7970. Now the question is, what's causing this? Is my CPU a bottleneck? Or is my OCZ 750 W PSU not good enough to support the card? Any thoughts?
I would benchmark the 7990 in CFX, then overclock the 2500K and benchmark it again in CFX. Compare the scores: a drastic change means a CPU bottleneck; no change means you can look for other causes. I don't trust that OCZ, though. How old is it and what model? Maybe it's on its last legs.
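The comparison logic above can be sketched as a few lines of Python. The scores and the 10% threshold are made-up illustrations, not real measurements; the point is just that a CFX score which scales with CPU clock implicates the CPU.

```python
# Rough sketch of the stock-vs-overclocked comparison described above.
# All numbers here are hypothetical examples.

def bottleneck_check(score_stock, score_oc, threshold=0.10):
    """Return True if overclocking the CPU improved the benchmark score
    by more than `threshold` (10% by default), which would suggest the
    CPU was the limiting factor in the stock run."""
    gain = (score_oc - score_stock) / score_stock
    return gain > threshold

# Example: stock 3.3 GHz run vs overclocked run (made-up scores)
print(bottleneck_check(5750, 6900))  # large gain -> likely CPU-limited
print(bottleneck_check(5750, 5820))  # tiny gain -> look elsewhere (PSU, drivers)
```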
The 3DMark 11 Extreme score is 5750 with the CPU at stock, which seems pretty normal. When running Heaven 4.0 and the Tomb Raider (2013) benchmark, both GPUs hit over 93% usage, which is what you'd want. I also noticed that CPU usage never reached 100% during those tests. The PSU is http://www.newegg.com/Product/Product.aspx?Item=N82E16817341041 — I bought it in May 2012 and it powered my GTX 580 fine. Things are getting strange...
The CPU is the main bottlenecking factor in BF3. Even one HD 7970 can max everything there, with the exception of AA, if your target is standard 1080p at 60 fps. I get 120 fps (not maxed; just a few things tuned down and no AA) on one HD 7970 at 1050/1500 MHz with utilization around 85-95%, so I see fps drops to around 100 on some maps. But at a lower CPU clock I had fps all over the place, because Turbo drags you between 3.3 and 3.7 GHz. It should be no problem to get all your CPU cores to 4 GHz; mine did that on the stock cooler while hitting 69°C. With proper cooling I'm at 4.6 GHz and BF3 flies.
Yeah, I agree, the CPU is bottlenecking. I've observed that when CPU usage hits 90-100%, both GPU usages drop from about 90% to 50%, sometimes even under 50%, in Crysis 3 and BF3. Clearly those games are very CPU-demanding... So I overclocked the CPU to 4.2 GHz; GPU usage got better, but the issue wasn't completely solved. Do you have a set of MSI BIOS parameters for overclocking a 2500K past 4.5 that I could take a look at? I noticed your system is quite similar to mine. Thank you so much, dude!
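The pattern described above (GPU usage collapsing whenever the CPU pegs) can be checked against logged utilization samples, e.g. exported from a monitoring tool to CSV. This is just a hedged sketch with made-up sample data; the 90%/70% cutoffs are arbitrary illustrations.

```python
# Flag moments where high CPU usage coincides with a GPU usage drop --
# the CPU-starvation pattern described above. Sample data is invented.

samples = [  # (cpu_pct, gpu0_pct, gpu1_pct)
    (70, 92, 91),
    (95, 55, 48),   # CPU pegged, both GPUs starved
    (60, 90, 93),
    (98, 44, 50),
]

cpu_bound = [s for s in samples if s[0] >= 90 and max(s[1], s[2]) < 70]
print(f"{len(cpu_bound)} of {len(samples)} samples look CPU-bound")
```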
If it's a pirated copy of Crysis 3 without the patches, you have to disable vsync and re-enable it for CrossFire to kick in. That's how the game was for me vanilla, when it first came out. Another reason I don't pirate games.
Or alt-tab out of the game and back in; same thing. It's a simple reset of display-related stuff that gets CrossFire to kick in in fullscreen with this game. It happens for me even with the game fully updated. I have it on Origin.
Every chipset/board is a bit different, and even the memory plays its role. For me, I can't change the base clock by even 0.1 MHz or it won't boot. You should always target your overclock to temperature: if you're hitting 75°C in Intel Burn Test, you shouldn't increase clocks any further, and I'd guess the system would crash at that point anyway. In IBT I get up to 72°C, but in any real-world heavy workload it stays below 64°C.

Code:
PLL Overvoltage - Enabled
V.Core - 1.365V
CPU I/O Voltage - 1.17V
VccSA - 1.125V
CPU PLL Voltage - 1.65V (default is 1.8V)
EIST - OFF
Enhanced Turbo - OFF
Memory X.M.P. - OFF
Spread Spectrum - Disabled
DRAM Voltage - 1.5655V
PCH 1.05 - 1.0555V
CPU Features:
  Power Technology - Custom
  C1E - Disabled
  Overspeed Protection - Disabled
  Intel C-State - Disabled

You can use all of these values as-is except "CPU PLL Voltage" and "V.Core" itself; those are quite CPU/chipset dependent. I overclock by multiplier, increasing it by 1 at a time while raising all voltages a bit. If it passes the stress tests, I then decrease the voltages one by one until I find which one has to stay higher at the higher clock. (Write everything down so you know the exact bottom voltage for each multiplier.) This way you can easily predict what voltage you'll need to reach the next step, and you can decide when to stop. Don't forget to write down the highest IBT temperatures at each step too. I'm running a Noctua NH-D14, so my CPU may be pretty stressed without me realizing it from temperature measurements alone. One more thing to note: my PC has a pretty good PSU, and the power goes through a UPS which provides filtering/stabilization too.

Side note: 4.2 GHz is OK; set 1T command rate for the memory instead of 2T. I'm not done testing at 4.6 GHz, which is why my signature still says 4.4 GHz — btw the optimal speed for my silicon considering the voltage required and heat generated versus performance delivered.
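The step-and-trim procedure above (raise the multiplier, then walk each voltage down to its minimum stable value and log it) can be sketched in code. The stability model below is a toy stand-in; on real hardware "stable" means passing your stress test at safe temperatures, and the function names and numbers here are hypothetical.

```python
# Toy sketch of the voltage-trimming loop described above.
# `is_stable` is a made-up model, not a real measurement.

def is_stable(multiplier, vcore):
    # Pretend each multiplier step above x40 needs ~30 mV more V.Core.
    return vcore >= 1.20 + 0.03 * (multiplier - 40)

def find_min_vcore(multiplier, start_mv=1400, step_mv=5):
    """Lower V.Core from a known-good value in 5 mV steps until the next
    step would be unstable; return the last stable voltage (in volts).
    Mirrors 'decrease one by one till I find which voltage has to stay'."""
    mv = start_mv
    while is_stable(multiplier, (mv - step_mv) / 1000):
        mv -= step_mv
    return mv / 1000

# Log the bottom voltage for each multiplier, as the post suggests.
for mult in (42, 44, 46):
    print(f"x{mult}: min stable V.Core ~ {find_min_vcore(mult)} V")
```

Working in integer millivolts avoids floating-point drift in the loop; the same bookkeeping (one table row per multiplier, with voltage and peak IBT temperature) is what the post recommends keeping by hand.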
Thank you so much, bro! I'm gonna try it! Apparently overclocking works; I'm gonna see how far I can push it.