Discussion in 'Games, Gaming & Game-demos' started by ALien8ed, May 7, 2013.
Wait for the bargain bin; it's heading that way.
Well, for starters, try smaller texture maps: instead of 16k and 8k, use 8k for everything, or even 4k.
Also use CUDA transcode; it should run fine with 256 PPF and CUDA enabled.
And last but not least, jobthreads = 4 so it uses more CPU powah.
With a custom ini, set VT compression OFF.
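Pulled together, the tweaks above would go into a custom cfg roughly like this. Note the cvar names (vt_pageimagesizeunique*, vt_usecudatranscode, vt_maxPPF, jobs_numthreads, vt_vmtrcompression) are the RAGE-era idTech 5 ones and may differ in Wolfenstein, so treat this as a sketch, not gospel:

```
// drop page image sizes from 16k/8k down to 8k (or even 4096)
vt_pageimagesizeunique 8192
vt_pageimagesizeuniquediffuseonly 8192
// enable CUDA transcoding and raise the pages-per-frame budget
vt_usecudatranscode 2
vt_maxPPF 256
// spread transcode work across more CPU threads
jobs_numthreads 4
// virtual-texture compression off (exact cvar/value uncertain)
vt_vmtrcompression none
```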
I don't know if it works?
I figured out the grain issue. Sometimes when you move to a new area, the area's lighting sorta 'loads', and that's where you get the grain. It seems my 760 and 560 Ti have trouble 'loading' some areas and I get left with grain.
Also, on the 560 Ti I tried 512 for diffuse only and a 4096 page size for VT, and the game still jitters like a broken tape, especially in the second level where the old guy tells me to kill everyone.
How do I tell if CUDA transcode is REALLY working? Will 256 PPF be fine on both AMD and my i5-4670K? 256 sounds like overkill.
Same problems as with Rage.
Set SSAA under the Wolf profile in Nvidia Inspector or the Nvidia control panel. Start the game, open the console, and type r_multisamples 4. If you set 4x SSAA, the game runs with 4x SSAA + MSAA. 2x SSAA doesn't look much better, and 8x is a performance killer. These settings don't look good in some areas, and post-processing sometimes breaks them...
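In other words, the layering described above is driver-side SSAA plus the in-game multisample cvar. A sketch of the combination (Inspector setting names from memory, so double-check them in your driver version):

```
// Nvidia Inspector / control panel, Wolfenstein profile:
//   Antialiasing - Mode:    Override any application setting
//   Antialiasing - Setting: 4x Supersampling
// then in the in-game console:
r_multisamples 4
```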
For anyone with a low-end PC: type "wolfenstein voksi" into Google and use the fix (min 1GB GPU). It works and doesn't have a virus, but it's not exactly legitimate. First time I've used something like that, and it really works: from 20-30fps to a stable 60fps.
Just gotta keep in mind this game engine is heavily dependent on CUDA transcoding, which for some odd reason isn't enabled by default, and the menu option for it is hidden (unlike RAGE).
Without transcoding on, expect stuttering, texture pop-in, and constant frame-rate issues.
Once you enable transcoding in the game console and confirm it's running correctly, all texture pop-in and frame-rate inconsistency should go away. Read back a few pages for how to enable it with the in-game console command.
And once again, I can't repeat this enough... MAKE SURE your PPF setting is not too low or too high for your CPU. Intel i5s shouldn't go any higher than 16 or 32, mainstream Intel i7s shouldn't go any higher than 32 or 64, and Extreme Editions and Xeons with 6 cores or more should be at 64 to 128. I've never seen any benefit to setting it higher than that.
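As a rough cfg sketch of those PPF brackets (assuming the cvar is vt_maxPPF, as it was in RAGE; verify the name in your version):

```
// pages-per-frame budget by CPU class:
//   quad-core i5:            16-32
//   mainstream i7 (4c/8t):   32-64
//   6+ core EE / Xeon:       64-128
vt_maxPPF 32    // example value for an i5
```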
And for immediate texture pop-in relief (though it may not reduce all of it like transcoding should), make absolutely sure your Wolfenstein install has created a texture swap cache in "Users\(user profile)\AppData\Local\Machine Games\Wolfenstein The New Order", and preferably keep these cache files on an SSD, not an HDD. If you don't see them, or are unsure, read up... it's explained several times.
Also, SLI is a big no-no. I know I sound like a broken record, but because of the transcoding features, SLI can and will never work. If you have an SLI machine, COMPLETELY disable it through your Nvidia control panel before starting this game. Don't bother trying to disable it and set the second GPU as a dedicated CUDA transcoder; that feature is now broken too (unlike how it was originally with RAGE). If you leave SLI enabled, with or without transcoding, expect a lot of stuttering, hangs, and other anomalies.
Another big thing is the drivers. Tweaking anything in the Nvidia drivers (such as trying to force AA, adaptive vsync, whatever) will likely cause issues. Not for all, but for most. Leave the driver settings at default whenever possible. I've tried a couple of different driver sets, from the last WHQL, to the current beta, to the leaked x.81 international beta. If I set anything outside of default, I get stuttering and hangs.
Lastly, make sure you add a "com_videoram" line to your Wolfenstein config file, set correctly for your VRAM size. Especially if you're trying to run with everything maxed, or tweaking texture size to 16k. This is all explained over the last several pages of this thread.
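For example, com_videoram takes your card's VRAM in megabytes; a sketch for a 3GB card (adjust to your own card):

```
// 3GB card = 3072MB; use 2048 for a 2GB card, 4096 for 4GB, etc.
com_videoram 3072
```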
The engine is very highly optimized for Nvidia cards with CUDA transcoding in mind, so it's no wonder AMD users are having issues.
As far as custom texture resolutions up to 16k, other CUDA settings, and core/threading config adjustments... you can try screwing with those, but I have not, and whether they make a difference or make things worse, I don't know.
Following my approach above, I am running this game on a single 780 Ti with 3GB of VRAM (I disabled SLI), EVERYTHING maxed out in the game options menu (with vsync off), at 2560x1600, and not getting a single frame-rate drop, stutter, or texture pop-in. It runs smooth as butter, and looks fantastic.
I do know that before I made these very simple changes, the game was crashing, stuttering, and had texture pop-in like mad.
This is weird. What exactly does Voksi fix? He should have documented his 'fix' better.
Even with CUDA, my 560 Ti in the living room has a violent jittering like a broken tape (yeah, I've said this too many times) in the second level.
CUDA is probably working fine on my 760, but it's definitely not helping on my 560 Ti. Now the next option is to either bring my PC into the living room or use Steam In-Home Streaming. I really wanted this to work on my 560 Ti at 60fps, seeing as they recommend a 460.
From what I can tell, I can't say for sure, but I can clearly see why it was deleted on the Steam forums: since it includes a modified steam_api64.dll file, it's likely cracked, though using just the exe should work on legit installs.
I'm about to give it a try myself to see what exactly it does. The main thread on the Steam forums was, as I said, deleted, so some information is missing, but maybe it'll actually work without totally killing the visual quality, ha ha.
I don't know really; all I know is my game is now playable.
Video here playing maxed out at 1440p, running one GPU at 1306MHz, fully stable. The game is completely playable without any tweaks whatsoever.
Hmmm, game is constantly crashing on my system, what drivers are you guys using?
Same here; no crashes.
337.82 is smoother.
Do you put the commands in game (and do they save), or in the config file? If the latter, do you make a new .cfg or edit the old one? Might have been answered, but I couldn't find it.
I couldn't get them to save, so I bind them to keys, for instance:
bind k r_multisamples 4
Then I just press K to activate it. The binds will save for the next time you play, too.
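Building on the bind above, you can also bind a second key to switch the setting back off (quoting the command with its argument, which idTech consoles generally accept; keys here are just examples):

```
bind k "r_multisamples 4"    // press K for 4x multisampling
bind l "r_multisamples 0"    // press L to turn it back off
```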
Seems stable now after changing to 337.81.
Does Triple Buffering work with those drivers?
Also, to answer your question from a few posts ago: I've got the 1.5GB model of the GTX 580. I'm actually really tempted to go pick up two secondhand 3GB 580s and go SLI. I've been so impressed with these 580s that they're the new 8800 to me.
Any tips on overclocking these things? The 1.5GB model, that is.