Discussion in 'Games, Gaming & Game-demos' started by deathfrag, Apr 19, 2013.
So I'm guessing no SLI/CrossFire support again due to id Tech 5?
This game looks terrible, graphically. Why would someone willingly use the id Tech 5 engine? Goodness... 4GB of VRAM? For blocky, low-res crap ported over from the 360/PS3? That's all this is. Same goes for PS4/XB1 - the base of the game comes from the 360/PS3. It's another third-party game trying to push a last-gen game onto next-gen to make as much $$ as possible without using any of the power next-gen and/or PC have.
Another POS unoptimized game. The gameplay could be awesome, but the optimization is probably worse than Watch Dogs... well, if so, I hope modders will do it justice.
2.35:1 aspect ratio, film grain like I'm watching a movie with too much mosquito noise, and a 4GB VRAM requirement for crappy textures. Thanks, but no thanks.
Didn't Wolfenstein also have a recommended 4GB limit?
I bet this game will play fine on a 2GB card with VT compression on, or on a 3GB card with VT compression off.
The difference between VT compression on and off was very minimal if you used the extra higher-mipmap commands and extra LOD sharpening.
I couldn't notice any difference at all between VT Compress Enabled/Disabled.
Love how everyone is so quick to trash this game.
Personally, I don't care, the game looks like a blast gameplay wise... and a true return to survival horror from the guy that created the genre.
I can't wait to play it.
who's trashing it?
I was being sarcastic about a few of the posts above mine...
"Another pos unoptimized game" and "This game looks terrible, graphically. Why would someone willingly use the tech5 engine. Goodness...."
And it seems they are judging the game based off screenshots from when the game was first announced in April 2013. Let's ignore that the game looks WAY better now than it did then.
This hour long stream from Bethesda looks awesome... and the game itself looks great.
^ Some people are just annoyed by the requirements, and at first glance the game doesn't look like it's showing 4GB worth of textures and such, but we can't know till the game comes out since YouTube is crap for judging texture quality. I personally can't ****ing wait to play it; I have pretty high expectations for EW. I always put graphics at number 2, but I get mad when a game is badly coded like WD or DR3 etc. and doesn't justify the performance hit.
Actually, a CryEngine expert was working on id Tech 5 for better lighting and shadows.
The game is awesome looking, but one thing is known to jeopardize the gameplay: long level load times.
I don't believe for a second it actually needs an i7 CPU or a GPU with 4GB of VRAM.
The whole thing just stinks of stupid bloated system requirements.
It's the same thing Bethesda did with Wolf: TNO. Stating that game needed an i7 CPU or higher, yet people ran it on i3 and lower CPUs with zero issues and great performance.
They also bloated up the CPU and GPU requirements for Dishonored, and yet the game runs great on systems well below those requirements.
Wolf is indeed heavy on the CPU btw, just sayin'; it heated mine up real quick at 4.9GHz, that's for sure. With uncompressed textures and 64 pages per texture, you usually see 3GB+ of VRAM use, and even up to 3.8GB.
One of the best looking games out imo, like all the other heavy-VRAM games.
Should I copy-paste my Wolfenstein VRAM settings again, with 64 pages and VT compression off?
Nope, it didn't use the whole 3050MB; actually 2.8GB max.
No temp issue either, and this is at 4.7GHz on all 8 threads.
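For anyone wondering what those settings actually look like: in id Tech 5 games they're usually set via console variables in a config file or launch options. A rough sketch below, with cvar names taken from common Rage/Wolfenstein tweak guides - treat the exact names and values as assumptions and verify them against your game's build before using:

```
// Hypothetical id Tech 5 config snippet (names from community tweak
// guides for Rage / Wolfenstein: TNO - may differ per game/build)
vt_maxPPF 64                     // virtual-texture pages transcoded per frame ("64 pages")
vt_pageImageSizeUnique 8192      // larger physical page cache (uses more VRAM)
vt_uncompressedPhysicalImages 1  // VT compression off: keep pages uncompressed
```

Either paste these into the in-game console, add them to a config the game executes, or pass them as launch options (e.g. `+vt_maxPPF 64`). Higher page counts and uncompressed pages trade VRAM for texture sharpness and less pop-in, which matches the 2.8GB+ usage reported above.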
I run Wolf at basically the same settings, only difference is I have VT Compress Enabled.
The difference in visual quality is so minor it's not even worth looking for. Yet, it drastically cuts down on the amount of vram needed.
Doesn't heat up my CPU either, and my CPU is getting up there age wise. Yet, it runs great.
So, seems Bethesda do listen to people...
The Minimum Requirements for The Evil Within...
Anyone paying attention will notice the only difference is the GPU requirement.
But, it's nice that Bethesda finally put something out there.
But Bethesda did post on the forums, showing that the game will have an in-game AA option, tessellation, and SSAO. So, that should make some people happy.
Personally, I'm happy they are including a 'Motion Blur' option. I'll be turning that off first thing.
^^Good on them. :thumbup:
Cheers for the update
Tessellation in an OpenGL game?
Tessellation is not a new technology; we had it as far back as DirectX 8 and early OpenGL.
Plenty of old games have it too; good examples are all the Quake games, Return to Castle Wolfenstein, Serious Sam, Neverwinter Nights 1, and the list goes on.