Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Apr 14, 2014.
SHHHH! (Don't tell him that's AA.)
Bring it on, my GTX 780s in 3-way SLI @ 2560 x 1600 are going to love it.
By that time Nvidia will have released their dual-GPU 790 card or the new Maxwell GTX 800 series.
I refuse to play this game without AA, and I only want to use MSAA; it's the best AA in every way.
4x is still OK for me. Hopefully my 690 will pull that off at ultra detail @ 1920x1080 on a 120 Hz monitor; at 70-80 fps I won't complain.
Otherwise I'll just buy a single dual-GPU card that runs this game maxed out, like a 790.
They have released a dual card, the Titan Z.
I'll just wait for a dual 780 Ti.
The Titan Z is a dual 780 Ti.
I dunno if we'll see a 790.
Well, when I got my first 780 Ti, I tried to enable ubersampling in The Witcher 2. To be honest, my fps still sucked. Maybe it's the same ubersampling thing again :/ ?
True, but the Titan Z is more a workstation card than a gaming card; it's more like the god of the Nvidia Quadro family.
I'm still waiting for a proper dual-GPU card like the GTX 690 I have now.
And not with a price tag of 3000 euros; that's just insane.
Just 1000 or 1100 euros, and hopefully they name it the GTX 795 or 790 Ultra or whatever.
So is this article only concerned with Nvidia, or does this site only focus on the green team?
Nice, then I'll avoid this Nvidia game and stick to games made with both vendors in mind, or AMD only.
At least I have more VRAM than a 780 Ti.
Good luck with that! I think the 295X2 is the only thing that will fit that bill.
I think Nvidia spends additional cash on game development, which ends up in their products' cost of production, hence more expensive hardware.
While you enjoy similar/same performance for $100-150 less, at least let us enjoy our $150 worth of propaganda.
I agree, MSAA (4x) and CSAA (8x) are the only ones I use. No blurriness to textures with those. FXAA and TXAA just make everything blurry and decrease the sharpness of the image and textures.
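For anyone wondering why the two families behave so differently: post-process AA (FXAA-style) filters the finished frame, so it softens texture detail along with the edges, while supersampling-family AA averages extra samples and mainly smooths geometry edges. A toy 1-D sketch in Python (purely hypothetical illustration, not any engine's actual code):

```python
# Toy 1-D sketch of two AA families (hypothetical, not engine code):
# - blur3:           FXAA-style post-process filter over the final image
# - supersample_2x:  SSAA-style render-at-2x-then-average

def blur3(row):
    """FXAA-style stand-in: 3-tap box blur over the finished frame."""
    out = []
    for i in range(len(row)):
        window = row[max(0, i - 1): i + 2]
        out.append(sum(window) / len(window))
    return out

def supersample_2x(render, width):
    """SSAA-style: render at twice the resolution, average sample pairs."""
    hi = render(width * 2)
    return [(hi[2 * i] + hi[2 * i + 1]) / 2 for i in range(width)]

def render_edge(width):
    """A hard black/white edge that falls between two final pixels."""
    return [0 if x < (width * 7) // 16 else 255 for x in range(width)]

texture = [0, 255] * 4                  # high-frequency texture detail
print(blur3(texture))                   # contrast collapses everywhere
print(supersample_2x(render_edge, 8))   # only the edge pixel turns grey
```

The blurred texture loses most of its contrast across the whole row, while the supersampled edge only gains a single grey pixel at the boundary, which matches the "FXAA makes textures blurry, MSAA doesn't" experience above.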
No, thanks. Can't they create games that run with enjoyable performance instead? If I want slideshows, I'll look at Google Images.
Game devs seem to miss the point of high-end hardware. It's not there in order to raise a game's performance slightly above "barely acceptable." It's there to get enough performance for your 144Hz monitor, your multi-monitor "eyefinity" or whatever setup, etc.
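The 144 Hz point is simple arithmetic: the per-frame time budget is 1000 divided by the refresh rate in Hz, and it shrinks fast (a quick sketch):

```python
# Per-frame time budget at common refresh rates: budget_ms = 1000 / hz.
# At 144 Hz a frame has to be finished in under 7 ms.
def frame_budget_ms(hz):
    return 1000 / hz

for hz in (30, 60, 120, 144):
    print(f"{hz:>3} Hz -> {frame_budget_ms(hz):.2f} ms per frame")
```

So a game that "barely" holds 30 fps is using a 33 ms budget; hitting 144 Hz means doing the same work in roughly a fifth of the time.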
And why are people saying "this game has the best graphics ever?" If the graphics run like crap, guess what, they're bad. Image quality is only one aspect of good graphics. The other is fluidity. If either of those is crap, then the overall result is crap too. Image quality without fluidity is only important for static images.
Unless they actually intend to make sales 4 years from now, since that's the timeframe for the hardware to catch up to this stuff. But by then, the game won't be profitable anymore to begin with.
The Witcher and Crysis - wasting otherwise good PC firepower on unoptimized crap.
You can run SLI AA for games...
I too think AA makes things a bit blurry at times. If I enable it, I only keep it at 2x or 4x. The performance-versus-visuals trade-off of AA isn't worth it to me, so I've been getting away with mid-range GPUs for a while. Besides, it's pretty easy to get used to no AA at all. There are plenty of old games you can play today and cringe at how ugly they are, but after a couple of hours you quickly learn to accept the look. It's almost as though your brain fills in the gaps in the details.
I agree with one of the earlier posters though - it would be nice to see a piece of hardware dedicated to processing AA, and maybe the smoothing of shadows too (as far as I'm aware, it's a similar calculation).
Nobody is forcing you to run the game at max settings; you can mess with the options and get the visuals/performance however you want. It's not like the game will be an ugly mess if it's not maxed out!