Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 11, 2016.
My skepticism levels are off the charts, and my pessimism as well.
But that's basically my point. The 280X has a gig more of RAM on it.
I went from a 690 to a 980, in the vast majority of games it was a sidegrade. Skyrim unmodded, same exact FPS. Throw a bunch of textures mods into it, and the 980 nearly doubled my 690.
It shouldn't surprise people that the 770/680 is falling behind. There were threads on this very forum about the 7970 being the better long-term buy due to the lack of VRAM on the Nvidia equivalents. Why is that Nvidia screwing customers? Why isn't it customers screwing themselves by not doing basic research?
I personally don't care, I'll go with whoever has the fastest card at the time I feel like upgrading. My last AMD card was a 4870X2, and it was actually a semi-decent experience. I'll gladly go back to them.
I just can't stand the overall negativity on this thread and Guru3D as of late. Nearly every single post is about x company screwing y customer. Bunch of debbie downers if you ask me.
It's worth it imo. IPS/G-Sync/165Hz/1440p is what you need. Gaming is now a pleasure. I kill more often, and games like Alan Wake (which I hadn't played before) look amazing.
I have the Acer XB270HU!
Like hell I'd go 1440p with a 970! No way.
If this game is indeed fully DX12, then that implies the game will support multiple different GPUs, including the iGPU on the CPU, yet no mention is made of this whatsoever. I see that the requirements call for at least a Haswell CPU and a 700-series Nvidia graphics card, so I guess we'll have to wait and see if this game is truly DX12 or if they are just cherry-picking a few features.
This game can be DX12 and not add multi-card support at all.
One thing is all the options available in the DX12 API; a VERY different thing is a game dev using them ALL in a game.
Let's wait for a serious game review; until then I will remain skeptical.
A 980Ti for Recommended!
Well, if you say so.
I mean, if DX12 is all that, I'd have thought we'd see a slowdown in inflated requirements for ports.
Will be interesting to see if modders find a DX11 workaround.
If the 'recommended' graphics card is a 980 Ti, then the developers are placing hardware demands beyond the means of most consumers, and therefore shooting themselves in the foot if they want to sell a lot of games. It would make sense, and be relatively simple, to code the game to utilize the iGPU for the post-processing and lighten the load on the GPU, allowing a more mainstream graphics card to handle high settings. Also, I'm assuming they're talking about driving a 1080p monitor, and a 980 Ti is way overkill for that resolution, even at 144Hz.
That's pushing it a bit unless you're talking about older games.
6GB of memory for 1080p? No way!
Didn't Dune 2000 already try live-action actors in its cutscenes back in 1998?
Or was it Emperor: Battle for Dune?
This game better look twice as good as The Witcher 3 / Assassin's Creed Unity / Rise of the Tomb Raider.
This game better look so good that it's literally a representation not of this generation but of the next. I better be able to see individual hair strands and sweat on each NPC.
When I play this game, I better see my 4930K at 100 percent CPU usage and my two 780 Tis consuming 1,100 watts of power at 100 percent GPU usage each, and it better show 30 fps at 1080p.
Only then, and only then, will this game be worthy of those system requirements.
But! If this game looks only as good as any other game that has come out, like The Division, with GameWorks baked in, with godly amounts of tessellation tessellating every single crevice of the game and every nose hair, and with TXAA 4x all over the place with no way to turn it off, then obviously this game is a crap port.
saving now for something later next holiday season then, gosh, the textures and AO :V
This. Exact same problem I had with the recently released Tomb Raider.
I'd like to know why nearly every game now demands the top-end cards, when back in the days of Crysis 1, 2, and 3 (though I felt 3 was pushing it), those games were like nothing else in the industry and looked like they deserved the system requirements they asked for.
So unless the PC version is, like you say, the next gen of the next gen, then I fail to see, from what little I've seen of the game (granted, it was most likely X1 footage), how it requires a 980 Ti.
don't think it's that simple mate
Well, sorry that not all of us were born in wealthy, high-income countries... a-hole.
Is this now a new hardware - old hardware, much available income - low income debate? :wanker:
I personally see, too, that not everybody can afford a 10-core CPU or a Fury X / 980 Ti multi-GPU setup. Yet those who can should be able to make use of it too. I would love to see games scale from older hardware up to the newest stuff and CFX / SLI! And maybe this game actually does help, since it supports DX12, probably getting a tad more performance out of older systems. (Up-to-date / enthusiast hardware won't see gains from DX12 as big as older hardware will.)
I am not, since it is published by MS.
Calm down guys. Requirements were updated.
AMD's async shaders (Hitman) versus the GameWorks DX12 edition (Quantum Break).
AMD needs to come out on top in Hitman's GFX performance.