Discussion in 'Games, Gaming & Game-demos' started by Stone Gargoyle, Apr 30, 2014.
Wow! Looks promising.
Looks like it'll be quite good. Or... at least better than Ghosts. Hopefully they decide to use an EXP based weapon unlocking system, instead of the sh*tty currency system.
Also nice to have an idea of exactly how far in the future they set this up for.... 2054.
Crysis2 meets TitanFall.. Pass.
Gfx doesn't look any better than Ghosts (same model detail, reflections, shading). Imo the only thing that makes it "look better" is the DOF abuse.
Yeah, the trailer looks nice... but DOF is horribly overdone. And the amount of scripted "lets move the camera around" bits were annoying.
Also, while it is hard to tell from simply watching a video, the gunplay looked incredibly weak. Feedback on the weapon looked completely non-existent.
And enemy AI was pathetic.
According to the dev COD Advanced Warfare will be identical across all platforms.
Does that mean no keyboard & mouse support?
They'll have to make a choice. The Treyarch way, meaning good PC support and options, or the IW way -> no options and horrible performance.
I'm expecting horrible optimization and zero polish on PC. The COD IP is dead on PC and Activision isn't interested in rekindling the fire.
They will do as little as they can get away with for the PC version.
They're really focused on the better selling skus.
Well, the Treyarch games were fine. I'm still expecting something nice from them.
I've no reason to trust them either. I'm also worried about anti-cheat in the MP. BO2 was awesome but Ghosts had zero protection, not even VAC. CBA with such BS on launch day.
Many boxes have to be ticked in order to make the game fun on PC. Good controls, good optimization, good backend (servers?) and good anti-cheat protection.
Ghosts couldn't tick a single one of them.
Some screens. No idea how the game will look at launch, but I like the shots we've seen so far, quite an upgrade.
(These ones are - obviously? - supersampled though.)
Nice bullshots. In game first person view screenshots without absurd levels of myopia-aka-dof plox.
Finally, the world can understand the plight of my fellow myopia sufferers.
Drawing great graphics then blurring them, makes sense... Lol.
No doubt the DOF on consoles is there to have okay graphics up close while the blur covers up disgustingly poor draw distances. On PC, no doubt they'll spend a whole lot of processing power on long draw distances and then blur them anyway, just so you need a good computer to run it. Makes it look 'more impressive'.
Having it look the same across all platforms is actually quite stupid; it probably takes more development time to achieve. They could at least make ultra-quality textures and better shaders for the PC version. In a couple of years, when the next 'next gen' consoles come out, they could simply carry over the shaders and textures, do a little bit of tweaking, and release it as a remastered version!
Really hope the DOF fad is short-lived. It's really quite sh!t during gameplay and hurts your eyes; it only suits screenshots. Your eyes instinctively try to focus on the blur, which they can't. It's not the same as focusing your eyes normally, because the screen is flat and because of how the left and right eyes treat the image.
It's not inconceivable that people who play these DOF-heavy games for extended periods may end up needing glasses. Lawsuits: I bet these idiots never considered that possibility.
I don't think this is a financially reasonable proposition. Higher quality assets are very expensive to make and only a tiny portion of the PC market could recoup this investment.
It's not worth it right now.
I reckon COD Advanced Warfare will be demanding enough as it is even though it will have to run at 50-60fps on consoles.
I meant higher quality within reason. Building it in from the beginning and scaling down as necessary is much better than trying to make it look better as an afterthought.
You are absolutely right about it being demanding enough on PC, that is exactly what I was talking about. It would 'look bad' if the requirement specs were low, and they can get around that with poor optimisation, simply by changing how much processing time goes into the draw distance before blurring it. On consoles, the distant parts would be very low quality before the blur, and you wouldn't notice the difference.
I'm not sure about that at all. Consoles are the main market, so it makes sense to cater the tech to them first and foremost, then add a couple of extras for the PC version if resources allow.
Again, I don't know why you are focusing so much on DOF, as it is absolutely not representative of the complexity of scaling visuals and art budget across so many hardware configurations. It is just one effect; it has nothing to do with optimization on a wider scale. Do you genuinely believe "blurring" is the answer?
Everything starts with consoles: they build assets and effects so 50-60fps can be sustained at 1080p or lower. This is their lead platform because the COD audience is there. The art is tailored around those machines, the R&D is done on PS4/Xbox One, and the game design is made with a controller in mind.
How the final game looks on our platform depends on the amount of resources Activision is willing to allocate to the PC. Given that COD's relevance there has been dwindling steadily for a long time, I don't think they will bother at all.
Same assets, same effects, same everything across PC and consoles.
I will be shocked if they even try to optimize the game for PC.
Folks looking forward to it will have to rely on sheer brute force.
That's why videocardz' rumour that Advanced Warfare will support AMD's Mantle is incredibly hard to take seriously. Why would they bother with something only the PC can take advantage of, and only a minority of PCs at that?
I know what you're saying, but if the game does look the same across the new consoles and PC, then the hardware requirements for PC should be quite low. This is regardless of whether the consoles have better API efficiencies (when compared to DirectX). Of course, I'm not talking low as in 32-bit and 1 GB of RAM or anything, I just mean you shouldn't need a high end card and latest i7 CPU to play it on the highest settings.
OH WAIT, settings? Why would there be settings? If it's the same across all three platforms, settings would be a moot point. What, the PC version looks the same when everything is maxed out, and yet when you lower it to, say, 'high', the graphics are worse than consoles? So basically, if you need a highish-end PC to play on the highest setting, everyone else will have worse graphics than consoles. That's assuming there will even be options for graphics quality, apart from resolution settings.
What I mean about the blur is just that: it's typically applied as a post-processing effect, and most likely will be for this game as well. If you blur the distance you wipe out basically all detail, aliasing, etc., so you could have what would look like disgustingly crap graphics in the blurred area, but once the blur is applied you wouldn't know the difference. If they wanted to 'harmonise' the performance of the PC version, they could simply keep high-quality distant assets before blurring, requiring a higher-end PC, and then vindicate themselves as to why there was no point in making the PC version look better.
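Since the thread keeps circling back to it: "blurring the distance" really is just a post-process blend driven by per-pixel depth. A toy single-channel sketch in Python/NumPy (purely illustrative; `box_blur`, `depth_of_field`, and the `falloff` constant are made up here, and real engines do this in a shader with far fancier kernels):

```python
import numpy as np

def box_blur(img, radius=2):
    """Naive box blur: average each pixel over a (2r+1)x(2r+1) window (wrapping edges)."""
    out = np.zeros_like(img, dtype=float)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return out / (2 * radius + 1) ** 2

def depth_of_field(color, depth, focus, falloff=0.5):
    """Blend sharp and blurred images by per-pixel distance from the focal plane."""
    blurred = box_blur(color)
    # blur weight grows with |depth - focus|, clamped to [0, 1]
    weight = np.clip(np.abs(depth - focus) * falloff, 0.0, 1.0)
    return color * (1.0 - weight) + blurred * weight

# Toy 1-channel "frame": a checkerboard with a left-to-right depth ramp.
color = np.indices((8, 8)).sum(axis=0) % 2 * 1.0
depth = np.tile(np.linspace(0.0, 1.0, 8), (8, 1))  # depth increases left -> right
frame = depth_of_field(color, depth, focus=0.0)    # focus on the near (left) edge
```

The in-focus column comes through untouched, while distant columns get pulled toward the local average, which is exactly why low-quality distant assets can hide under the blur.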
Just look at Watch Dogs: their explanation was just a poor excuse, and people like Luneyah have shown this. The assets were already in the game; they deliberately crippled it. Unfortunately, future games on that engine will most likely have all 'PC betterment' features stripped.
Actually, I'm one of the guys who likes depth of field; I've always kept it enabled while playing.
In photography it's very popular and "hyped" at the moment, as it helps make the main subject stand out against the background. The more blur, the shallower the area of sharpness, the better. At least so it seems!
It's also great for 3D as it helps getting the focus (of the player, the audience whatever) right where you intended it.
Thing is -> the focal point should always be sharp, no matter the distance. That's one main point for me about Call of Duty: it has always remained sharp at all distances. They should definitely keep it that way.
We are in agreement: COD is supposed to hold a somewhat stable 60fps on Xbox One and PS4, so achieving parity with those consoles should not be much of a problem assuming you pack a solid mid-range machine. Requirements will be low; they have to be. If that's not the case, then the only logical assumption is that the PC build packs some extras or isn't optimized at all. To be more specific, I believe an i5 3570 paired with a GTX 760 should be able to get the job done at 1080p.
Never, ever forget that consoles have the luxury of a significantly leaner and more flexible API. Features like async compute aren't exposed in D3D, among other things. My point has already been made countless times: to achieve parity with a console you need a bit more than paper-spec equivalence.
You seem slightly confused when there is no reason to be.
Yes, settings make perfect sense when comparing PC and consoles. Unless Sledgehammer is playing tricks on us, AW on consoles will be the equivalent of maximum PC settings. But as you can guess, max settings in this case will never require cutting-edge hardware at 1080p. 4K is another story.
The Witcher 3 on PS4, for instance, will be equivalent to the PC's high setting. To get back to AW, I assume the game will scale much, much lower than console quality to accommodate the overwhelming majority of PC gamers who pack low-end hardware.
They don't need to embarrass themselves by making up technical excuses as to why the PC build looks identical to consoles: this all boils down to resource management and budget. I have no idea how you could construct such a scenario; in my opinion you are building a case for something that simply isn't there.
I disagree wholeheartedly. Ubisoft didn't purposefully "downgrade" Watch Dogs for PC. There is no conspiracy to cast consoles in a better light or anything of the sort. I have not been impressed at all with TheWorse's mods, and the effects look like poo. That's the reason why Ubisoft didn't bother to push the PC version further: they believed they had already achieved pretty good visuals, and higher-quality assets are way too expensive to make. There is always the possibility of polishing visuals further, but every sku is limited by budget. The few people who believe in conspiracy theories assume a PC version is downgraded by design because, had more resources been allocated, the graphics could have been pushed a lot further.
The PC version suffered as much as the PS4/XBO versions from the last-generation of consoles' focus.
You can always push graphics further, but why should they? PC isn't entitled to better visuals. We are merely entitled to the game, period.
The deciding factor is budget and nothing else. PC versions sell a lot less and therefore take a backseat.
I also don't get your last sentence. Why would the PC extras be stripped out? Ubisoft have no reason to do that if they don't go over budget.
If the features are too expensive to develop, they will do what they did with Watch Dogs: hide them in config files for people to mess with. But there is no malicious intent behind any of this.