Discussion in 'Games, Gaming & Game-demos' started by sAAdC, Jan 10, 2013.
I hear rumours on the internets LoL
hahahahahah! classic george. classic george.
Consoles actually aren't at fault here.
If they really were constrained by console performance vs. PC, there's a simple solution:
release for consoles first and for PC six months later, or vice versa. Then graphics no longer matter between the two platforms.
I watched 90 minutes of that IGN live stream of the PC version last night on YouTube and I think the game looks absolutely phenomenal. The cutscenes are extremely well done with expressive characters - the eyes especially look amazing - often the humans look uncannily real. Voice-acting and dialogue choices seem excellent too. The stream mostly avoided spoilers but the combat looks like a lot of fun and I really cannot wait to explore this huge world that CDPR have created (which many reviews are saying is one of the best to date). I really love games like this and easily see myself ploughing (ha!) 100+ hours into it like Skyrim and Dragon Age Inquisition.
Hoping The Games Collection send my PS4 copy out today as they have done before for Tuesday releases then I can play it over the weekend. Fingers crossed.
Have you guys seen this?
From NVIDIA: http://www.geforce.com/whats-new/articles/the-witcher-3-wild-hunt-is-your-system-ready
1920x1080, Low settings GTX 960
1920x1080, Medium settings GTX 960
1920x1080, High settings GTX 960
1920x1080, Uber settings GTX 970
1920x1080, Uber settings w/ GameWorks GTX 980
2560x1440, Uber settings GTX 980
2560x1440, Uber settings w/ GameWorks GTX TITAN X, or 2-Way SLI GTX 970
3840x2160, Uber settings GTX TITAN X, or 2-Way SLI GTX 980
3840x2160, Uber settings w/ GameWorks 2-Way SLI GTX 980 or GTX TITAN X
So does that mean there is no scaling below a GTX 960? I.e. low/medium/high all need the same card?
Don't know if this has been mentioned yet, but apparently CDPR has been running the PC "ultra" footage with HBAO+, AA, light shafts, sharpening, CA all turned OFF the entire time. In fact, I doubt we've ever really seen footage of the game truly maxed out.
Substituting SSAO for HBAO+ is a massive downgrade in and of itself. Ambient occlusion is a subtle technology, but its impact on overall scene quality can be huge, and HBAO+ is the crème de la crème of ambient occlusion tech for games.
So one thing we can be assured of is that foliage should look much better when we actually get to play the game ourselves, assuming your hardware can run it with HBAO+ enabled.
All of those make very little difference to the image, really. In fact, I imagine a lot of people will turn off sharpening and CA for image-quality reasons, and turn off HBAO+ and light shafts to save performance, since on vs. off they aren't going to make much of a difference.
Not really a big deal imo. The fact that they've called Very High "Uber" says it all. Uber is reserved for settings beyond Ultra.
Meh, I can't recall a single triple-A game in recent years that didn't receive a downgrade prior to launch.
It pisses me off to no end that developers use false advertising to promote their products. There really should be a law forbidding this to protect consumer rights. I don't see any other industry where this is legal practice.
HBAO+ makes a huge difference compared to SSAO, especially when it comes to foliage. Check this out:
Far Cry 4 SSAO
Far Cry 4 HBAO+
As you can see, HBAO+ makes a world of difference when it comes to giving the foliage depth and form. SSAO is absolutely terrible in comparison!
Man, I just preloaded The Witcher 3 using GOG Galaxy and I was pleasantly surprised at how fast it was. Maxed out my connection at 15 MB/s.
It really doesn't happen that often, so I think you're exaggerating.
I don't think you should regulate disappointment, there is no crime here.
Having just hit 92 hours in DA:I myself, I know precisely where you're coming from. Is there really anything better than a game that just keeps on giving? I don't even feel like I've scratched the surface yet, which is quite something in itself.
I watched it too, and one thing I can say for certain is that W3 proves beyond a shadow of a doubt that Ubisoft haven't improved their terrible animation system for Syndicate, which hasn't been any good since ACIII. The animations alone in W3 are proof of the game's quality, never mind a million other things that stand out more than in any other game I've seen to date.
The new AMD PC build I bought for my son only a week ago gets set up the very day before this releases; it's his 18th birthday present, and you can bet your arse The Witcher 3 will be the first game run on it for him. Right alongside me playing it on my own machine.
http://www.gamespot.com/videos/the-witcher-3-wild-hunt-xbox-one-now-playing/2300-6424837/ about 11:15 XBO
https://www.youtube.com/watch?v=6mDR3osVfQ4 about 4:20 PS4
Hang on guys, explain something to me here.
Isn't the very first sentence of that GeForce.com article exactly the reason (in fact, the reality) that CDPR, and most companies lately, have engaged in this downgrade business: to keep people from choosing the PC version over the consoles?
The Witcher 3: Wild Hunt’s May 18th release is tantalizingly close. On PC, the experience will be enhanced with higher-resolution textures, higher-quality effects, higher levels of detail, uncapped framerates, and many other enhancements.
Is GeForce.com putting this out in spite of all the efforts by CDPR and presumably their deal with M$ and Sony? Does this simple statement not fly in the face of what they are trying to keep "on the down-low"?
It's an Nvidia-sponsored game; they always do that.
I realize that. I was just wondering if the features they are talking about are things missing from the gameplay footage of the last few weeks.
I mean, when Ubisoft tells you a couple minutes of footage is from PC, can you trust it? And is it because they are trying to temper the reaction from potential XBO and PS4 customers? Is that any more believable than not admitting to a downgrade?
Guys, ambient occlusion not only affects the quality of shadows, but the quality of lighting in a scene as well by correctly accounting for the occlusion of light sources.
Here's a great example using Far Cry 4 again:
Far Cry 4 SSAO
Far Cry 4 HBAO+
The lighting in the HBAO+ pic gives a much more accurate and realistic depiction of the scene by properly accounting for the occlusion of light sources.
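For anyone curious what's going on under the hood, here's a minimal sketch of the idea being described, in Python rather than shader code and with purely illustrative names (this is not any engine's actual implementation): screen-space AO estimates, per pixel, how blocked-in a point is by its neighbours, and the resulting factor darkens only the ambient lighting term, which is why it reads as soft contact shading rather than hard direct shadows.

```python
def ambient_occlusion(depth, x, y, radius=1):
    """Crude screen-space AO estimate: the fraction of neighbouring pixels
    that are closer to the camera (smaller depth) than the centre sample,
    i.e. likely occluders. Returns 1.0 for a fully open point, 0.0 for a
    fully occluded one. Real SSAO/HBAO+ sample the depth buffer far more
    cleverly, but the principle is the same."""
    h, w = len(depth), len(depth[0])
    occluders = samples = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            nx, ny = x + dx, y + dy
            if 0 <= nx < w and 0 <= ny < h:
                samples += 1
                if depth[ny][nx] < depth[y][x]:
                    occluders += 1
    return 1.0 - occluders / samples if samples else 1.0

def shade(albedo, ambient_light, ao):
    # The AO factor modulates the ambient term only: a crevice between
    # leaves receives less "sky light", giving foliage its depth and form.
    return albedo * ambient_light * ao

# A 3x3 depth buffer with a recessed centre pixel: the centre is fully
# occluded by its neighbours, while a corner pixel stays fully open.
depth = [[1, 1, 1],
         [1, 2, 1],
         [1, 1, 1]]
print(ambient_occlusion(depth, 1, 1))   # centre: 0.0 (fully occluded)
print(ambient_occlusion(depth, 0, 0))   # corner: 1.0 (fully open)
print(shade(0.5, 1.0, ambient_occlusion(depth, 1, 1)))  # ambient crushed to 0.0
```

The quality gap between SSAO and HBAO+ comes down to how that occlusion fraction is estimated: better sampling of the depth buffer means a more accurate factor, and therefore more accurate ambient lighting.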
So basically, it's possible that the supposed downgrade of the lighting in the screenshots and media can be explained by CDPR omitting HBAO+ in favor of SSAO in order to enforce parity between the PC and console versions for the sake of marketing!
I still remember the gameplay footage where one of them said we'd be impressed when we see the Ultra/Uber settings on our PCs. Time to laugh hard, I guess?
I think your knowledge of PCs and PC gaming is flawed, I'm very sorry.
Undying actually asked you genuinely, not to put you down.
I would also feel silly if I bought 2 titans for a game that they didn't deliver on. It's a fact that the game was "optimized" for consoles.. we all know that, and it's apparent from the screenshots and the features removed. What else is there to talk about?
This "PCs not being able to run them at 60fps" talk is the most ridiculous thing I've ever heard. PCs ARE MEANT TO BE PUSHED. The 2013 graphics build CDPR showed ran at 45fps on ultra quality on a 780 Ti. That was two years ago; what video cards do we have now? There's no point in PC gaming if stuff can't push our hardware.
By now most of us have already upgraded to a second card in SLI/CF or to stronger cards.
Kinda irrelevant and a waste of resources, really, considering HBAO+ has such short reach in outdoor areas, and in The Witcher you don't play from a near bird's-eye perspective.