The Witcher 3 Official Thread

Discussion in 'Games, Gaming & Game-demos' started by sAAdC, Jan 10, 2013.

Thread Status:
Not open for further replies.
  1. TaskMaster

    TaskMaster Ancient Guru

    Messages:
    1,712
    Likes Received:
    692
    GPU:
    ZOTAC Trinity 4080
    this could be GOTY for me
     
  2. Carfax

    Carfax Ancient Guru

    Messages:
    3,973
    Likes Received:
    1,463
    GPU:
    Zotac 4090 Extreme
    But it's already been stated several times that CDPR is purposely holding back on showing ultra quality until the game's release. Why? Likely for marketing reasons. They don't want to show footage that will alienate console gamers further, or give the wrong impression that the console version will look similar to the PC version on ultra.

    All of their marketing material has been using high settings so far, with the possible exception of the earlier footage such as the reveal trailer and the VGX trailer.

    A lot of material has been shown, so I wouldn't characterize it as "little." In fact, I'd say too much has been shown.

    Art style is subjective, so you have the right to dislike it or think it's flat and cartoony. Most would likely disagree though.

    As for shadows, the only thing that doesn't seem to cast shadows is grass. Larger foliage like trees and bushes does cast shadows though. The grass not casting shadows may be some kind of performance-saving measure for lower settings, or so I'm hoping. Ultra quality will hopefully have more shadows.

    Tons of people are upgrading their rigs for current-gen-only games like The Witcher 3, Batman: Arkham Knight, Star Citizen, etcetera.

    The thing is that PC hardware has gotten so fast, while the spectacular growth of the console market has slowed down the advancement of game technology.

    OK, I'm going to have to raise the B.S. flag here. I used to own a pair of GTX 580s myself back in the day, so I know what they can do. I had them overclocked to 900MHz, and I used them for my first playthrough of Crysis 3 at max settings, 1440p, with Vsync off and SMAA set to 1x.

    I wasn't even close to 60 FPS unless the camera was pointed at the ground. My average frame rate was in the low 30s in the most GPU-bound areas like the Roots of All Evil level, with the highest frame rates hitting the low 40s in more CPU-bound areas such as Enter the Jungle.

    I also played Metro: Last Light with the same GPUs, and although I can't remember my exact frame rate in that game, it was nowhere near 60 at max settings at 1440p, even with a dedicated PhysX card. The frame rate dipped noticeably in areas with volumetric lights and a lot of NPCs, due to their tessellated models.

    At any rate, the GTX 580s were beasts back in their day, but they are undeniably long in the tooth now. Compared to more modern GPU architectures like Maxwell and GCN, they are very weak in compute performance, and as we all know, compute shaders are becoming more and more important for game performance.

    And games that use a lot of compute shaders are the ones where the GTX 580 falls the furthest behind.
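
    To make "compute" concrete: it's data-parallel work (lighting, post-processing, particle and physics updates) done outside the regular vertex/pixel pipeline. As a rough illustration only (nothing here is actual game code, and the kernel name, buffer size and gain value are made up), here's a minimal CUDA sketch of the kind of per-pixel pass a compute-heavy engine runs every frame; this is exactly the sort of throughput where Fermi-era cards like the 580 trail newer architectures.

    #include <cuda_runtime.h>
    #include <cstdio>

    // Hypothetical per-pixel pass: scale the brightness of a frame buffer.
    // Engines run similar data-parallel passes as compute shaders every frame;
    // GPUs with more compute throughput finish them in less frame time.
    __global__ void scaleBrightness(float* pixels, int count, float gain)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
        if (i < count)
            pixels[i] *= gain;
    }

    int main()
    {
        const int count = 1920 * 1080 * 3;              // assumed 1080p RGB buffer
        float* d_pixels = nullptr;
        cudaMalloc(&d_pixels, count * sizeof(float));
        cudaMemset(d_pixels, 0, count * sizeof(float));

        const int threads = 256;
        const int blocks = (count + threads - 1) / threads;
        scaleBrightness<<<blocks, threads>>>(d_pixels, count, 1.2f);
        cudaDeviceSynchronize();

        printf("launched %d blocks of %d threads\n", blocks, threads);
        cudaFree(d_pixels);
        return 0;
    }

    The point isn't the maths, it's the shape of the work: millions of independent threads hammering memory and ALUs, which is what compute-heavy engines and benchmarks stress.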

    GTX 980 vs GTX 580 in the AnandTech bench.
     
  3. (.)(.)

    (.)(.) Banned

    Messages:
    9,089
    Likes Received:
    1
    GPU:
    GTX 970

    Read what you quote; here it is again:
    MalDo's configs kicked ass. I never said I played C3 or LL at 1440p, and I didn't use AA like SMAA 1x/2x, which is rubbish versus the performance hit. As I said, ~55 fps for games like Metro LL and C3, and the stutter that sets in around that mark is easily remedied by applying a bit more mouse smoothing via a config/console tweak. It's only single player after all. AA was usually applied via SweetFX using FXAA, as that gave a good enough result for no fps loss. I don't like anything other than 60fps at max settings. Imo, there's no point playing on PC if it's not a minimum of 60fps @ 1080p maxed settings. Clearly you don't know what the 580s are capable of. Hell, changing to Windows 8.1 has even given a few extra fps, and drivers make a massive difference, along with the SLI bits people discover.

    I never used to be able to run 4x AA in Stalker with AA2 without tanking fps; now I can play at 1440p with 2x AA. :pc1:

    I'm not interested in bull****ting about what my old (and it is old) hardware can or cannot do. I've only kept them this long because there's been no reason to upgrade. What is there? BF4/HL, FC4, AC Unity. None of those games are worthy of a GPU upgrade. Same goes for CPUs atm. Well, on Intel's side.

    As for people upgrading, yes they are, but it's more out of need or simply the standard "due for an upgrade". It's not, and hasn't been, at the level where people upgraded in droves for the games I mentioned.

    Also, calling The Witcher's cartoony art style subjective is like someone saying The Simpsons is borderline uncanny valley. Cartoony is cartoony; there's no way around it. The cliff-face textures remind me of something out of Dragon Ball Z. It's pastel, it's flat, it's Dragon Age: I art-style boring. No, I don't like it. Would that stop me from playing it? No, of course not. I'm just voicing what I think of it, and that I don't think such an art style suits a game that can be quite dark at times. It makes about as much sense as having blood and guts, the likes of which are in Doom, in Zelda or Super Mario Bros. I'm sure texture mods will make this game shine far more than the devs could have even imagined.
     
    Last edited: Apr 11, 2015
  4. RecluSe

    RecluSe Guest

    Messages:
    885
    Likes Received:
    0
    GPU:
    EVGA GTX 980 SC/ACX 2.0
    Looks better than it did before :/ There's a comparison over at the GAF... scroll down a little.

    http://www.neogaf.com/forum/showthread.php?t=1026691&page=19
     
    Last edited: Apr 11, 2015

  5. Calmmo

    Calmmo Guest

    Messages:
    2,424
    Likes Received:
    225
    GPU:
    RTX 4090 Zotac AMP
    Looks like Spyro the Dragon
     
  6. Halfmead

    Halfmead Guest

    Messages:
    275
    Likes Received:
    50
    GPU:
    GigaByte 2070 Super
    I think W3 looks just fine....;)
    Gameplay over gfx!! :p

    Since I got it with the 980 purchase, I'm gonna play it for sure... at least see what it offers gameplay-wise (haven't played the previous two).

    But I'm still waiting for them to get cracking on releasing Cyberpunk 2077, which I REALLY look forward to.
     
  7. evasiondutch

    evasiondutch Guest

    Messages:
    207
    Likes Received:
    0
    GPU:
    2x-MSI gaming 290x 4gb OC
    Console screenshots.

    My modded Skyrim looks better than this shot.

    I fear TW3 is also made too much for consoles and then ported to PC :(

    Almost all the videos on YouTube so far are console footage, and the PC ones are ALL only 1080p (where are the 1440p or even higher ones????). I'm not impressed.

    Hope the game works fine with keyboard and mouse, or it's the last game I buy at full price.
     
  8. Damien_Azreal

    Damien_Azreal Ancient Guru

    Messages:
    11,526
    Likes Received:
    2,192
    GPU:
    Gigabyte OC 3070
    It's sad how it seems far too many people only care about how the game looks.

    All I seem to see on forums is that The Witcher 3 doesn't look good enough. No talk about how good the gameplay could be, or whether the story will be great... just that it doesn't look good enough.
     
  9. WildAce

    WildAce Guest

    Messages:
    439
    Likes Received:
    0
    GPU:
    3x 4GB AMD 290's
    Yeah, I'm not sure what game they're looking at either, because it looks great to me.
     
  10. evasiondutch

    evasiondutch Guest

    Messages:
    207
    Likes Received:
    0
    GPU:
    2x-MSI gaming 290x 4gb OC
    I agree that CD Projekt deserves some respect and people should buy this game at full price.

    One of the few developers left that release their games 100% DRM FREE with all DLC for FREE.

    But many on this site steal games; they don't care if game developers go bankrupt. They keep stealing until they get caught, or until game companies make it impossible to get a game without super-strict DRM restrictions.

    All the thieves who steal games are a cause of the state we're in now concerning games, not to mention the decline of the whole PC market, and it's getting worse.

    I predict that because of this (not the only cause, but it helps) we'll have no PC games (all console crap ported to PC, which already happens a lot) or desktops in five years' time.

    But as I already said, they don't give a ****; they just do what pleases them. These are the kind that cheat their way through life too. Corruption is everywhere, I'm afraid.

    Already know what people will say about this lol.
     

  11. boodikon

    boodikon Ancient Guru

    Messages:
    4,007
    Likes Received:
    106
    GPU:
    Leadtek 8800 GTS 640mb (600 core)
    Great point :)
     
  12. eclap

    eclap Banned

    Messages:
    31,468
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
    Let's not forget that a lot of people have been talking about how great it will look, and how many gurus said they'd upgrade for The Witcher, etc. Now it kinda looks a little bit meh, so let's not be surprised that some are a little bit pissed. It's normal.

    In fact, if the preview screenshots turn out to be much better than what we get in the release version, that should be noted and frowned upon.
     
  13. Solfaur

    Solfaur Ancient Guru

    Messages:
    8,014
    Likes Received:
    1,534
    GPU:
    GB 3080Ti Gaming OC
    Well, I agree, but at least I won't have to worry about upgrading now. :D

    I still think it will be a kick-ass game, even with the downgraded visuals. If not, well... GTA5 will probably keep me busy for a long time (until SW Battlefront, most likely).
     
  14. (.)(.)

    (.)(.) Banned

    Messages:
    9,089
    Likes Received:
    1
    GPU:
    GTX 970
    It's a harmless gfx discussion. No one ever said that it would affect the gameplay.

    This thread so far:

    Guru1: Ahh graphix awesome, the best on your face PC master race!
    Guru2: gfx look meh, seen better under my house!
    Guru1: Far too many people care about gfx, just play the game and smile like cheese.
     
  15. Carfax

    Carfax Ancient Guru

    Messages:
    3,973
    Likes Received:
    1,463
    GPU:
    Zotac 4090 Extreme
    SMAA 1x had an almost nonexistent FPS hit, but had a huge impact on aliasing. I remember toggling it on and off to see if I could get any extra performance, but there was practically no difference.

    You can say there's no reason to upgrade if you're not playing the most advanced games, I suppose. Any Frostbite 3 engine game is going to give your 580s a heavy workout, as FB3 uses a lot of compute shaders. Same with later versions of CryEngine.

     

  16. (.)(.)

    (.)(.) Banned

    Messages:
    9,089
    Likes Received:
    1
    GPU:
    GTX 970
    Are you sure? I recall turning it off as the perf hit was quite heavy. I ain't installing Origin or C3 to refresh my memory, so...

    I'm just going to save that gif you posted. Hahaha, fantastic.
     
  17. Vipu2

    Vipu2 Guest

    Messages:
    553
    Likes Received:
    8
    GPU:
    1070ti
    It's always easy to blame pirates, rather than the games being bad...

    But yes, of course people should buy when games are good, like it seems The Witcher 3 will be, but we don't know that yet.
     
  18. Carfax

    Carfax Ancient Guru

    Messages:
    3,973
    Likes Received:
    1,463
    GPU:
    Zotac 4090 Extreme
    Until we see what ultra quality looks like, I would hold off on accusations of downgrading.

    Ultra quality reportedly gives a nice boost in detail to character models and vegetation quality, whilst also increasing tessellation factors, post-processing effects and physics.

    Even so, the visual quality has to be balanced against the size and scale of the game, which is immense: much bigger than any open-world game so far, with full day and night cycles and dynamic weather. All in all, I'm very satisfied with how The Witcher 3 has turned out, and I'm certain it will become a landmark game that will revolutionize how we think of RPGs.
     
  19. Carfax

    Carfax Ancient Guru

    Messages:
    3,973
    Likes Received:
    1,463
    GPU:
    Zotac 4090 Extreme
    I'm pretty sure. SMAA 1x is akin to FXAA. At most, the performance hit was around 2 or 3 FPS.
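
    Rough frame-time arithmetic (assumed numbers, just to show the scale): at 40 FPS a frame takes 25 ms. A lightweight post-process AA pass that costs ~1 ms pushes that to 26 ms, i.e. 1000/26 ≈ 38.5 FPS, so you only lose 1-2 FPS. It's the multi-millisecond stuff like high MSAA levels that causes the big double-digit drops.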

     
  20. Dorlor

    Dorlor Guest

    Messages:
    1,706
    Likes Received:
    0
    GPU:
    2x evga 780 ti sc acx
    Yeah, SMAA is a favourite of mine: it gets rid of most jaggies, has a less-than-5% performance hit like FXAA, but doesn't blur the image the way FXAA does :)

    I honestly don't understand why more games don't include SMAA... it should be fairly simple, when we can use a simple injector to get it.
     