Nvidia Official Response on Tomb Raider

Discussion in 'Games, Gaming & Game-demos' started by SLI-756, Mar 5, 2013.

  1. MarkyG

    MarkyG Maha Guru

    Messages:
    1,032
    Likes Received:
    195
    GPU:
    Gigabyte RTX 4070Ti
    ^ agree. It's hardly the big, amazing next thing that AMD were touting. Loving this game, just got my flame arrows
     
  2. (.)(.)

    (.)(.) Banned

    Messages:
    9,089
    Likes Received:
    1
    GPU:
    GTX 970
    Wow, cannot believe some of the totally dick-headed comments in this thread. People complain when there's no official statement, then when there finally is one, people bitch like it's not good enough.

    How many times has a PC game been fully playable on day 1? Surely it's well known by now that if you buy a game on day 1, chances are you're a beta tester.

    How long was it before Batman: AC was playable? Six months, if I remember correctly, and Rocksteady didn't even apologize, for they seemed to think it was working fine from day 1.

    How long was it before those who bought Aliens: CM got an apology for that mess of a game, and has an official apology even been released yet?

    Nvidia makes an official statement faster than I've ever seen in the gaming industry and PC gamers are still having a bloody whinge. Get over yourselves.
     
  3. ViperXtreme

    ViperXtreme Ancient Guru

    Messages:
    3,350
    Likes Received:
    230
    GPU:
    RTX 4070 Dual UV
    Aren't Kepler cards supposed to be weaker in compute vs GCN, and even Fermi?
     
  4. Damien_Azreal

    Damien_Azreal Ancient Guru

    Messages:
    11,536
    Likes Received:
    2,203
    GPU:
    Gigabyte OC 3070
    I'm very much looking forward to an update from either Nvidia or Crystal Dynamic on this one.
    With FRAPS on it says my frame rate is in the 50s, yet the game still feels as if it fluctuates and chugs at times, even though the reported frame rate never changes.

    I think some optimized drivers and a few small updates from the developer could do wonders. :)
    But, either way, I'm loving the game.
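    The "counter says 50s, yet it chugs" effect described above usually comes down to frame-time variance, which an average fps readout hides. A minimal sketch of the idea, using entirely hypothetical frame times invented for illustration:

    ```python
    # Hypothetical one-second capture: 59 smooth frames (~18 ms each)
    # plus a single 100 ms hitch.
    frame_times_ms = [18.0] * 59 + [100.0]

    def average_fps(frame_times_ms):
        """Average fps over the capture -- what a simple counter reports."""
        return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

    def one_percent_low_fps(frame_times_ms):
        """fps implied by the slowest 1% of frames -- what stutter feels like."""
        n = max(1, len(frame_times_ms) // 100)
        slowest = sorted(frame_times_ms, reverse=True)[:n]
        return 1000.0 / (sum(slowest) / n)

    print(round(average_fps(frame_times_ms), 1))          # ~51.6: "fps in the 50s"
    print(round(one_percent_low_fps(frame_times_ms), 1))  # 10.0: the hitch you feel
    ```

    A single 100 ms frame barely moves the average, which is why a per-frame-time graph (as in Afterburner) shows stutter that an fps counter misses.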
     

  5. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,110
    Likes Received:
    2,611
    GPU:
    3080TI iChill Black
    ^
    Maybe it's FRAPS' fault; try MSI Afterburner, it shows the variations.

    Yes, praise Nvidia for failing and now all is good, it's Nvidia after all!

    I mean, how can anyone dare say anything against that? We will hang and burn you like witches :grin:
     
    Last edited: Mar 7, 2013
  6. The Chubu

    The Chubu Ancient Guru

    Messages:
    2,537
    Likes Received:
    0
    GPU:
    MSi GTX560 TwinFrozrII OC
    At least AMD doesn't go all "It's not our fault!" when they get the short end of the stick in TWIMTBP games...
     
  7. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    au contraire, mon ami ;)
    They even go so far as to accuse NV of whatever.

    Off the top of my head: aliasing in Batman and Starcraft 2, and infinite weeping by AMD's R. Huddy about tessellation in HAWX and Crysis 2.

    NVIDIA didn't say a single word about Forward+, nor about that useless Global Illumination, or even TressFX. They just said they received the final code a few days ago.
    And apologized.

    Very classy if you ask me :nerd:
     
    Last edited: Mar 7, 2013
  8. The Postman

    The Postman Ancient Guru

    Messages:
    1,773
    Likes Received:
    0
    GPU:
    MSI 980 TI Gaming
    I bet you have green pajamas...

    Anyway I hope all these problems get sorted out so everyone can enjoy the game.
     
  9. (.)(.)

    (.)(.) Banned

    Messages:
    9,089
    Likes Received:
    1
    GPU:
    GTX 970
    Again, how many games are fully playable without issue on release day? Nvidia apologizes and you complain. I bet had they said nothing at all, it would be all about how the devs did a crap job of porting from the console version.

    Even the Crysis 3 thread has less bitching in it than this one, and SLI and game patches were required for that too. Nvidia didn't apologize for that, and now we see why developers and publishers prefer to keep their distance.
     
  10. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,224
    Likes Received:
    1,543
    GPU:
    NVIDIA RTX 4080 FE
    The thing is that if the hair behaved realistically then everyone would just accept it for what it was but as soon as you have any effect acting in an unnatural way then it becomes immediately noticeable. In other words, people only notice these things when they're not doing what they're supposed to otherwise they may not even be aware of them.

    It's a bit like CGI effects in a movie; if they're done well then people don't even know they're special effects but if they're not then they can become jarring and distracting. That is TressFX Hair for me; it looks absolutely stunning in still screenshots but looks far less impressive for 90% of the time when I'm actually playing the game.

    Hopefully, it can be patched to make it look and move much more realistically. AMD also need to assign the hair some kind of weight that varies depending on whether it is raining or not, or whether Lara is wet.
     

  11. Mr.Bigtime

    Mr.Bigtime Ancient Guru

    Messages:
    20,791
    Likes Received:
    11
    GPU:
    4090 on Laptop
    This is why CONSOLES are good for DEVELOPERS: no moaning customers about sh*t GPU drivers, etc., the list goes on. Hopefully the next-gen consoles will shut the mouths of the PC elitists.
     
  12. tibial

    tibial Guest

    Messages:
    67
    Likes Received:
    0
    GPU:
    Zotac 580GTX 3-Way SLI
    Who cares about apologies? People just want to be able to play their damn games properly.

    If people want to bash or bitch about a multi-billion-dollar company like Nvidia/AMD, what do you care and what is your problem? Are you worried they might get angry with us and quit tomorrow?
     
  13. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,110
    Likes Received:
    2,611
    GPU:
    3080TI iChill Black
    A lot, if not all, were fine; I don't have SLI/Crossfire issues.

    And I tend to avoid crappy games. The Batman series was one of them, and since you mentioned Batman before: blame Nvidia and Rocksteady. It was a TWIMTBP game and, as Nvidia like to say, "we worked closely with the devs", yet they both failed badly, just like with Batman: AA. And I knew AC would be the same mess, because AA had the same symptoms and they never fixed those (UE3 + hardware PhysX + streaming conflicts).

    Anyway, I just said it was a weak excuse and I still stand by that. If that's bitching, so be it :p
     
  14. alanm

    alanm Ancient Guru

    Messages:
    12,287
    Likes Received:
    4,490
    GPU:
    RTX 4080
    Yep, you can see the results in Dirt Showdown which utilizes GCN.
     
  15. alanm

    alanm Ancient Guru

    Messages:
    12,287
    Likes Received:
    4,490
    GPU:
    RTX 4080
    I love it when AMD wins a big one vs Nvidia. Keeps NV on their toes and the competition hot. Also may help keep my future NV purchases a little lower in cost. :D
     

  16. ryoohki360

    ryoohki360 Active Member

    Messages:
    62
    Likes Received:
    2
    GPU:
    Gigabyte/GTX660TI/2GIG
    Not only that, Need for Speed only became playable two months after release. A lot of the time these games are not ready for prime time, even on console, but development is pushed to meet quarterly requirements.
     
  17. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,224
    Likes Received:
    1,543
    GPU:
    NVIDIA RTX 4080 FE
    @ ryoohki360 - Yeah, NFS: Most Wanted was a travesty at launch and I only noticed it was completely fixed a few weeks ago, long after I'd given up on the game because the jerky cornering and erratic framerate were annoying me. It now runs pretty much flawlessly on my system, but it's too little, too late as I'm just not interested in playing it any more. To be honest, I enjoyed this game far more on my PS Vita than I did on my PC!
     
  18. tw1st

    tw1st Guest

    Messages:
    173
    Likes Received:
    0
    GPU:
    Asus 3080 ROG Strix
    Hey guys, quick question for ya.

    Currently running a single 680 on the older beta driver 314.07 and I'm not having any issues with Tomb Raider as of yet. I have a second 680 coming in today for SLI (my first SLI setup ever). Since people are reporting many issues with SLI setups on 680s and game crashes, should I update to the latest driver 314.14 to get better SLI performance out of Tomb Raider? Or would it be enough to just add an SLI bit for the game and keep my current drivers?
     
  19. Damien_Azreal

    Damien_Azreal Ancient Guru

    Messages:
    11,536
    Likes Received:
    2,203
    GPU:
    Gigabyte OC 3070
    Keep that in mind... and the next time a game is released that is buggy or not what people were expecting.... and they demand an apology from the developer/publisher... there's your answer.

    It happens. All the time.
     
  20. War child

    War child Master Guru

    Messages:
    336
    Likes Received:
    6
    GPU:
    3090Ti Suprim X
    Well, I'm curious as to what is really happening. I have 2 MSI reference 680s in SLI, and I've played quite far past the first cave, on Ultimate settings at 1080p resolution.

    I don't want to be too specific and spoil anything. But my point is that I have zero bad fps drops and zero freezing/pausing issues like the ones I have seen in some YouTube posts running Nvidia. The only thing I can think of that's different, which I doubt is the factor, is that I am not using a 60 Hz monitor. I think my lowest FPS was 90.

    However, one thing I did notice was that EVGA and Afterburner monitoring still showed the game running at up to 120 fps even when I had the in-game refresh rate set to 60 Hz, or when I forced vsync on in my Nvidia profile, etc.

    Before I get bashed by many of you: yes, I'm not as technologically advanced as many of you, but my attempts at replicating your issues didn't seem to work. No matter what I tried (monitor refresh, the game's refresh option, vsync on, etc.), I couldn't lock the fps down to around the 60 fps mark; the game still seemed to have no limit.
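    For what it's worth, the behaviour above (vsync and refresh settings not capping the reported fps) is why tools like Afterburner/RTSS offer an explicit frame limiter: something in the render loop has to wait out the remainder of each frame's time budget. A rough sketch of just that timing logic (not real game code; the function name and the simulated workload are invented for illustration):

    ```python
    import time

    def run_capped(n_frames, target_fps, render=lambda: None):
        """Run n_frames of (simulated) frame work, sleeping after each
        so the loop never averages faster than target_fps."""
        frame_budget = 1.0 / target_fps
        deadline = time.perf_counter()
        for _ in range(n_frames):
            render()                  # the actual frame's work
            deadline += frame_budget  # when this frame is allowed to end
            remaining = deadline - time.perf_counter()
            if remaining > 0:
                time.sleep(remaining)  # wait out the rest of the budget

    start = time.perf_counter()
    run_capped(10, 100)                    # 10 frames at a 100 fps cap
    elapsed = time.perf_counter() - start  # should be at least ~0.1 s
    ```

    If no such wait exists anywhere in the chain (game, driver, or overlay tool), the GPU simply renders as fast as it can, which would match the 120 fps readings reported above.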
     
