NVIDIA on Tomb Raider performance issues

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 10, 2013.

  1. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,732
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
lol, not at all, I've been peeved at NVidia when they deserve it.
You don't seem to have read anything; I can't see your answer to my questions.

You apply blame where there is none and don't apply blame where it is due.
It's no wonder that companies get away with bad behavior these days; too many people let them get away with it.

So how long should they wait before writing a driver for a game?
Should they wait until the last day in case there is a new update then?
Because that wouldn't give them time to validate a new driver.

The devs have to give the gfx card manufacturers the code they want the driver written for.
     
    Last edited: Mar 10, 2013
  2. The Chubu

    The Chubu Ancient Guru

    Messages:
    2,537
    Likes Received:
    0
    GPU:
    MSi GTX560 TwinFrozrII OC
To decode the textures, yes. It has nothing to do with virtual texturing on the GPU, though.

As I said, nVidia GPUs lack the ability to handle virtual textures inside the GPU; the CPU still has to handle them. The AMD extension addresses exactly that: the possibility of uploading textures with virtual addresses to the GPU. It's not a "fix" for anything, but a new feature.
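For anyone unfamiliar with the term being argued over: virtual texturing keeps a page table that maps pages of a huge "virtual" texture onto a small pool of resident physical pages, and the debate above is about whether that translation happens on the CPU or in GPU hardware. A toy CPU-side sketch of the page-table lookup (sizes and names are purely illustrative, not any vendor's implementation):

```python
# Toy model of virtual texturing: a huge "virtual" texture is backed by a
# small pool of resident pages; a page table maps virtual page -> pool slot.
# Sizes and eviction policy are illustrative only.

PAGE = 128          # page edge length in texels
POOL_SLOTS = 4      # physical pages resident at once

page_table = {}     # (vx, vy) virtual page coords -> pool slot
lru = []            # least-recently-used order of resident pages

def touch(vpage):
    """Mark a page as most recently used."""
    if vpage in lru:
        lru.remove(vpage)
    lru.append(vpage)

def resolve(u, v):
    """Translate virtual texel coords to (pool_slot, offset), paging in on a miss."""
    vpage = (u // PAGE, v // PAGE)
    if vpage not in page_table:           # page fault
        if len(page_table) >= POOL_SLOTS: # pool full: evict least recently used
            victim = lru.pop(0)
            slot = page_table.pop(victim)
        else:
            slot = len(page_table)
        page_table[vpage] = slot          # "upload" the page into that slot
    touch(vpage)
    return page_table[vpage], (u % PAGE, v % PAGE)

slot, off = resolve(1000, 50)   # texel (1000, 50) falls in virtual page (7, 0)
```

When this bookkeeping lives on the CPU, every fault round-trips through the driver; the point of a hardware extension is to let the GPU do the translation itself.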
     
  3. WhiteLightning

    WhiteLightning Don Illuminati Staff Member

    Messages:
    30,788
    Likes Received:
    3,960
    GPU:
    Inno3d RTX4070
LOL, I'm through with this thread, Mufflore.
It's a waste of my time.
     
  4. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
Me too. Looks like he's fine with companies ripping off us customers, and I'm speaking in general; this TR fiasco is just the cherry on top.

    @The Chubu

Well, NVIDIA has already had virtual texturing covered in OpenGL at the hardware/driver level since Fermi; Kepler only improved it with extra optimizations (larger buffers), similar to this new hardware virtual texturing in GCN.
I think it's similar to bindless textures/extensions; not 100% sure, though, I've read this in one white paper.
     
    Last edited: Mar 10, 2013

  5. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,732
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
You're blaming the wrong company, that's why I'm on your case!
If you care to answer the questions I have asked you, I can demonstrate why this is the case.

I am annoyed at the game devs/managers for not allowing NVidia to have the release code, so they could write a driver for it.
They clearly were able to give the release code to AMD, so there is no excuse.

Saying that NVidia should have kept on asking whether there is a newer release of the code is really stupid.
If they have to keep asking and never know when they will have the final release code, they won't ever be able to write a driver for it!
     
  6. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
And how do you know AMD received the newer code sooner? You don't either; all you know is that NVIDIA apparently got the wrong code. That's the whole story.

Also, why on earth do they need that new code? The game should have been playable by default; you shouldn't need magic tweaks to make it work right. I mean, it's not like the Glacier 2 engine is any different than in Hitman: Absolution.

And this is funny too: now they've started to work closely with CD's developers. Why not one month before the game's release?
Looks like that old code was perfect after all, no bugs in there; otherwise they would have contacted CD and said look, there are missing effects in fullscreen, or the game has latency issues with TressFX, or there are the same tessellation bugs as in Deus Ex: HR.

Fair enough?
     
  7. anub1s18

    anub1s18 Member Guru

    Messages:
    157
    Likes Received:
    9
    GPU:
    Nvidia RTX2070S /WC
I agree, NVIDIA is dependent on Crystal Dynamics providing them up-to-date code; if they fail to do that, then it's CD's fault.

NVIDIA wrote the driver for the game build they received. If CD then changed key parts of the game that NVIDIA was not aware of, because they didn't bother to send over the new code/new version of the game, then it's all on CD.

Fact is, we don't know exactly what went on behind the scenes, what the game developer/NVIDIA relationship is like, or how they go about getting their drivers ready for a game. All we can do is speculate, and how you assume things are handled will determine who you blame; but I think most roads lead to CD.

1st question: TressFX is AMD's tech, so they would be closely in on the development to make sure it comes out shining bright, much like NVIDIA is closely involved when PhysX is used.

2nd: see the 1st, and because they seem to have made changes to the game's engine; at least in my mind, that is the only thing that could cause things to break down like they have. But I'm no dev or anything, so that's just speculation.
     
    Last edited: Mar 10, 2013
  8. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
^
Forget TressFX for a moment; there are still quite a few glitches without it, and apparently there were none in that older code. I find this very hard to believe.

But that's just my POV, like WhiteLightning's with his own, that's it.
     
  9. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,732
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
Do you have trouble reading?
AMD's driver at release works with the final release code; not a surprise really, considering they collaborated with CD on the game.
http://www.amd.com/us/press-releases/Pages/amd-and-crystal-2013mar05.aspx

Why do you need the code to write a driver for it?
Why are you asking ridiculous questions? Take a look at what has happened.

Haven't you read anything so far?
NVidia thought they were working closely with CD; they trusted that they had release code.
CD didn't give them release code.
Check post #1 for more info.

You are embarrassing yourself, check post #1.
Yeah, the code was perfect and had no bugs :p
CD changed the code, and NVidia's driver was no longer compatible with it.
Read the OP; it would have helped if you did this at the start.
     
  10. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
I've read it, and not really; I've played other games that still have no profile or optimizations and they just work, old or new code or none. ;p

    /im off.
     

  11. chanw4

    chanw4 Guest

    Messages:
    2,362
    Likes Received:
    26
    GPU:
    NITRO+ RX6800XT SE
If TressFX truly runs on DirectCompute and OpenCL, then why does the developer need to change the code to cater to NVIDIA? Either TressFX is not entirely DC and OpenCL, or NVIDIA is not fully DC and OpenCL compliant.

For a decade, NVIDIA users have been telling AMD users that their drivers suck (which I agree with) and couldn't handle new games, but when it's NVIDIA's turn, it's the developer's fault. I've been saying that it should be the developer's job to make the game work on the standard APIs/libraries available, and to let NVIDIA and AMD handle the standard API/library at the driver level. This is why APIs and libraries are made: to make coding transparent to the hardware. It doesn't matter what hardware you use, as long as your hardware is API-compatible.
     
  12. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
What I find funny is that had things been the other way around, most of the people defending NVidia and bashing Crystal Dynamics would instead be bashing AMD and praising Crystal Dynamics. Kind of funny how hypocritical people on this forum are...

It's the devs' job to write game code and ensure their game works. It's the job of AMD and NVidia to ensure their own hardware works properly. The game devs have no obligation to AMD or NVidia.

TressFX is SquareEnix's "tech"...

TressFX is DirectCompute, which is an area NVidia is lagging in. NVidia doesn't want to support any open standard they think they can avoid, until forced into it.

I've bolded an important part, because that's exactly how things used to work. Now NVidia and AMD ignore portions of APIs and support only those they choose, instead of offering proper, full support, which results in issues.
     
    Last edited: Mar 11, 2013
  13. eclap

    eclap Banned

    Messages:
    31,468
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
I find it weird that NVidia felt the need to come out and say what they did. I don't think it was necessary, and it makes them look butthurt.
     
  14. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti

"TressFX Hair": this technology is the result of months of close collaboration between software engineers at AMD and Tomb Raider's developer, Crystal Dynamics.

    TressFX Hair revolutionizes in-game hair by using the DirectCompute programming language to unlock the massively-parallel processing capabilities of the Graphics Core Next architecture, enabling image quality previously reserved for pre-rendered images. Building on AMD’s previous work on Order Independent Transparency (OIT), this method makes use of Per-Pixel Linked-List (PPLL) data structures to manage rendering complexity and memory usage.

    http://blogs.amd.com/play/2013/03/05/tomb-raider-tressfx/
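The Per-Pixel Linked-List (PPLL) technique the blog mentions is easy to model outside a shader: every transparent fragment is appended to a shared node buffer and linked into a per-pixel head pointer, and a resolve pass sorts each pixel's list by depth and blends back-to-front. A toy Python sketch of the idea (single-channel color, and the buffer layout and blend math are illustrative, not TressFX's actual shader code):

```python
# Toy per-pixel linked list (PPLL) for order-independent transparency.
# Fragments arrive in arbitrary order; resolve() walks a pixel's list,
# sorts by depth, and blends back-to-front. Illustrative only.

nodes = []            # shared node buffer: (color, alpha, depth, next_index)
heads = {}            # pixel (x, y) -> index of its newest node; missing = empty

def append_fragment(pixel, color, alpha, depth):
    """Shader-side step: append a node and swap in the new head pointer."""
    prev = heads.get(pixel, -1)
    nodes.append((color, alpha, depth, prev))
    heads[pixel] = len(nodes) - 1

def resolve(pixel, background=0.0):
    """Fullscreen pass: gather the pixel's list, sort by depth, blend."""
    frags, i = [], heads.get(pixel, -1)
    while i != -1:
        color, alpha, depth, i = nodes[i]
        frags.append((depth, color, alpha))
    out = background
    for depth, color, alpha in sorted(frags, reverse=True):  # farthest first
        out = color * alpha + out * (1.0 - alpha)
    return out

# Fragments submitted out of depth order still blend correctly:
append_fragment((0, 0), color=1.0, alpha=0.5, depth=0.2)   # near fragment
append_fragment((0, 0), color=0.0, alpha=0.5, depth=0.8)   # far fragment
```

On a GPU the append step uses an atomic counter and an unordered-access buffer, which is why the blog frames it as a DirectCompute workload; the CPU version above just uses a plain list to show the data flow.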
     
  15. chanw4

    chanw4 Guest

    Messages:
    2,362
    Likes Received:
    26
    GPU:
    NITRO+ RX6800XT SE
Nowhere does it say it doesn't use DirectCompute; in fact, it says it uses DirectCompute for the technology, which means hardware that supports DirectCompute should be able to run it.
     

  16. Penal Stingray

    Penal Stingray Banned

    Messages:
    957
    Likes Received:
    0
    GPU:
    GTX 680 Tri-Sli-S27A950
So did NVidia pay Crystal Dynamics to change the code so that it runs well on NVidia hardware?
     
  17. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    Is there some reason you think I was addressing you in that post or that I don't read what I post before posting it?
     
  18. drakullas

    drakullas Guest

    Messages:
    1
    Likes Received:
    0
    GPU:
    Nvidia GTX 670 2Gb
NVidia users ask for the maximum QUALITY from the game, not just playing it, or FPS, like 'other' users. That is the problem!
Let them do the job well, trust them, and wait. After that, you can play this game in the REAL best quality mode!
     
  19. macdaddy

    macdaddy Guest

    Messages:
    2,400
    Likes Received:
    4
    GPU:
    TITAN X
Very low GPU usage in SLI; average load I would say is 65%. All this at 2560x1440.
     
    Last edited: Mar 11, 2013
  20. SLI-756

    SLI-756 Guest

    Messages:
    7,604
    Likes Received:
    0
    GPU:
    760 SLI 4gb 1215/ 6800
    :wanker:
     
