Rise of the Tomb Raider performance, and troubleshooting

Discussion in 'Videocards - AMD Radeon Drivers Section' started by oGow89, Jan 27, 2016.

  1. BingoBongo

    BingoBongo Guest

    Messages:
    21
    Likes Received:
    0
    GPU:
    Fury X Crossfire
    I must say, when I cranked the settings up to full and watched the slideshow @ 4K, I was a little disappointed by the graphics. It definitely did not have the wow factor that the original TR did, and I only played that at 1080p.

    There is another performance review here -

    http://forum.overclock3d.net/showthread.php?p=892196#post892196

    But to be completely honest, any performance review right now is a waste of time. It's clear AMD did not have the game until it launched, so the drivers people are running it on are not suited for it at all.

    I haven't heard anything from AMD as to how much better it performs on these new drivers, just that they have Crossfire working and they should be out 'soon'.

    Other than the ten minute test I ran on the game (to see if it supported Crossfire and so on) I have not touched this. I want to wait until it is fixed and enjoy it as I should, instead of getting annoyed at the frame drops and lack of Crossfire support.
     
  2. passenger

    passenger Master Guru

    Messages:
    486
    Likes Received:
    95
    GPU:
    Sapphire RX580 8GB
    I have a 1440p monitor and a Radeon 7870, so I'm prayin' to God for 16.1.1 and/or 25 fps :)
     
  3. BingoBongo

    BingoBongo Guest

    Messages:
    21
    Likes Received:
    0
    GPU:
    Fury X Crossfire
    I am hearing bad things about the copy protection on this game. Apparently it manages to fill up all of your RAM and cause stuttering.
     
  4. AHPD

    AHPD Active Member

    Messages:
    82
    Likes Received:
    4
    GPU:
    Nitro+ RX 6800 XT
    The game becomes a slideshow if you crank things up to 4K plus any SSAA filters.

    Running at 4K (VSR) at max settings in a specific area gives around 23-26 fps.
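    The raw pixel counts explain why: compared to 1080p, 4K alone shades four times the pixels, and SSAA multiplies the internal render resolution on top of that. A rough back-of-the-envelope sketch (assuming 2x2 supersampling as the example mode; numbers are illustrative, not measured from the game):

```python
# Back-of-the-envelope pixel counts (illustrative only).
base = 1920 * 1080            # 1080p
uhd = 3840 * 2160             # 4K / UHD
print(uhd / base)             # 4.0 -> 4K shades 4x the pixels of 1080p

# 2x2 SSAA renders internally at double the width and height:
ssaa = (3840 * 2) * (2160 * 2)
print(ssaa / base)            # 16.0 -> 16x the pixel work of plain 1080p
```

    So 4K + SSAA asks the GPU for roughly sixteen 1080p frames' worth of shading per frame, which lines up with the slideshow frame rates reported above.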
     

  5. Shadowxaero

    Shadowxaero Guest

    Messages:
    223
    Likes Received:
    42
    GPU:
    Zotac RTX 4090 Amp
    Yes it does. I recorded a bit of 4K gameplay and it was using around 12 GB of memory, as seen here. It even got up to 14 a few times.

    https://youtu.be/sWL7OzU_hZM

    But I really do own the game, lol, I have proof. And I am family sharing with her now, but she always wants to play while I am always playing, lol.
     
  6. MaskedMuchaco

    MaskedMuchaco Guest

    Messages:
    76
    Likes Received:
    0
    GPU:
    MSI R9 290 Gaming (H2O)
    I was planning on buying this, but looking at how it performs I'm not too sure, so I may wait for a patch and new drivers before spending my hard-earned pennies.

    Does anybody have any info on how it performs on an AMD CPU with an AMD GPU? I'm not expecting good news. :cry:
     
  7. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    Did anyone read Hilbert's review carefully? Apparently AMD cards use more VRAM (300-500 MB more), and they seem to have the lowest latency in single-core (!!!) configurations.

    WTF.
     
  8. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    Nice (?) try, but this is not the right place to ask about using "family share" to bypass Denuvo locks...

    Some of us are fully aware of this trick.

    I guess Steam will modify the "family share" feature very soon to restrict it to legit uses only.
     
  9. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    Must be the new and "improved" memory management in HBM/drivers, or GameWorks "help"... or both.
     
  10. BingoBongo

    BingoBongo Guest

    Messages:
    21
    Likes Received:
    0
    GPU:
    Fury X Crossfire
    TBH I could probably handle that. Being a 4K gamer, I will take anything over 20 FPS, lol.

    But yeah, the driver should be out soon and I've booked a day off so I can get stuck in :)
     

  11. xxela

    xxela Master Guru

    Messages:
    231
    Likes Received:
    8
    GPU:
    RX6800 XT Red Devil
    This is the only video I found
    https://www.youtube.com/watch?v=96njdiUNjYU
     
  12. AsiJu

    AsiJu Ancient Guru

    Messages:
    8,808
    Likes Received:
    3,369
    GPU:
    KFA2 4070Ti EXG.v2
    Yes, and I couldn't believe a game in 2016 needs a "single-core hack"...

    Though that may apply to the Fury/Nano series only, as I tried setting affinity to 1 core and, as you'd expect, got a lot worse performance.

    That VRAM usage caught my attention too, weird. I monitored usage, and with Very High textures (and shadows and detail level) VRAM usage was near 4 GB @ 1080p.
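    For anyone wanting to repeat the affinity experiment without clicking through Task Manager, it can be scripted. A minimal sketch (note `os.sched_setaffinity` is Linux-only; on Windows the rough equivalent is Task Manager's Set Affinity, or launching with `start /affinity 1 <game>.exe`, where the executable name is just a placeholder):

```python
import os

# Remember the full set of cores this process may currently run on.
all_cores = os.sched_getaffinity(0)  # pid 0 = the current process

# Pin the current process to core 0 only, mimicking the
# "single core" affinity experiment from the review.
os.sched_setaffinity(0, {0})
assert os.sched_getaffinity(0) == {0}

# Restore the original affinity afterwards.
os.sched_setaffinity(0, all_cores)
```

    To test a running game instead of the script's own process, pass its PID in place of 0 (requires sufficient privileges).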
     
  13. MaskedMuchaco

    MaskedMuchaco Guest

    Messages:
    76
    Likes Received:
    0
    GPU:
    MSI R9 290 Gaming (H2O)
    That doesn't seem quite as bad as I expected, I may take the chance on it now and hope drivers or a patch improve things soon.
     
  14. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    I believe the trick works in tandem with the driver thread, so an "affinity trick" wouldn't work. You would have to disable the cores from the BIOS.
     
  15. AsiJu

    AsiJu Ancient Guru

    Messages:
    8,808
    Likes Received:
    3,369
    GPU:
    KFA2 4070Ti EXG.v2
    Yeah, could be. I just wanted to give it a quick shot with affinity and see what happens, as the info in the review was so strange.

    And here I was thinking that this would be the first game to run async shaders with Radeon Software (async shaders being one of the advertised features of Radeon Software).

    Seems like a rushed port indeed, especially as the devs said that at the moment their DX12 code runs worse than DX11, so for that reason there's no DX12 support/patch for the time being (if at all).
    Plus the talk about running Maxwell-only 12_1 feature level features doesn't sound very good either. My tin foil hat is off, but once again it seems Nvidia's participation isn't boding well for AMD owners...

    Though, GameWorks or not, it's up to CD/Nixxes to code the game properly.
     

  16. Valerys

    Valerys Master Guru

    Messages:
    395
    Likes Received:
    18
    GPU:
    Gigabyte RTX2080S
    DX12 may run 'worse' than DX11 just because it would give AMD an advantage. Feature level 12_1 wouldn't be a problem, since the game already renders all its assets using feature level 11_x. :)
     
  17. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    That's another VERY good point.
     
  18. Seren

    Seren Guest

    Messages:
    297
    Likes Received:
    16
    GPU:
    Asus Strix RX570
  19. niczerus

    niczerus Guest

    Messages:
    290
    Likes Received:
    3
    GPU:
    MSI GamingX 580 4GB
    4x AF vs. trilinear filtering: 5-10 fps more.
     
  20. nichenstein

    nichenstein Guest

    Messages:
    22
    Likes Received:
    0
    GPU:
    Sapphire R9 295x2
    So I just bought the game cuz yea, no one has a kind heart... It looks pretty good, but I get terrible stuttering and freezes of 1-3 sec, especially during cutscenes. I have everything on Very High except soft shadows (High). Anyone know a tweak? Oh yea, I played on CFX default. I will try with Crossfire disabled, but I don't think that's the problem. Tried 4K, 2K, and 1080p, same freaking freezes. Should I try 1 core only from the BIOS?


    edited: so yea, I just read that Very High textures only work for 4GB+ cards, so nvm :D
     
    Last edited: Jan 31, 2016
