Rise of the Tomb Raider performance, and troubleshooting

Discussion in 'Videocards - AMD Radeon Drivers Section' started by oGow89, Jan 27, 2016.

  1. jmcc

    jmcc Active Member

    Messages:
    56
    Likes Received:
    5
    GPU:
    MSI RTX 3060 12GB
    GPU Usage is almost always at 99%.
     
  2. AsiJu

    AsiJu Ancient Guru

    Messages:
    8,937
    Likes Received:
    3,465
    GPU:
    KFA2 4070Ti EXG.v2
    Yea, I thought of that too.

    The game was first announced as W10 exclusive (=DX12). NVIDIA jumps in. DX12 is dropped and the game is released a lot earlier than originally scheduled, I think.

    XOne version does async compute. PC version doesn't (would require DX12). AMD does async compute better than NVIDIA atm. It's said DX12 will eventually "require" 12_1 feature level, which would tip async perf towards NVIDIA nicely.

    Like I said, no tinfoil hat here, but there are just too many "coincidences"...

    I used to be a happy long-time NVIDIA owner, but now I think I'll stick with AMD just on principle.
     
  3. OnnA

    OnnA Ancient Guru

    Messages:
    17,963
    Likes Received:
    6,821
    GPU:
    TiTan RTX Ampere UV
    Try this (set -pthreads to your CPU's thread count):

    -D3D11mt -pthreads 6

    or

    -D3D12 -pthreads 6
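
    (In case it's unclear where these go: assuming the Steam version, right-click the game in your library > Properties > Set Launch Options and paste one of the lines above. The "6" is just an example; match -pthreads to your own CPU's thread count, e.g. 8 on a 4-core/8-thread i7.)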
     
  4. ObscureangelPT

    ObscureangelPT Guest

    Messages:
    552
    Likes Received:
    66
    GPU:
    Zotac GTX 1650 Supe
    And BTW, what settings are you actually playing at?
     

  5. isidore

    isidore Guest

    Messages:
    6,276
    Likes Received:
    58
    GPU:
    RTX 2080TI GamingOC
    any news on that damn driver?
     
  6. blaugznis

    blaugznis Member Guru

    Messages:
    125
    Likes Received:
    0
    GPU:
    GTX 1060 6GB
    AMDCare said "soon" 6 hours ago on Twitter :D
     
  7. vancook

    vancook Guest

    Messages:
    70
    Likes Received:
    0
    GPU:
    MSI R9 390 8 GB
    It is an Nvidia strategy: the PC version does not have DX12 so that Nvidia keeps the better performance. If the game had supported DX12, AMD would win in benchmarks.

    Nvidia always plays dirty. It's the only way they can beat AMD.
     
    Last edited: Feb 1, 2016
  8. oGow89

    oGow89 Guest

    Messages:
    1,213
    Likes Received:
    0
    GPU:
    Gigabyte RX 5700xt
    At least they are playing. AMD ain't playing at all.
     
  9. vancook

    vancook Guest

    Messages:
    70
    Likes Received:
    0
    GPU:
    MSI R9 390 8 GB
    Yes, AMD needs to be more aggressive. Wake up, AMD!
     
  10. niczerus

    niczerus Guest

    Messages:
    290
    Likes Received:
    3
    GPU:
    MSI GamingX 580 4GB

  11. Mere

    Mere Guest

    Messages:
    124
    Likes Received:
    4
    GPU:
    amd fury 3840 1100/500
    Dat Geothermal Valley performance! Worse than at the Soviet Installation... and that's at 1920x1200 :)

    Wow, what a broken POS this game is!

    PS: I already have textures set to High (everything else except AO is on the Very High preset)... It looks like the abysmal performance there has nothing to do with VRAM limitations.

    PPS: Nvidia: "The Way It's Meant To Sh-t" on PC gamers... I wonder why every single game that carries this logo has piss-poor optimization :grin:
     
  12. Har3inger

    Har3inger Member

    Messages:
    43
    Likes Received:
    0
    GPU:
    ATI HD 8870m 2GB GDDR5
    For some reason I'm finding AF the single most taxing setting on my setup, even more so than shadows and HBAO+. The difference between trilinear and 2x is like a 25% drop in fps.
     
  13. oGow89

    oGow89 Guest

    Messages:
    1,213
    Likes Received:
    0
    GPU:
    Gigabyte RX 5700xt
    WOW you got the game working on that mobility hd3650? You must be a wizard.
     
  14. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    Similar to my Intel GT2 :rolleyes:
    AMD bloody well knows this. All recent benchmarks done by AMD labs (promo material) typically use 0xAF, as if it were 1992.
    Meanwhile on NV there is a 5% difference between trilinear and 2xAF, and 2% between 2xAF and 16xAF.
    http://abload.de/img/rise-of-the-tomb-raidtqxa0.png

    Also, I've noticed annoying ground shimmering in GTAV, Witcher 3 and War Thunder that closely resembles pre-GCN (HD 4000/5000) AF image stability issues.

    Anyone else think something might be off with AF lately?
     
  15. Inzam47

    Inzam47 Guest

    Messages:
    10
    Likes Received:
    0
    GPU:
    G1 Gaming 980 SLI@1468MHz
    Using 980s in SLI here, can't use Very High textures no matter what. Soviet Installation stutters massively, as do the areas after it.
    It uses around 6 GB at 1080p, and also at 1440p according to 980 Ti owners I know personally.

    Just sad. I wouldn't care much if this was some Gameworks bull****, but textures are the one thing I would not want to sacrifice.
    What is with the huge VRAM usage anyway? This might just be me being paranoid, but it feels as if Nvidia has something to do with this to push 980 Ti and Titan X sales.

    Any tips at all? ANYTHING?
    How are Fury X users doing? That card also has 4 GB IIRC.
    It is sad that the 980s are obsolete already. This was my first high-end build and I thought it would last longer than this... It hasn't even been a year.
     

  16. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    The game is not using ALL 6 GB of the 980 Ti's VRAM at the same time at 1080p; a big chunk of that VRAM must be old allocations (no longer used) and/or preloaded assets (to be used later). There's a rough sketch of how allocation differs from actual usage at the end of this post.

    That's not bad at all.

    The problem is when you don't have enough memory to cover the real-time needs at a given resolution.

    From what I read, Nvidia users can enable SLI and get up to 99% GPU usage:

    http://forums.guru3d.com/showpost.php?p=5226182&postcount=611

    AMD users without multi-GPU support are still waiting for the "lost" 16.1.1 (16.2 now?) driver/hotfix/gift/scam.
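
    To illustrate the allocated-vs-used distinction above: this is a minimal sketch of my own (not from the game or any AMD/Nvidia tool), assuming Windows 10 and DXGI 1.4, which asks the OS how much local video memory the current process has actually committed versus the budget it has been given. Monitoring tools usually report allocation, which can sit far above what a game really touches each frame.

    // vram_query.cpp - minimal sketch: process VRAM commitment vs. OS budget (DXGI 1.4, Windows 10+)
    #include <dxgi1_4.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    using Microsoft::WRL::ComPtr;

    int main() {
        ComPtr<IDXGIFactory4> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

        ComPtr<IDXGIAdapter1> adapter;
        if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;   // first (primary) adapter

        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3))) return 1;                 // IDXGIAdapter3 needs Windows 10 / DXGI 1.4

        // Local segment group = dedicated VRAM on a discrete GPU.
        DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
        if (FAILED(adapter3->QueryVideoMemoryInfo(
                0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) return 1;

        printf("VRAM budget for this process : %llu MB\n", info.Budget / (1024 * 1024));
        printf("VRAM currently committed     : %llu MB\n", info.CurrentUsage / (1024 * 1024));
        return 0;
    }

    A streaming engine watches exactly this kind of budget and keeps filling it with preloaded assets, which is why a 6 GB card can show roughly 6 GB "used" without the current scene actually needing all of it at once.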
     
    Last edited: Feb 1, 2016
  17. red00

    red00 Guest

    Messages:
    67
    Likes Received:
    0
    GPU:
    R9 290X
    FU*K them! will they release that **** or what?
     
  18. Inzam47

    Inzam47 Guest

    Messages:
    10
    Likes Received:
    0
    GPU:
    G1 Gaming 980 SLI@1468MHz
    Thank you for your reply.
    I understand that it may just be caching into VRAM, but then why are my 980s stuttering so badly? Dropping textures to High makes everything smooth.

    Performance in terms of FPS is not bad at all with custom bits, thankfully. I just really want to play with Very High textures, even if I have to lower something else to really low.

    I've been trying to find a workaround.

    Any news on the game's DX12 patch? And whether it will have VRAM stacking?
    I was asking about Fury X VRAM usage btw. I'd be happy to play the game at 45-60 FPS with Very High textures rather than what I have now, a capped 60 with 2xSSAA and injected SMAA.
     
  19. Inzam47

    Inzam47 Guest

    Messages:
    10
    Likes Received:
    0
    GPU:
    G1 Gaming 980 SLI@1468MHz
    Might be the wrong place to ask but do you think I should sell my 980s and get a single Ti?
    Now that games are getting more demanding VRAM-wise and all, and considering the latest games don't support SLI well. :/
    AC Syndicate, JC3, etc.
     
  20. The Mac

    The Mac Guest

    Messages:
    4,404
    Likes Received:
    0
    GPU:
    Sapphire R9-290 Vapor-X
    I'd hold off till the next generation is released.

    4 GB is fine for the moment.
     
