Rise of the Tomb Raider performance, and troubleshooting

Discussion in 'Videocards - AMD Radeon Drivers Section' started by oGow89, Jan 27, 2016.

  1. AHPD

    AHPD Active Member

    Messages:
    82
    Likes Received:
    4
    GPU:
    Nitro+ RX 6800 XT
    Source: http://www.geforce.com/whats-new/gu...guide#rise-of-the-tomb-raider-texture-quality
     
  2. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Yeah, at 1920x1200 I can play for around an hour or so with the Very High texture setting before VRAM is maxed out (Afterburner will report around 3900 to 3950 MB of VRAM used, meaning Windows and whatnot have the remaining 50-100 MB). After that, RAM usage greatly increases, upwards of 10 GB or so (in total, not just what this game uses), and as mentioned, cutscenes in particular can stutter badly once this VRAM limit is exceeded. (At 2880x1800, which I downsample via a custom resolution, that time drops to about 20-30 minutes before it fills up and starts using system RAM instead.)
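
    For anyone curious how a tool can tell when that 4 GB buffer is about to run out and allocations start spilling into system RAM, here's a rough sketch (an illustration only, not necessarily what Afterburner does internally) using DXGI's QueryVideoMemoryInfo; it assumes Windows 10 with the dxgi1_4.h header and skips most error handling:

    Code:
    // Query local (VRAM) vs. non-local (system RAM) video memory usage via DXGI.
    #include <dxgi1_4.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    using Microsoft::WRL::ComPtr;

    int main()
    {
        ComPtr<IDXGIFactory4> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
            return 1;

        ComPtr<IDXGIAdapter1> adapter;
        if (FAILED(factory->EnumAdapters1(0, &adapter)))      // first (primary) GPU
            return 1;

        ComPtr<IDXGIAdapter3> adapter3;                       // QueryVideoMemoryInfo lives on IDXGIAdapter3
        if (FAILED(adapter.As(&adapter3)))
            return 1;

        DXGI_QUERY_VIDEO_MEMORY_INFO local = {}, nonLocal = {};
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &nonLocal);

        printf("VRAM      : %llu MB used of %llu MB budget\n",
               local.CurrentUsage >> 20, local.Budget >> 20);
        printf("System RAM: %llu MB used of %llu MB budget\n",
               nonLocal.CurrentUsage >> 20, nonLocal.Budget >> 20);

        // Once CurrentUsage approaches the local Budget, new allocations get demoted
        // to the non-local segment (system RAM over PCIe), which is roughly the point
        // where the cutscene stutter described above starts to show up.
        return 0;
    }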

    Pretty sure the R9 390 with its 8 GB VRAM buffer would do better for longer play sessions, plus of course the 980 Ti with 6 GB and the Titan X with 12 GB of VRAM on the Nvidia side.

    Going by the GeForce.com guide's comparison images, Lara's player model shows the greatest difference in texture quality between High and Very High; most of the terrain and such differs mainly in normal and specular texture clarity, with diffuse (color) detail largely unchanged.
     
  3. lowrider_05

    lowrider_05 Active Member

    Messages:
    71
    Likes Received:
    2
    GPU:
    AMD RX 6900 XTXH LC
    I have played the game for about 6 hours now and can say that it runs pretty well on an all-AMD system. I play with everything on Very High, incl. Pure Hair and HBAO+ @ 2560x1440 VSR, and it never drops under 30 fps, with a 45-50 fps average.
     
  4. oGow89

    oGow89 Guest

    Messages:
    1,213
    Likes Received:
    0
    GPU:
    Gigabyte RX 5700xt
    Garbage performance.

    Back in 2006-2007, when we were still in the PS2 era and the early PS3/360 era, the newer consoles delivered better-looking games than most PCs; back then we lacked the hardware. That is where "Can it run Crysis?" came from. The game was for years both a benchmark and a technical wonder, and it still is. It was a pretty game that even now, if you launch it, will make you lose hours gazing at the water, the sun reflections, and the beach, among other scenes that paint an unforgettable image in your mind that remains even after closing the game. When it came out, no hardware was capable of running the game at acceptable framerates on the highest quality. That was understandable for two reasons:

    1. Hardware didn't cost as much as today, the likes of over-hyped hardware didn't exist, and no one had to pay over $1,000 for a single GPU. The 8800 GT, which would be the equivalent of the GTX 970 and R9 390 now, was around $200 on release.

    2. Efficiency existed only on consoles, and the software wasn't there either. We had no GameWorks crap, and yet developers jumped in and tried to use the latest API, which at the time was DX10, with the likes of Call of Juarez, Crysis, Assassin's Creed, Lost Planet, Gears of War, BioShock, and the list goes on. Sure, it was partly advertising for Windows Vista and the difference wasn't that big, and Crysis was later unlocked to use Very High settings on Windows XP, but developers started implementing it rather quickly.

    This, on the other hand, is as garbage as Batman: Arkham Knight in terms of performance. We have both the hardware and the software, and the game makes use of neither. When a game uses one core and makes no use of the others, it is junk. When a developer has the tools to implement a more efficient API and chooses not to for whatever reason, that is stupid.

    You shouldn't be happy. Your GPU was advertised to run games at resolutions higher than 1440p with max details and all the eye candy at high performance, and it does in almost every well-developed and optimized game. If a GPU that costs over $1,000, like the Titan X, runs into problems running a game at 1080p with everything maxed out, then we have a problem. Your GPU costs 500+ euros/bucks, and you shouldn't be getting less than 60 fps in any game at 1440p, no matter how good it looks. If the game runs on a sub-$300 console and even on a 10-year-old Xbox 360, be it at low settings and resolution, a GTX 750 Ti shouldn't be struggling at 13 fps. Games that run like crap shouldn't sell well. This is becoming more of a fashion trend, with developers betting on how broken a game can be before it stops selling. In a few years' time, they will just sell us their plans for the game, and we will have to develop it ourselves. DX12 exists and is already complete, so why not use it? Mantle existed before it, so why not use that? It's free. So really, I don't know what you are happy about.
     
    Last edited: Jan 31, 2016

  5. Spartan

    Spartan Guest

    Messages:
    676
    Likes Received:
    2
    GPU:
    R9 290 PCS+
    Actually, for an FX CPU + AMD GPU rig, that is fine performance nowadays.
     
  6. lowrider_05

    lowrider_05 Active Member

    Messages:
    71
    Likes Received:
    2
    GPU:
    AMD RX 6900 XTXH LC
    OK, OK, I always set my games to run between 30 and 60 fps with the maximum quality/AA/downsampling possible to achieve that. If I set it to 1080p with FXAA only, I almost always have 60 fps.

    And with max settings it does look a good bit better than on an Xbox One, and I will remind you that there is an optimized driver in the works for this game. But for now I'm glad to see this performance at launch.
     
  7. red00

    red00 Guest

    Messages:
    67
    Likes Received:
    0
    GPU:
    R9 290X
    F*CK THEM! They ALWAYS delay it until Saturday and then bingo, it's the weekend vacation! F*CK YOU AMD and ALL your staff!
    Preload, pre-purchase, pre-order, playing before the others. I really don't know the meaning of these words these days. I, who pre-ordered the game 6 months ago, am in exactly the same position as someone who buys the game a week after release. Thanks to you, AMD. And f*ck you again!
     
  8. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    Not true.

    LOL no. Where do you get this from?

    I'm happy because we got one of the best-looking games ever, if not the best-looking. And I'm buying it as soon as AMD finds the time to fix it.
    Why shouldn't I(we)?

    BTW... I am surprised that 4GB+ (8GB on the R9 390/X) is starting to make sense even below 4K. I had expected 4GB to stay "future-proof" longer than this.
     
    Last edited: Jan 31, 2016
  9. Shadowxaero

    Shadowxaero Guest

    Messages:
    223
    Likes Received:
    42
    GPU:
    Zotac RTX 4090 Amp
    Well, have you ever compared your VRAM usage to the 980 Ti's in other games? I know that in games like Shadow of Mordor, GTA V, and even Arkham Knight, our Fury for the most part uses quite a bit less VRAM than cards with more GDDR5. And it has often been said that our 4GB of HBM is equal to 6, and in some cases 8, GB of GDDR5 due to the speed at which it can move data in and out.

    I don't know, this is the first game I have been having RAM issues with. I mean, SE recommends a 970 and 8GB of RAM, and yet I am using 14GB of RAM...
     
  10. ObscureangelPT

    ObscureangelPT Guest

    Messages:
    552
    Likes Received:
    66
    GPU:
    Zotac GTX 1650 Supe
    Hi guys, I've been evaluating the performance for days.

    To start off, the problem with the AMD framerate is basically the AMD DX11 driver overhead problem.

    I saw a lot of comments here that got me pissed off, like putting the blame on GameWorks and Denuvo.
    Not sure how many Denuvo games you've played, but while I'm not in love with it, I think we have seen much worse DRM, SecuROM for example, and I played the original and cracked versions of Mad Max and both ran the same.

    Second: GameWorks.
    I don't think that ****ty Nvidia GameWorks is doing anything to the performance here; the only GameWorks effect present is HBAO+, and the rest of the stuff is mostly their own Crystal Engine rendering and AMD's Pure Hair solution.

    Third: the Digital Foundry review is bad. They say the PC version has async compute, which is a big lie, since we would need DX12 or Mantle for that. It's not the first time I've seen them talking about things they don't know; the last mistake I noticed was Leadbetter saying that H170/H110 and B150 boards couldn't run RAM above 2133 MHz and that you needed a Z170 to run RAM at higher speeds, LOL.
    Leadbetter repeated that at least 5 times, and it is completely false.

    Fourth, if you read the DF review carefully, they say it's impossible to match the Xbox One performance/settings, since shadows and tessellation are different.
    The only time the Xbox One uses tessellation is for the snow footprints, while on PC tessellation is used heavily all over the place, yet DF turned tessellation on as part of getting "close" to the Xbox One -.-

    Shadows on the Xbox One have the same resolution as the PC's High setting, but the Xbox One is missing shadows from a lot of objects that are present on the PC. Those two reasons are why a GTX 960 is needed to "match" the Xbox One.

    Fifth, the game is heavy because it relies on a lot of heavy effects. Matching Xbox One quality is actually easy, but pushing soft contact shadows, tessellation all over the place, Pure Hair at Very High with 30K strands of hair instead of a handful, plus a lot more view distance and objects, it's normal that it requires massive hardware.

    Sixth: the only drawback of the port, in my opinion, is the VRAM usage. The Xbox One can match the Very High textures, while on PC 4GB of VRAM doesn't seem to be enough for them.

    Seventh: **** Nvidia and Square Enix if they release a DX12 patch for the game that doesn't work on AMD. That would be demonic, because DX11 performance on AMD sucks big time, and it's urgent that more DX12 comes to PC, since AMD needs it much more than Nvidia does.
    If AMD can't benefit from DX12 because of Nvidia's stupid marketing, then AMD is ****ed not only in this game, but also in the next games that follow this strategy.
     
    Last edited: Jan 31, 2016

  11. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    4GB of vram cannot equal anything other than 4GB of vram. Now as for marketing and hype, that's another story, but it's just that, a story.
     
  12. Shadowxaero

    Shadowxaero Guest

    Messages:
    223
    Likes Received:
    42
    GPU:
    Zotac RTX 4090 Amp
    Of course we know it doesn't literally equal more VRAM, but I do know that in some games a Fury will only use 3GB where a 980 Ti would use 5.
     
  13. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    I didn't argue that :) Different hardware, different drivers, etc. It also goes vice versa in the case of this Tomb Raider, as testing shows. Though it doesn't seem to be too much of a difference.
     
  14. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    It will be interesting to see whether memory consumption changes once AMD releases 16.1.1, or whatever they end up calling it. Apparently it might be released tomorrow, but it could also take longer, I suppose, since there's no official confirmation of when it's scheduled; it looks like it might originally have been intended for a Friday release last week, but some last-minute changes must have been made, so it got delayed a bit.
    (Perhaps a larger 16.2 beta could follow shortly afterwards, assuming this next driver is primarily about this game; shrinking down the list of known issues from 16.1 would be welcome, for example.)
     
    Last edited: Jan 31, 2016
  15. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    If nomenclature logic means anything to AMD, a driver released in February '16 should be named 16.2.(x) and not 16.1.1.
     

  16. Shadowxaero

    Shadowxaero Guest

    Messages:
    223
    Likes Received:
    42
    GPU:
    Zotac RTX 4090 Amp
    I know for me personally, playing Tomb Raider at 1440p uses around 3.4-3.5GB on average, but my system will use as much as 14GB of its RAM in certain areas >.<
     
  17. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    My (gaming) life got a lot simpler and easier once I stopped using the AB overlay to feed my head with information I shouldn't really care about ;)
     
  18. jmcc

    jmcc Active Member

    Messages:
    56
    Likes Received:
    5
    GPU:
    MSI RTX 3060 12GB
    The Soviet Base is the most hardware-demanding area of the game. On my humble rig I can get above 60 fps in enclosed areas like caves and such, but in that Soviet base the fps sometimes drops to 15. Not to mention a lot of stuttering when the game saves, heavy stuttering in some cutscenes, and even a BSOD.

    I don't know why, because this game runs even on 10-year-old hardware, the Xbox 360. The only explanation is very poor optimization.
     
  19. ObscureangelPT

    ObscureangelPT Guest

    Messages:
    552
    Likes Received:
    66
    GPU:
    Zotac GTX 1650 Supe
    Can you monitor your GPU usage in that Soviet area, please?
     
  20. Valerys

    Valerys Master Guru

    Messages:
    395
    Likes Received:
    18
    GPU:
    Gigabyte RTX2080S
    That's not possible; if a card supports DirectX 12, it will launch the game in DirectX 12 no matter the feature level. The missing feature level 12_1 on AMD would, at most, mean some specific in-game settings get disabled, but even that is very unlikely; so far the sub-feature levels have been all about optimizing certain effects, not disabling them altogether, like Battlefield with 11.1, or Assassin's Creed with 10.1, which only had some anti-aliasing optimization going on but didn't hide any AA modes from 10.0 cards.
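
    To illustrate the point with a quick sketch of my own (assuming the Windows 10 SDK's d3d12.h, error handling trimmed): a D3D12 device only needs the adapter to meet the minimum feature level the app asks for, 11_0 here, so a GPU without 12_1 still gets a full DX12 device, and the optional 12_1 capabilities are simply probed afterwards and skipped if absent.

    Code:
    #include <d3d12.h>
    #include <dxgi1_4.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")
    #pragma comment(lib, "dxgi.lib")

    using Microsoft::WRL::ComPtr;

    int main()
    {
        ComPtr<IDXGIFactory4> factory;
        CreateDXGIFactory1(IID_PPV_ARGS(&factory));

        ComPtr<IDXGIAdapter1> adapter;
        factory->EnumAdapters1(0, &adapter);                  // primary GPU

        // Minimum feature level 11_0: any DX12-capable card creates a device here,
        // regardless of whether it exposes feature level 12_0 or 12_1.
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
        {
            printf("No DX12 driver at all - only then is a DX12 path unavailable.\n");
            return 1;
        }

        // Probe the optional capabilities that make up feature level 12_1.
        D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));

        printf("Conservative rasterization tier: %d\n", (int)opts.ConservativeRasterizationTier);
        printf("Rasterizer-ordered views:        %s\n", opts.ROVsSupported ? "yes" : "no");
        printf("Resource binding tier:           %d\n", (int)opts.ResourceBindingTier);
        return 0;
    }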
     
