Review: Red Dead Redemption 2: PC graphics benchmark analysis

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 5, 2019.

  1. Nizzer1982

    Nizzer1982 New Member

    Messages:
    6
    Likes Received:
    4
    GPU:
    1080
    I can 100% confirm Pascal is not gimped with this game. It's what I'm running and, like I posted, I've had no issues at all. There's clearly something else going on here that's affecting some players and not others - it may be an issue with the 'ti' specifically, as I've read similar complaints in other forums from 'ti' users. Only time will tell, I guess.
     
  2. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,793
    Likes Received:
    1,148
    GPU:
    EVGA 1080ti SC
    Remember, GN is getting completely different results in their tests. It looks like all the blame should fall at R*'s feet. Again, glad I'm not one of the paying alpha testers.
     
    airbud7 likes this.
  3. Denial

    Denial Ancient Guru

    Messages:
    13,355
    Likes Received:
    2,857
    GPU:
    EVGA RTX 3080
    Nah - honestly I'm over it and I think I'm just going to leave the forums.

    https://www.guru3d.com/index.php?ct=articles&action=file&id=43782

    https://www.guru3d.com/index.php?ct=articles&action=file&id=43790

    https://www.guru3d.com/index.php?ct=articles&action=file&id=43796

    https://www.guru3d.com/index.php?ct...dmin=0a8fcaad6b03da6a6895d1ada2e171002a287bc1

    https://tpucdn.com/review/nvidia-ge...ion/images/monster-hunter-world_1920-1080.png

    https://tpucdn.com/review/nvidia-geforce-rtx-2080-founders-edition/images/hellblade_1920-1080.png

    https://tpucdn.com/review/nvidia-ge...dition/images/rainbow-six-siege_1920-1080.png

    In every one of these games/benchmarks the 2080 is performing anywhere from 10 to 35% faster than a 1080Ti. The average of all the differences between the 1080Ti and 2080 in what I posted above is 19.5%. So no, I really don't recall the two being at the same performance level - unless of course you consider 19.5% over 7 modern titles the same performance level. I don't.

    Hilbert's DX12 results for RDR2 show a 23% difference between the 1080Ti and 2080. Techpowerup shows a 16.6% difference. Hardware Unboxed shows a 12.2% difference - averaging 17.3% - so not only is it not an outlier in performance, it doesn't even exceed the average of the games listed above.
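    Just to show the arithmetic, here's a quick Python sketch averaging the three published deltas (the 23 / 16.6 / 12.2 figures are the ones quoted from the reviews above):

```python
# 1080Ti-vs-2080 deltas (percent) quoted from the three RDR2 reviews.
deltas = {
    "Guru3D (DX12)": 23.0,
    "Techpowerup": 16.6,
    "Hardware Unboxed": 12.2,
}

average = sum(deltas.values()) / len(deltas)
print(f"average delta: {average:.1f}%")  # -> average delta: 17.3%
```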

    It's really annoying to me that I had to spend the last 10 posts or whatever just correcting people's misinformation. A post that took you 5 seconds to write - "You remember when 1080ti was 2080 performance level? :p" - which we saw people earlier in this thread read into, took me almost 20 minutes to completely disprove. Then you do it again right under that post:

    He didn't say that. He said it runs better in this one area (literally by 3 fps at max, and it's on video for like 5 seconds) but then goes on to say that area is an outlier; it spends the rest of its time capped at 60 because they have vsync on. Every single benchmark - Hilbert/Unboxed/GN/Techpowerup - has the 1660 clearly winning.

    So now someone comes into this forum, sees your post, and goes "580 runs the game better than the 1660" based on you completely taking a picture/quote out of context. Cool.

    Then you have this:

    So after I spent like 8 pages refuting this garbage it's just another guy posting the same crap based on literally nothing.
     
    Last edited: Nov 9, 2019
  4. Undying

    Undying Ancient Guru

    Messages:
    15,731
    Likes Received:
    4,755
    GPU:
    Aorus RX580 XTR 8GB
    Great effort to prove everybody wrong. You know I respect your comments, but sometimes you can be really stubborn.
     
    Last edited: Nov 9, 2019

  5. pharma

    pharma Ancient Guru

    Messages:
    1,721
    Likes Received:
    523
    GPU:
    Asus Strix GTX 1080
    I do sympathize. The amount of stupid, fake news people post on this site can be overwhelming at times, but at those times we just need to take a break and reset. :)
     
    fantaskarsef and airbud7 like this.
  6. Turanis

    Turanis Ancient Guru

    Messages:
    1,777
    Likes Received:
    471
    GPU:
    Gigabyte RX500
    "On a horse, in the snow and storm. Hi-Hoo Hilbert, yee-haw!"
    Great review, Mr. Hilbert. Now it's complete. Many thanks. :)

    I don't think it's gimped; they do the usual for older GTX cards: they launch a new driver for a new game, but they don't optimize it for the older GTX. It's just a "supported" driver to match the older GPUs.
     
  7. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,793
    Likes Received:
    1,148
    GPU:
    EVGA 1080ti SC
    And sometimes you crap post. Maybe from now on, unless you can substantiate your claims, don't post them. We know you view things with a slant towards AMD.
     
    Stormyandcold and yasamoka like this.
  8. SpajdrEX

    SpajdrEX AMD Vanguard

    Messages:
    2,805
    Likes Received:
    1,087
    GPU:
    Sapphire RX 6800XT
    So how does Async Compute improve fps/smoothness on Turing cards - has anyone tried it?
     
  9. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,849
    Likes Received:
    244
    GPU:
    EVGA GTX 1080Ti SC
    So this is all you could reply to evidence against your claim. Great effort to show you're a douchebag.
     
  10. pharma

    pharma Ancient Guru

    Messages:
    1,721
    Likes Received:
    523
    GPU:
    Asus Strix GTX 1080
    Probably no different from how Async Compute improves fps/smoothness on Navi.
     

  11. haste

    haste Maha Guru

    Messages:
    1,252
    Likes Received:
    389
    GPU:
    GTX 1080 @ 2.1GHz
    I guess you should, because you can't even get your facts straight. The difference between the RTX 2080 and 1080Ti in most benchmarks was up to 5%.

    DX12:
    https://www.guru3d.com/articles_pages/geforce_rtx_2080_founders_review,20.html
    DX12:
    https://www.guru3d.com/articles_pages/geforce_rtx_2080_founders_review,17.html
    or GTA5 - the most relevant:
    https://www.guru3d.com/articles_pages/geforce_rtx_2080_founders_review,30.html

    ^The same performance, or the 1080Ti even pulls ahead.

    Now, the difference in RDR2 between the 2080 and 1080Ti is 44% at 2560x1440 and 30% at 4K.

    So stop being in denial and accept the fact that there is either something wrong with this review or Pascal is totally gimped in RDR2.
     
    Lucifer likes this.
  12. Digilator

    Digilator Master Guru

    Messages:
    491
    Likes Received:
    135
    GPU:
    Sapphire 5700XT
    Those with AMD GPUs may want to try this tweak. It's supposed to fix stutter (frame times) when used with Vulkan.

     
  13. Denial

    Denial Ancient Guru

    Messages:
    13,355
    Likes Received:
    2,857
    GPU:
    EVGA RTX 3080
    Actually, it's about 8% on average over the 23 games that techpowerup tested. I don't know why that's relevant, though, when the difference here falls within the range seen at launch - I'm not sure what part you're having trouble comprehending, but at launch we saw at least 3 titles with a ~30%+ performance difference between the 1080Ti and 2080 (Wolfenstein, coincidentally also a Vulkan game, up at 35%) and multiple others in the 10-20% range. So why is yet another title performing at that level strange? How do you explain the difference in all those other titles? Did Nvidia pre-gimp Pascal in those 7 titles? Is something wrong with the reviews of those 7 titles?

    More realistically, it's probably what I and Astyanax wrote in the first several pages - Turing has significant architectural enhancements compared to Pascal, some of which make it far more similar to GCN/RDNA. Could it be possible that the developers leaned into these changes? Could it be possible that at launch the 2080's drivers were immature - which would explain why the 1080Ti sometimes performed faster, yet other times the 2080 performed 35% faster?

    Anyway, honestly, yeah, this review is kind of an outlier compared to others. As I also stated in the actual post you quoted (so I'm not sure how you didn't read it), Hardware Unboxed/Gamers Nexus/Techpowerup all reviewed this game in various other locations, and the results between the 1080Ti/2080 (GN didn't review a 2080) were far closer (for example only 22% with techpowerup @ QHD - 18% at 1080p // Hardware Unboxed 14% at UHD - 13% at 1080p). So even in this extreme review, where the 1080Ti/2080 were measured furthest apart, the game is still basically performing within the range seen when the 2080 launched - and if you average the reviews of all 4 publications, it's actually well within the expected difference.

    Also, to be clear - it's entirely possible newer drivers will help Pascal close the gap, but the argument that this result is evidence of Nvidia gimping Pascal doesn't hold water. Not only is the performance difference within the range of what we saw from the 2080 at launch, but even if it weren't, there are a dozen and a half reasons why that could be that are not related to Nvidia intentionally downgrading Pascal's performance.
     
    Last edited: Nov 9, 2019
  14. haste

    haste Maha Guru

    Messages:
    1,252
    Likes Received:
    389
    GPU:
    GTX 1080 @ 2.1GHz
    I really don't understand your post. If a summary of multiple reviews (which is actually less relevant than the GTA5 result) shows an 8% difference, that only proves my case that Pascal is gimped, and even supports my 10-20% figure.

    And more importantly, where have I ever said that Pascal is gimped by NVIDIA? You should read my post again. If we had that difference under DX11, I'd blame NVIDIA. In the case of Vulkan/DX12, I blame R*... as I wrote in my original post. There is no magical Vulkan extension or DX12 FL12.1+ functionality making the 2080 20% faster.
     
  15. alanm

    alanm Ancient Guru

    Messages:
    10,227
    Likes Received:
    2,375
    GPU:
    Asus 2080 Dual OC
    Irrespective of percentages, arch changes, etc., I think Nvidia realizes it would be a huge disservice to themselves to be viewed as gimpers of last-gen HW. Upset owners might be more likely to switch to AMD for future-proofing than buy something that may not perform well enough with each passing gen.
     

  16. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,465
    Likes Received:
    115
    GPU:
    GTX Titan Sli
    Why should GTA V be the most relevant? The PC version was released four years ago; it's an old game at this point. The engine has had years of further development.

    There are multiple differences that could account for Turing's sudden performance advantage, which have already been mentioned. RDR2 might be hitting architectural bottlenecks on Pascal but not on Turing, and Turing might have gotten more refined drivers. Turing is actually just a little over one year old right now.

    The big driver improvements tend to come when the architecture is still considered "new"; a game-specific driver that gives merely 10% better performance is typically considered good.


    Go back to when the GeForce 8 series was released and it put the 7 series to shame in a lot of titles. Go back to when Fermi was released and AMD users shat themselves seeing that Fermi was much more efficient at DX11.
    Go back to GCN's release and see how the much more efficient architecture let even weak cards like the HD 7790 come dangerously close to the former flagship HD 6970 in DX11 titles.
     
  17. Denial

    Denial Ancient Guru

    Messages:
    13,355
    Likes Received:
    2,857
    GPU:
    EVGA RTX 3080
    Averaging across the cards doesn't give the full picture - you need to look at the range the individual games fall in:

    I'm going to try to illustrate it with an extreme example: let's say a 3080 comes out with the following performance compared to the 2080 (I'm simulating the numbers):


             RTX 3080 | RTX 2080
    Game 1:    75 FPS |   65 FPS
    Game 2:    85 FPS |   55 FPS
    Game 3:    55 FPS |   85 FPS
    Game 4:    65 FPS |   75 FPS
    Game 5:    52 FPS |   50 FPS

    Over these five games the increase is about 0.4 fps (~0.6% on average across the 5 games), because aside from game 5 the other games cancel each other out - the new 3080 is only giving us ~0.4 fps more than the 2080 on average over these 5 games, but in game 2 the 3080 is 54.5% faster. So let's say a game 6 comes out and it's

    Game 6:    80 FPS |   60 FPS

    Now the 3080 is ~3.7 fps faster (~5.6% on average across the 6 games) than the 2080, but this sixth game isn't even outside the existing range - it's only 33% faster, which is slower than the 54.5% of game 2, yet the average increase barely moved. That's no different than what's happening here - there are 7 examples I found (8 if you include this game, and 9 if you include Middle Earth: Shadow of War, which I just happened to come across) where the 2080 is 10%+ higher than the 1080Ti. In some of these games it's 20%+; in extreme cases, like Wolfenstein, it's 35% - all of these titles measured at the 2080's launch. Now you have RDR2 coming in at ~20% if you average all the released reviews. Compared to that 8% from techpowerup it looks bad, but compared to the 9 other games there are far worse offenders.
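    To make the arithmetic concrete, here's a quick Python sketch recomputing the averages from the simulated table (these FPS numbers are the made-up ones from my example, not real benchmarks):

```python
# Simulated FPS from the example above (not real benchmarks).
rtx_3080 = [75, 85, 55, 65, 52]
rtx_2080 = [65, 55, 85, 75, 50]

def pct_gain(new, old):
    """Percentage gain of average FPS of `new` over average FPS of `old`."""
    return (sum(new) / sum(old) - 1) * 100

print(f"5 games:      {pct_gain(rtx_3080, rtx_2080):+.1f}%")   # roughly +0.6%
print(f"game 2 alone: {(85 / 55 - 1) * 100:+.1f}%")            # roughly +54.5%

# Add game 6 (80 vs 60 FPS): a 33% win that barely moves the average.
print(f"6 games:      {pct_gain(rtx_3080 + [80], rtx_2080 + [60]):+.1f}%")  # roughly +5.6%
```

    The point the sketch makes: one 54.5% outlier and one 33% outlier coexist with a tiny overall average, so "the average is only X%" tells you nothing about how far apart two cards can land in any single title.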

    So the question becomes, why are there examples of games 35% faster on Turing vs Pascal?

    Some people here are saying "Nvidia artificially gimped Pascal" - and you're correct, you didn't state Nvidia did this, and I apologize for that. Other people are saying "Nvidia isn't updating Pascal to the same degree", and others are saying "there are differences between Pascal/Turing that could lead to Turing coming away faster". I don't buy the first two - because, like I said, at launch there were numerous examples of games significantly faster on Turing than Pascal, and in some cases vice versa. So someone arguing the first two points would also have to explain the difference in those 7 - now 8 - older games that predate RDR2... and if they did have some other explanation, why isn't that explanation good enough for this title?

    To me the answer is the third option.

    Nvidia specifically stated that Turing has roughly 50% more performance per CUDA core than Pascal. The 2080 has ~18% fewer CUDA cores than the 1080Ti (2944 vs 3584) but is clocked slightly higher - even if we ignore the clocks, based on Nvidia's estimation the 2080 should on average be ~23% faster than the 1080Ti, making up the CUDA core difference and then some.
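    Working that out (a back-of-the-envelope sketch; the core counts are the published specs, the 1.5x per-core figure is Nvidia's claim, and clock differences are ignored):

```python
# Published CUDA core counts for the two cards.
cores_1080ti = 3584
cores_2080 = 2944

# Nvidia's "roughly 50% more performance per CUDA core" claim for Turing.
per_core_uplift = 1.5

# Naive expected throughput ratio, ignoring clocks, memory bandwidth, etc.
expected = cores_2080 / cores_1080ti * per_core_uplift
print(f"expected 2080 vs 1080Ti: {(expected - 1) * 100:+.0f}%")  # -> roughly +23%
```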

    That 50% more performance comes from a number of things, but one in particular is Turing's ability to dual-issue INT/FP:

    https://devblogs.nvidia.com/wp-content/uploads/2018/09/image5.jpg

    They state:

    More and more modern titles are making use of integer instructions, which is one of the reasons Nvidia split this up in the first place. You don't need a magical Vulkan extension to lean into this performance - if your game makes heavy use of integer compute, then it's just naturally going to be faster on Turing/GCN/RDNA than it is on Pascal. Short of maintaining a completely separate shader path and two different instances of multiple shaders/effects/(arguably the entire game), there is no way for Rockstar or Nvidia to simply optimize this for Pascal. The game just naturally uses integer compute and thus it's going to be faster on Turing - that's really it.

    Further, Turing also supports FP16 - I have no idea if RDR2 is utilizing it, but I know Far Cry 5/Wolfenstein both used it, and now that both Nvidia/AMD offer it across their product stacks, I'd imagine more developers are going to utilize it for various shaders - presumably these shaders will just fall back to FP32 on Pascal and thus run slower. It's just another example of something that Nvidia/Rockstar/whoever simply can't optimize for.
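    Here's a toy model of why the dual-issue matters. It uses Nvidia's own figure of roughly 36 integer instructions per 100 floating-point instructions in typical shaders; real GPU scheduling, latency hiding, and occupancy are all ignored, so treat this as an intuition pump, not a prediction:

```python
# Toy model: instruction streams from a "typical" shader per Nvidia's figure.
fp_ops = 100
int_ops = 36  # ~36 INT per 100 FP instructions (Nvidia's Turing claim)

# Pascal-style single pipe: INT issues block FP issues, so they serialize.
pascal_cycles = fp_ops + int_ops

# Turing-style dual issue: INT and FP run concurrently; the longer stream dominates.
turing_cycles = max(fp_ops, int_ops)

print(f"per-core speedup: {pascal_cycles / turing_cycles:.2f}x")  # -> 1.36x
```

    Under these (very generous) assumptions, the separate INT pipe alone buys ~36% per-core throughput on integer-heavy shaders, which is the kind of gap no driver update for Pascal can close.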

    Keep in mind that these changes - separate integer pipelines/FP16/etc. - are all more likely to be used in modern titles than previous ones, now that Nvidia has an entire lineup of cards supporting them. These advantages wouldn't show up in a 2080 launch summary of older games, but they would probably start to show up in games released nearly a year after Turing's launch.

    So even if you ignore the fact that this game isn't outside the range of expected performance based on 2080 launch benchmarks, you still have to consider that there are fundamental changes in the architecture that should push Turing further ahead in modern titles. You should expect Turing to outperform Pascal, with no ability for developers or Nvidia to optimize around it.
     
    Last edited: Nov 9, 2019
  18. haste

    haste Maha Guru

    Messages:
    1,252
    Likes Received:
    389
    GPU:
    GTX 1080 @ 2.1GHz
    If you had ever worked as a low-level programmer on a multi-platform engine, you would know that some render paths, extensions, or even shader code affect architectures differently. These differences are either bugs in drivers or a lack of optimization from developers. Averaging helps picture the real performance. And there are usually multiple ways to implement the same effect on different architectures.

    And in the case of RDR2, even Maxwell handles it better compared to its average performance numbers (970/980Ti).

    IMHO my best guess is that R* was optimizing for Turing right before release and crippled Pascal in the process, because they share quite a lot.

    EDIT: I should probably respond to this as well:
    Yes, separate INT is one of the bigger changes in Turing, which is, unfortunately, largely mitigated by the lower core count and increased overhead. NVIDIA made this change to accelerate compute and deep learning, not shaders in general; it's the same reason they added tensor cores. In reality, you won't see many (if any) shaders making wide use of int variables - it's not even common to see correctly used halfs; the overwhelming majority of shaders use just floats. Most of Turing's increased speed in games comes from the larger on-chip cache and faster memory, which results (mostly) in faster texture sampling, rasterization, etc... but again, not much faster for the 2080 vs the 1080Ti, due to the 2080's 256-bit memory bus against the 1080Ti's 352-bit.

    And TBH, I really have no interest in spending hours explaining why I believe it's not an architectural difference causing this.
     
    Last edited: Nov 9, 2019
    HandR, Lucifer and jura11 like this.
  19. Astyanax

    Astyanax Ancient Guru

    Messages:
    10,571
    Likes Received:
    3,865
    GPU:
    GTX 1080ti
    Reducing the parallel cores doesn't necessarily "mitigate" the separation of INT and FP; there are also cache design differences you need to factor in that offset the lower core count.

    You can't explain it because it's BS.

    Vega and RDNA both demonstrate explosive increases in performance when their architectures are taken advantage of.
     
  20. Irenicus

    Irenicus Master Guru

    Messages:
    582
    Likes Received:
    100
    GPU:
    1070Ti OC
    As others keep trying to drill into your thick skull, there is no gimping. Take off your tin foil hat.
     