Review: Red Dead Redemption 2: PC graphics benchmark analysis

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 5, 2019.

  1. Khronikos

    Khronikos Master Guru

    Messages:
    876
    Likes Received:
    96
    GPU:
    EVGA SC2 1080ti
    What lol? The difference between 8x and 16x is easily seen just about anywhere there is a clear path on the ground. You are tripping. No way in hell I am putting that down.

As for the 1080 Ti, there is no way in hell it has to run this badly. Something is messed up with their game or drivers. Probably to sell cards. It shouldn't be that bad.
     
    Dragam1337 likes this.
  2. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,817
    Likes Received:
    218
    GPU:
    EVGA GTX 1080Ti SC
So this is all you could say in reply to evidence against your claim. Great effort to show you're a douchebag.
     
  3. pharma

    pharma Maha Guru

    Messages:
    1,308
    Likes Received:
    221
    GPU:
    Asus Strix GTX 1080
Probably no different from how Async Compute improves fps/smoothness on Navi.
     
  4. haste

    haste Maha Guru

    Messages:
    1,003
    Likes Received:
    276
    GPU:
    GTX 1080 @ 2.1GHz
I guess you should, because you can't even get your facts straight. The difference in most benchmarks between the RTX 2080 and 1080 Ti was up to 5%.

    DX12:
    https://www.guru3d.com/articles_pages/geforce_rtx_2080_founders_review,20.html
    DX12:
    https://www.guru3d.com/articles_pages/geforce_rtx_2080_founders_review,17.html
    or GTA5 - the most relevant:
    https://www.guru3d.com/articles_pages/geforce_rtx_2080_founders_review,30.html

^The same performance, or the 1080 Ti even comes out faster.

Now, the difference in RDR2 between the 2080 and 1080 Ti is 44% at 2560x1440 and 30% at 4K.

So stop being in denial and accept the fact that there is either something wrong with this review or Pascal is totally gimped in RDR2.
     
    Lucifer likes this.

  5. Digilator

    Digilator Master Guru

    Messages:
    377
    Likes Received:
    94
    GPU:
    Sapphire 5700XT
Those with AMD GPUs may want to try this tweak. It's supposed to fix stutter (frame times) when used with Vulkan.

     
  6. Denial

    Denial Ancient Guru

    Messages:
    12,837
    Likes Received:
    2,119
    GPU:
    EVGA 1080Ti
Actually it's about 8% on average over the 23 games that techpowerup tested. I don't know why that's relevant, though, when the difference here falls within the range we saw at launch. I'm not sure what part you're having trouble comprehending, but at the 2080's launch we saw at least 3 titles with a ~30%+ performance difference between the 1080Ti and 2080 (Wolfenstein, coincidentally also a Vulkan game, was up at 35%) and multiple others in the 10-20% range. So why is yet another title performing at that level strange? How do you explain the difference in all those other titles? Did Nvidia pre-gimp Pascal in those 7 titles? Is something wrong with the reviews of those 7 titles?

More realistically, it's probably what Astyanax and I wrote in the first several pages - Turing has significant architectural enhancements compared to Pascal, some of which make it far more similar to GCN/RDNA. Could it be that the developers leaned into these changes? Could it be that the 2080's drivers were immature at launch - which would explain why the 1080Ti sometimes performed faster, yet other times the 2080 was 35% faster?

Anyway, honestly, yeah, this review is kind of an outlier compared to others. As I also stated in the actual post you quoted (so I'm not sure how you didn't read it), Hardware Unboxed, Gamers Nexus, and Techpowerup all reviewed this game in various other locations, and the results between the 1080Ti and 2080 were far closer (GN didn't test a 2080; Techpowerup saw only 22% at QHD and 18% at 1080p, Hardware Unboxed 14% at UHD and 13% at 1080p). So even in this review, where the 1080Ti and 2080 were measured furthest apart, the game is still basically performing within the range we saw when the 2080 launched - and if you average the reviews from all 4 publications, it's actually well within the expected difference.

Also, to be clear: it's entirely possible newer drivers will help Pascal close the gap, but the argument that this result is evidence of Nvidia gimping Pascal doesn't hold water. Not only is the performance difference within the range of what we saw from the 2080 at launch, but even if it weren't, there are a dozen and a half reasons why that could be, none of which involve Nvidia intentionally downgrading Pascal's performance.
     
    Last edited: Nov 9, 2019
  7. haste

    haste Maha Guru

    Messages:
    1,003
    Likes Received:
    276
    GPU:
    GTX 1080 @ 2.1GHz
I really don't understand your post. If a summary of multiple reviews (which is actually less relevant than the GTA5 result) shows an 8% difference, that only proves my case that Pascal is gimped, and even supports my statement of 10-20%.

And more importantly, where have I ever said that Pascal is gimped by NVIDIA? You should read my post again. If we had this difference under DX11, I'd blame NVIDIA. In the case of Vulkan/DX12, I blame R*... as I wrote in my original post. There is no magical Vulkan extension or DX12 FL12.1+ functionality making the 2080 20% faster.
     
  8. alanm

    alanm Ancient Guru

    Messages:
    9,382
    Likes Received:
    1,618
    GPU:
    Asus 2080 Dual OC
Irrespective of percentages, arch changes, etc., I think Nvidia realizes it would be a huge disservice to themselves to be viewed as gimpers of last-gen hardware. Upset owners might be more likely to switch to AMD for future-proofing than buy something that may not hold up with each passing generation.
     
  9. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,446
    Likes Received:
    110
    GPU:
    GTX Titan Sli
Why should GTA V be the most relevant? The PC version was released four years ago; it's an old game at this point. The engine has had years of further development.

There are multiple differences that could account for Turing's sudden performance advantage, which have already been mentioned. RDR2 might be hitting architectural bottlenecks on Pascal but not on Turing, and Turing might have gotten more refined drivers. Turing is only a little over a year old right now.

The big driver improvements tend to come while the architecture is still considered "new"; later on, a game-specific driver that delivers even a 10% gain is typically considered good.


Go back to when the GeForce 8 series was released and it put the 7 series to shame in a lot of titles. Go back to when Fermi was released and AMD users shat themselves seeing that Fermi was much more efficient at DX11.
Go back to GCN's release and see how the much more efficient architecture let even weak cards like the HD 7790 come dangerously close to the former flagship HD 6970 in DX11 titles.
     
  10. Denial

    Denial Ancient Guru

    Messages:
    12,837
    Likes Received:
    2,119
    GPU:
    EVGA 1080Ti
    Averaging the cards doesn't give the full picture - you need to look at what range the games are falling in:

I'm going to try to illustrate it with an extreme example: let's say a 3080 comes out and has the following performance relative to the 2080 (the numbers are made up):


            RTX 3080 | RTX 2080
    Game 1:   75 FPS |   65 FPS
    Game 2:   85 FPS |   55 FPS
    Game 3:   55 FPS |   85 FPS
    Game 4:   65 FPS |   75 FPS
    Game 5:   52 FPS |   50 FPS

Over these five games the averages work out to 66.4 vs 66.0 FPS - the 3080 is only ~0.4 FPS (~0.6%) faster on average, because aside from game 5 the other games cancel each other out - yet in game 2 the 3080 is 54% faster. So let's say a game 6 comes out and it's

    Game 6:   80 FPS |   60 FPS

Now the 3080 is ~3.7 FPS (~6%) faster than the 2080 on average across the 6 games, but this sixth game isn't even outside the existing range - it's only 33% faster, a smaller lead than the 54% of game 2, yet the average barely moved. That's no different from what's happening here - there are 7 examples I found (8 if you include this game, and 9 if you include Middle Earth: Shadow of War, which I just happened to come across) where the 2080 is 10%+ ahead of the 1080Ti. In some of those games it's 20%+, and in extreme cases, like Wolfenstein, it's 35% - all measured at the 2080's launch. Now you have RDR2 coming in at ~20% if you average all the released reviews. Compared to that 8% from techpowerup it looks bad, but compared to the 9 other games there are far worse offenders.
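To make that arithmetic concrete, here's a minimal sketch (host-only code, compiles with nvcc or any C++ compiler, and it uses the made-up numbers from the table above, not real benchmarks) showing how the per-game ratios and the overall average tell very different stories:

```cuda
#include <cstdio>

// The made-up FPS numbers from the table above -- not real benchmark data.
int main()
{
    const float rtx3080[] = {75, 85, 55, 65, 52};
    const float rtx2080[] = {65, 55, 85, 75, 50};
    float sum3080 = 0.0f, sum2080 = 0.0f;

    for (int i = 0; i < 5; ++i) {
        sum3080 += rtx3080[i];
        sum2080 += rtx2080[i];
        // Per-game gap: +15%, +55%, -35%, -13%, +4% -- a huge spread.
        printf("game %d: %+.0f%%\n", i + 1,
               (rtx3080[i] / rtx2080[i] - 1.0f) * 100.0f);
    }
    // Averages: 66.4 vs 66.0 FPS -- a ~0.6% gap that hides that spread.
    printf("average: %.1f vs %.1f FPS\n", sum3080 / 5, sum2080 / 5);
    return 0;
}
```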

    So the question becomes, why are there examples of games 35% faster on Turing vs Pascal?

Some people here are saying "Nvidia artificially gimped Pascal" - and you're correct, you didn't state that Nvidia did this, and I apologize for that. Other people are saying "Nvidia isn't updating Pascal to the same degree", and others are saying "there are differences between Pascal/Turing that could lead to Turing coming away faster". I don't buy the first two - because, like I said, at launch there were numerous examples of games significantly faster on Turing than on Pascal, and in some cases vice versa. Someone arguing the first two points would also have to explain the difference in those 7 - now 8 - older games at Turing's launch... and if they did have some other explanation, why isn't that explanation good enough for this title?

    To me the answer is the third option.

Nvidia specifically stated that Turing has roughly 50% more performance per CUDA core than Pascal. The 2080 has ~18% fewer CUDA cores than the 1080Ti (2944 vs 3584) but is clocked slightly higher - even ignoring the clocks, Nvidia's estimate works out to 0.82 × 1.5 ≈ 1.23, i.e. the 2080 should on average be ~23% faster than the 1080Ti - making up the CUDA core deficit and then some.

That 50% more performance per core comes from a number of things, but one in particular is Turing's ability to dual-issue INT and FP instructions:

    https://devblogs.nvidia.com/wp-content/uploads/2018/09/image5.jpg

They state that modern shader workloads issue a significant number of integer instructions alongside floating-point ones - which is one of the reasons Nvidia split the pipelines up in the first place. You don't need a magical Vulkan extension to lean into this performance: if your game makes heavy use of integer compute, it's just going to be naturally faster on Turing/GCN/RDNA than on Pascal. Outside of maintaining a completely separate shader path - two different instances of multiple shaders/effects/(arguably the entire game) - there is no way for Rockstar or Nvidia to simply "optimize" this for Pascal. The game just naturally uses integer compute, so it's going to be faster on Turing - that's really it. Further, Turing also supports fast FP16 - I have no idea if RDR2 uses it, but Far Cry 5 and Wolfenstein both did, and now that both Nvidia and AMD support it across their product stacks, I'd imagine more developers will use it for various shaders - presumably those shaders just fall back to FP32 on Pascal and run slower. It's another example of something Nvidia/Rockstar/whoever simply can't optimize around.
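To illustrate the kind of code pattern this is about, here's a minimal CUDA sketch (a toy kernel of my own, not anything from RDR2) mixing integer address math with floating-point shading work:

```cuda
// Toy kernel: the integer index math and the FP math below are exactly the
// kind of mix Turing can issue concurrently on its separate INT and FP
// pipes, while Pascal pushes both through the same ALUs.
__global__ void shade(const float* in, float* out, int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;  // integer pipe
    int y = blockIdx.y * blockDim.y + threadIdx.y;  // integer pipe
    if (x >= width || y >= height)
        return;

    int idx = y * width + x;   // more integer work: address arithmetic
    float c = in[idx];         // load
    c = c * 0.5f + 0.25f;      // floating-point pipe
    out[idx] = c;
}
```

The heavier a shader leans on integer work like that indexing, the more the dual-issue design pays off - and no driver update can give Pascal a second pipe.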

Keep in mind that these features - the separate integer pipeline, fast FP16, etc. - are more likely to be used in modern titles than in older ones now that Nvidia has an entire lineup of cards supporting them. These advantages wouldn't show up in a 2080 launch summary of older games, but they would start to show up in games released nearly a year after Turing's launch.

So even if you ignore the fact that this game isn't outside the range of expected performance based on the 2080's launch benchmarks, you still have to consider that there are fundamental architectural changes that should push Turing further ahead in modern titles. You should expect Turing to outperform Pascal even when neither the developers nor Nvidia have any way to optimize the difference away.
     
    Last edited: Nov 9, 2019

  11. haste

    haste Maha Guru

    Messages:
    1,003
    Likes Received:
    276
    GPU:
    GTX 1080 @ 2.1GHz
If you had ever worked as a low-level programmer on a multi-platform engine, you would know that render paths, extensions, or even shader code affect architectures differently. These differences come down to either driver bugs or a lack of optimization from developers. Averaging helps picture the real performance, and there are usually multiple ways to implement the same effect on different architectures.

And in the case of RDR2, even Maxwell (970/980 Ti) handles it better relative to its average performance numbers.

    IMHO my best guess is that R* was optimizing for Turing right before release and crippled Pascal in the process, because they share quite a lot.

    EDIT: I should probably respond to this as well:
Yes, the separate INT pipe is one of the bigger changes in Turing, but it is, unfortunately, largely offset by the lower core count and increased overhead. NVIDIA made this change to accelerate compute and deep learning, not shaders in general - it's the same reason they added tensor cores. In reality, you won't see many (if any) shaders making wide use of int variables. It's not even common to see correctly used halfs; the overwhelming majority of shaders use plain floats. Most of Turing's increased speed in games comes from the larger on-chip caches and faster memory, which (mostly) translates into faster texture sampling, rasterization, etc. But again, the 2080 doesn't gain much over the 1080 Ti there, given its 256-bit memory bus against the 1080 Ti's 352-bit.
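For reference, "correctly used halfs" means something like this toy CUDA sketch (my own illustration, not anything from RDR2) - packing two FP16 values into a half2 so one instruction does two multiplies:

```cuda
#include <cuda_fp16.h>

// Toy example of vectorized FP16: one __hmul2 performs two half-precision
// multiplies. Turing runs FP16 at double rate; consumer Pascal runs FP16
// so slowly that falling back to FP32 is actually faster.
__global__ void scale(const half2* in, half2* out, int n, half2 k)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = __hmul2(in[i], k);  // (in.x*k.x, in.y*k.y) in one op
}
```

You rarely see shaders written this way, which is my point.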

And TBH, I really have no interest in spending hours explaining why I believe it's not an architectural difference causing this.
     
    Last edited: Nov 9, 2019
    HandR, Lucifer and jura11 like this.
  12. Astyanax

    Astyanax Ancient Guru

    Messages:
    6,975
    Likes Received:
    2,235
    GPU:
    GTX 1080ti
Reducing the number of parallel cores doesn't necessarily "mitigate" the separation of INT and FP; there are also cache design differences you need to factor in that offset the lower core count.

You can't explain it because it's BS.

    Vega and RDNA both demonstrate explosive increases in performance when their architectures are taken advantage of.
     
  13. Irenicus

    Irenicus Master Guru

    Messages:
    569
    Likes Received:
    97
    GPU:
    1070Ti OC
    As others keep trying to drill into your thick skull, there is no gimping. Take off your tin foil hat.
     
  14. Robbo9999

    Robbo9999 Maha Guru

    Messages:
    1,479
    Likes Received:
    254
    GPU:
    GTX1070 @2050Mhz
    Good post, that seems pretty clear to me.
     
  15. haste

    haste Maha Guru

    Messages:
    1,003
    Likes Received:
    276
    GPU:
    GTX 1080 @ 2.1GHz
You probably didn't read what I wrote either, which is fine - I don't blame you:
I believe I've explained it well. Stating that something is BS doesn't make it BS. But you obviously have many more years of experience developing for these cards and consoles.

Every architecture does, and that was my point about R* gimping Pascal. So you obviously didn't even bother reading it. Good for you.
     

  16. Astyanax

    Astyanax Ancient Guru

    Messages:
    6,975
    Likes Received:
    2,235
    GPU:
    GTX 1080ti
Stating that what you're saying is bullcrap makes it bullcrap.
     
  17. airbud7

    airbud7 Ancient Guru

    Messages:
    7,835
    Likes Received:
    4,732
    GPU:
    pny gtx 1060 xlr8
    1- There are two ways to write error-free programs; only the third one works.

    2- One man’s crappy software is another man’s full-time job.

    3- A good programmer is someone who always looks both ways before crossing a one-way street.

4- Software undergoes beta testing shortly before it's released. Beta is Latin for "still doesn't work."

    5- If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization.

    6- Don’t worry if it doesn’t work right. If everything did, you’d be out of a job

sorry if OT....too much coffee this morning I tell ya!.....:p
     
    HandR, alanm, Loophole35 and 4 others like this.
  18. Robbo9999

    Robbo9999 Maha Guru

    Messages:
    1,479
    Likes Received:
    254
    GPU:
    GTX1070 @2050Mhz
    + rep to you if you created those sayings rather than copy/pasting them, but funny either way!
     
    airbud7 likes this.
  19. sunnyp_343

    sunnyp_343 Master Guru

    Messages:
    505
    Likes Received:
    25
    GPU:
    Asus ROG GTX 1080
Pascal performance is sad, especially the 1080 and 1080 Ti.
     
    MonstroMart likes this.
  20. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    11,594
    Likes Received:
    3,568
    GPU:
    2080Ti @h2o
Don't leave, just use the ignore list. There are many people around here who are very vocal about expressing how little they know... just put them on ignore; they're not bringing valuable intel to the discussion anyway.
     
    pharma likes this.
