Review: Red Dead Redemption 2: PC graphics benchmark analysis

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 5, 2019.

  1. mackintosh

    mackintosh Maha Guru

    Messages:
    1,162
    Likes Received:
    1,066
    GPU:
    .
    Wouldn't be the first time Nvidia pulled something like this; they have quite the track record, after all. I don't trust them one bit.
     
    Dragam1337 likes this.
  2. thepath64

    thepath64 Guest

    Messages:
    6
    Likes Received:
    1
    GPU:
    RTX 2060
    Do you even know what settings the Xbox One X is using?? It could be running low to medium, which gives almost double the fps compared to higher settings.

    The Division 2 also runs at native 4K on the Xbox One X, but according to Digital Foundry, its terrain quality runs lower than the lowest setting on PC.

    So it's no surprise that the game runs at native 4K, and only at 30 fps, when some settings are even lower than low.
     
    jbscotchman likes this.
  3. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    They've never pulled anything like this. Stop spreading misinformation.
     
    fantaskarsef, Irenicus and airbud7 like this.
  4. AndroidVageta

    AndroidVageta Guest

    Messages:
    6
    Likes Received:
    7
    GPU:
    MSI 7970 3GB
    The 10 series seems to just be down all around here, even vs. AMD's offerings... what's this game doing differently from every other game out there?
     
    Dragam1337 likes this.

  5. SpajdrEX

    SpajdrEX Ancient Guru

    Messages:
    3,399
    Likes Received:
    1,653
    GPU:
    Gainward RTX 4070
    Undying, airbud7 and Strange Times like this.
  6. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Efficiency, perhaps. I don't know the improvements in Turing compared to Pascal, but if it relates to improvements for D3D12 or Vulkan, that might be part of it.
    Comparably, the Vega GPUs and the Radeon VII from AMD should be stronger than Navi, but they aren't due to shortcomings and limitations, though it's a bit different with Navi starting a new architecture after GCN.

    Add driver optimizations and improvements that might allow newer hardware to scale even better, and that might explain it, although I'm certainly no expert on the specifics.

    EDIT: Ah, but I see the VII is actually holding its own here, edging out just ahead of Navi. Not bad, though the gap between the two is pretty close.
     
  7. Kool64

    Kool64 Ancient Guru

    Messages:
    1,656
    Likes Received:
    783
    GPU:
    Gigabyte 4070
    Hats off to team red for this one. By next gen I'll probably have a good card to properly replace my 1070.
     
  8. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    Utilizes compute more? FP16 support in various shaders? Better use of shader intrinsics on AMD? Optimized to fill GCN's code path properly and avoid stalls (something that has been done in previous games; see "Strange Brigade" for example)? A hundred other reasons, etc.
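
    To make the FP16 point concrete, here's a conceptual sketch (my own illustration, not anything confirmed about RDR2's shaders): GPUs with double-rate FP16 - Vega's Rapid Packed Math, Turing - pack two 16-bit floats into one 32-bit register and process both lanes with a single instruction, while consumer Pascal has no fast FP16 path, so FP16-heavy shaders gain nothing there. Emulated on the CPU in C, with simplified conversions that handle normal, in-range numbers only:

        #include <stdint.h>
        #include <stdio.h>
        #include <string.h>

        /* Simplified IEEE-754 conversions: normal, in-range values only
         * (no NaN/Inf/subnormals, mantissa truncated, not rounded). */
        static uint16_t float_to_half(float f) {
            uint32_t b;
            memcpy(&b, &f, sizeof b);
            uint32_t sign = (b >> 16) & 0x8000u;
            uint32_t e    = (uint32_t)((int32_t)((b >> 23) & 0xFFu) - 127 + 15); /* rebias */
            uint32_t mant = (b >> 13) & 0x3FFu;
            return (uint16_t)(sign | (e << 10) | mant);
        }

        static float half_to_float(uint16_t h) {
            uint32_t sign = ((uint32_t)h & 0x8000u) << 16;
            uint32_t e    = (uint32_t)((h >> 10) & 0x1Fu) - 15 + 127;            /* rebias */
            uint32_t mant = ((uint32_t)h & 0x3FFu) << 13;
            uint32_t b    = sign | (e << 23) | mant;
            float f;
            memcpy(&f, &b, sizeof b);
            return f;
        }

        /* One packed FP16 add: both 16-bit lanes of a 32-bit word at once,
         * mirroring what a single v_pk_add_f16-style GPU instruction does. */
        static uint32_t packed_fp16_add(uint32_t a, uint32_t b) {
            uint16_t lo = float_to_half(half_to_float((uint16_t)(a & 0xFFFFu)) +
                                        half_to_float((uint16_t)(b & 0xFFFFu)));
            uint16_t hi = float_to_half(half_to_float((uint16_t)(a >> 16)) +
                                        half_to_float((uint16_t)(b >> 16)));
            return (uint32_t)lo | ((uint32_t)hi << 16);
        }

        int main(void) {
            /* Pack (1.5, 2.0) and (0.5, 3.0) into one 32-bit word each. */
            uint32_t a = (uint32_t)float_to_half(1.5f) | ((uint32_t)float_to_half(2.0f) << 16);
            uint32_t b = (uint32_t)float_to_half(0.5f) | ((uint32_t)float_to_half(3.0f) << 16);
            uint32_t r = packed_fp16_add(a, b);
            printf("lo = %g, hi = %g\n",                    /* prints: lo = 2, hi = 5 */
                   half_to_float((uint16_t)(r & 0xFFFFu)),
                   half_to_float((uint16_t)(r >> 16)));
            return 0;
        }

    Hardware with packed FP16 retires two such operations per ALU cycle; hardware without it runs FP16 at FP32 rate (or slower), which is one plausible way the same shader workload can split Turing and Pascal apart.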

    The argument that Nvidia is gimping its own cards in this specific game, but in no other game that's come out recently, just doesn't make any sense. There is no proof of it. There's been no proof in the past that they've done anything like that. I'm tired of reading it.
     
  9. AndroidVageta

    AndroidVageta Guest

    Messages:
    6
    Likes Received:
    7
    GPU:
    MSI 7970 3GB
    This seems a bit suspicious as well. I don't think another example of this is conclusive of anything, though... at 1080p the 1080 Ti has a slight advantage, while the 2080 is considerably higher.

    Weird, really. I know that pure architectural changes can certainly bring their own new way of doing things, but this game, along with CoD, really is an outlier against everything else, while the games themselves don't appear to be doing anything differently.
     
    SpajdrEX likes this.
  10. HybOj

    HybOj Master Guru

    Messages:
    394
    Likes Received:
    325
    GPU:
    Gigabyte RTX 3080
    Requirements look extreme, but we are talking ultra here.

    To the folks comparing console performance to PC... have you seen the comparison screens of console vs PC? On the majority of the screens, the difference reminded me of Crysis on low vs ultra. So that's why.

    And yes, the elephant in the room: the RX 5700 XT. My next card. Enjoy your $1 lower electricity bills... or run some RTX demo :)

    Honestly... what Nvidia did with the 10xx series here is evident. I will much enjoy the exact opposite with AMD and my card getting better with time. This is a story that has repeated itself for a decade, and I will not ignore that fact when I decide on the upgrade. The 5800 XT will end up beating the 2080 or better in a while.
     
    Last edited: Nov 5, 2019

  11. Strange Times

    Strange Times Master Guru

    Messages:
    372
    Likes Received:
    110
    GPU:
    RX 6600 XT
    Pascal gang, where are you lmao
     
  12. HybOj

    HybOj Master Guru

    Messages:
    394
    Likes Received:
    325
    GPU:
    Gigabyte RTX 3080
    There is no proof, but who cares about the technical reasons? It's the results the end user should be concerned about. The 10xx series is DEAD in this game. And surely this will not be the only game.

    It's because Nvidia is like a car highly optimized for ONE specific road. Change the road and you are done.

    AMD is a bulldozer with crazy raw power; that's why it shines in many applications besides games. Its raw power is huge. The 5700 XT is more powerful than the 2080 Ti in many specific cases.

    So here the road has changed, and the 10xx gen is over, RIP. Who cares why... definitely not the 5700 (XT) users...
     
  13. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,855
    Likes Received:
    442
    GPU:
    RTX 3080
    Hopefully Nvidia can provide some driver optimisations for their Pascal 10 series cards, or maybe, like Denial said earlier, this game is just using parts of the Turing cards that excel in comparison to Pascal. It will be interesting to see how this pans out and whether there will be any further Nvidia Pascal driver optimisations. I would bet it's not the drivers, though.
     
  14. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
    Nvidia ain't gonna waste a single breath on Pascal - they want you to buy Turing. Anyone thinking this is not intentional is delusional about how corporations operate...
     
  15. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    Ah, here I was thinking I was on Guru3D and not Gamespot.

    How do you explain NULL, Image Sharpening, Image Upscaling and DXR then? Seems weird that they brought all these things back to Pascal while apparently "not wasting a single breath" on it.

    Lol, this makes it sound like updating to anything newer than 39x.xx slows your 1080 Ti down to a 2060 in all games. So if Hilbert rolled back the 1080 Ti driver for this test, it would suddenly perform better in RDR2?
     
    Robbo9999, fantaskarsef and Undying like this.

  16. Berke53

    Berke53 Active Member

    Messages:
    65
    Likes Received:
    13
    GPU:
    2x ASUS GTX 1080 Strix OC
    GTX 1080 SLI owner here. Apparently this game supports multi-GPU using the Vulkan API out of the box. Both cards are pegged at 99% and performance is increased by an average of 60%. However... in some areas there seems to be some flickering, and there are performance issues with an mGPU config. Also, the Afterburner OSD is flickering constantly.

    It's really interesting to see this, as this is the first Vulkan game utilizing both cards. Apparently Rockstar wanted to support mGPU rendering in this title, but they only came halfway. I really hope they keep coding improvements for this, as it is not a luxury to have more than one GPU for this incredibly GPU-hungry game.
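
    For anyone curious about the mechanism: Vulkan 1.1's explicit multi-GPU path ("device groups") is presumably what's in play here - that's my assumption, since Rockstar hasn't documented it. When the driver links two cards (e.g. with SLI enabled), they show up as one device group, and an engine can create a single logical device over both and split work per frame. A minimal C sketch that just enumerates the groups:

        /* Sketch: enumerate Vulkan 1.1 device groups. Not taken from RDR2;
         * it only shows the API surface an explicit-mGPU engine starts from. */
        #include <stdio.h>
        #include <vulkan/vulkan.h>

        #define MAX_GROUPS 8

        int main(void) {
            VkApplicationInfo app = { VK_STRUCTURE_TYPE_APPLICATION_INFO };
            app.apiVersion = VK_API_VERSION_1_1;  /* device groups are core in 1.1 */

            VkInstanceCreateInfo ici = { VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
            ici.pApplicationInfo = &app;

            VkInstance instance;
            if (vkCreateInstance(&ici, NULL, &instance) != VK_SUCCESS) {
                fprintf(stderr, "vkCreateInstance failed\n");
                return 1;
            }

            uint32_t count = 0;
            vkEnumeratePhysicalDeviceGroups(instance, &count, NULL);
            if (count > MAX_GROUPS)
                count = MAX_GROUPS;

            VkPhysicalDeviceGroupProperties groups[MAX_GROUPS];
            for (uint32_t i = 0; i < count; ++i) {
                groups[i].sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
                groups[i].pNext = NULL;
            }
            vkEnumeratePhysicalDeviceGroups(instance, &count, groups);

            /* Two GPUs in one group is what lets an engine create one logical
             * device spanning both cards, without a vendor AFR profile. */
            for (uint32_t i = 0; i < count; ++i)
                printf("group %u: %u physical device(s)\n",
                       i, groups[i].physicalDeviceCount);

            vkDestroyInstance(instance, NULL);
            return 0;
        }

    If the two 1080s show up as one group with physicalDeviceCount == 2, the game can drive both itself, which would fit the 99% utilization on both cards.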
     
  17. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    @Hilbert Hagedoorn - Can you confirm this?
     
    Robbo9999 likes this.
  18. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
    They just applied those things universally to the Nvidia driver - not specifically for Pascal. And in the case of DXR, they only made it available to Pascal in order to make people go "oh, my performance with it is crap, I gotta upgrade!"...

    But you are suggesting that they were do-gooders by not doing their usual artificial segmentation of making it available only to the latest-gen cards, even though there has never been any reason for them to do that in the past, aside from trying to force people to upgrade...?
     
  19. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    For the image upscaling, they specifically said they use a different system on Turing that yields better image quality - so right there they had to do something different for Pascal. It still requires testing that it works on Pascal. If they simply wanted to get people to buy Turing, wouldn't they just not bring any of these features back? Wouldn't that make Turing more alluring?
     
    Last edited: Nov 5, 2019
  20. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,495
    Likes Received:
    124
    GPU:
    GTX Titan Sli
    And Game Ready drivers aren't just about performance optimizations. They can also be about fixing various rendering bugs.
    And that's also why some games sometimes show better performance on older drivers: not because of gimping, but because something might be rendering incorrectly or not at all.

    It does happen that older GPUs see regressions in newer drivers, but saying Nvidia is paying its engineers to deliberately sit and write code to bring down the performance of its older series is just insane.
     
