Review: The Division 2: PC graphics performance benchmark analysis

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 15, 2019.

  1. Turanis

    Turanis Ancient Guru

    Messages:
    1,695
    Likes Received:
    393
    GPU:
    Gigabyte RX500
Hey Mr. Hilbert, happy weekend! Thanks for the review. :)

(Springtime always brings some kind of depressing mood to people who work hard, but by April-May body and mind will be healthy again. So keep it up and keep walking. :)
     
    Hilbert Hagedoorn likes this.
  2. Corrupt^

    Corrupt^ Ancient Guru

    Messages:
    6,968
    Likes Received:
    301
    GPU:
    ASUS 1080GTX STRIX
    No idea why they're complaining.

The main reason I requested this be added to reviews a few months back is so we can see whether a game utilizes more than 4 cores, and that seems to work. In TD2 the difference is there, albeit smaller than in some other titles (BFV).

Guess (partially thanks to Ryzen offering affordable 6-8 core CPUs) the reign of 4 cores is slowly dying.

    Thx for adding it to your reviews btw :)
     
    Last edited: Mar 16, 2019
    Hilbert Hagedoorn likes this.
  3. Equinoxe

    Equinoxe Master Guru

    Messages:
    251
    Likes Received:
    17
    GPU:
    MSI GTX 980 TI GAMING SLI
I hate the fact that they did not include SLI.

Division 1 had fantastic SLI support. And now they did not even bother.

"Engine incompatible"

It's the same damn engine.
     
    Undying likes this.
  4. SSJBillClinton

    SSJBillClinton Master Guru

    Messages:
    743
    Likes Received:
    118
    GPU:
    2080Ti Zotac AMP!
Pretty sure a few months after launch, SLI was still crap in the first game: flickering and low usage compared to a single card.

Wouldn't surprise me if this game is a repeat of that, since it is Ubisoft after all.
     

  5. foetopsyRus

    foetopsyRus Member

    Messages:
    29
    Likes Received:
    1
    GPU:
    MSI GTX 1080 X
Because sly Huang stopped optimizing the GTX 1080 Ti properly. In older games, including DX12 titles, the GTX 1080 Ti was faster than the 2080. When you buy Nvidia cards, your money gets you a two-year subscription to driver optimization.
     
  6. Undying

    Undying Ancient Guru

    Messages:
    14,167
    Likes Received:
    3,354
    GPU:
    Aorus RX580 XTR 8GB
That's true, we see that with every generation of Nvidia cards, but to be honest, for those two years the cards perform very well.
     
  7. __hollywood|meo

    __hollywood|meo Ancient Guru

    Messages:
    2,980
    Likes Received:
    133
    GPU:
    MSI 970 @1.55ghz
to the guys discussing the vram utilization earlier, i can chime in to clear up the details y'all were theorycrafting about. owners of the game have noticed that the division doesn't have a texture quality setting at all, only the option for anisotropic filtering & a sharpening filter. the snowdrop engine manages this automatically in a pretty straightforward way. it does map the entire framebuffer, but how it functions is not really like any caching technique i've encountered before.

the game calculates the vram necessary for the lighting/shadows that need to be rendered immediately (volumetric fog refracting sunbeams? spotlight shadows being cast? ambient occlusion footprint for the current scene? etc etc) alongside the simple geometry of the world & any projected reflections... then allocates the remaining vram for textures currently being displayed. if there is any left over, it fills the framebuffer with textures for areas adjacent to the player's physical location.

if there isn't, it takes into account the object detail + streaming distance settings, which, in conjunction, control the LOD scaling. counterintuitively, object detail doesn't just control the resolution of the meshes according to the streaming distance cutoff... it can cull entire models if set low enough. i've observed small props disappearing from under 50yds away when lowering object detail. ANYWAY it calculates the aggregate vram allocation of these two settings, & the texture resolutions required per the LOD scale table... & then it lowers every texture resolution (by logical proximity to the player) until your vram isn't choked anymore. the less vram you have, the closer low-res textures will be to your character.

my 970 with 3.5gb vram has some problems @25x14 res & requires lower settings (volumetric fog & shadows) to ease framebuffer pressure. i overclock heavily & get nearly the same fps as my friend with an RX480.

TL;DR there's caching but it's weird.
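a rough python sketch of the budgeting logic described above - every name and number here is made up for illustration, this is NOT actual snowdrop code, just the "fixed costs first, textures get the remainder" idea:

```python
# Illustrative sketch of the VRAM budgeting described above.
# All function and parameter names are invented for this example.

def plan_texture_budget(total_vram_mb, lighting_mb, geometry_mb,
                        visible_textures_mb, adjacent_textures_mb):
    """Allocate must-render costs first, spend what's left on textures."""
    fixed = lighting_mb + geometry_mb        # lighting/shadows + geometry
    remaining = total_vram_mb - fixed        # budget left for textures
    if remaining >= visible_textures_mb:
        # room to spare: prefetch textures for areas adjacent to the player
        prefetch = min(adjacent_textures_mb,
                       remaining - visible_textures_mb)
        return {"visible": visible_textures_mb,
                "prefetch": prefetch,
                "downscale": 0.0}
    # not enough: lower texture resolution until the budget fits
    downscale = 1.0 - remaining / visible_textures_mb
    return {"visible": remaining,
            "prefetch": 0.0,
            "downscale": downscale}
```

so a 3.5gb card with heavy volumetric fog/shadow costs ends up with a big downscale factor, which is why lowering those settings eases the framebuffer pressure.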
     
    Exodite, airbud7 and JonasBeckman like this.
  8. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,155
    Likes Received:
    2,631
    GPU:
    AMD S. 5700XT Pulse
    Settings also exist in the config file same as the first game. :)

    ["streamer dedicated budget"] = 512,
    ["streamer memory fraction"] = 0.75,

    Though that's just part of how it works, interesting to hear more in-depth detail on the rest of this system. :)
    (The first value, I think, corresponds to the in-game option and can also be extended a bit higher, but VRAM is used for other things too, so this is just one part of it. Then there's the cache percentage, which, from the details above, might just be one part of the whole scheme for how it handles memory usage and allocation. Interesting info on that!)
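A hypothetical sketch of how those two config values might combine into a streaming pool, assuming the fraction applies to the VRAM left over after the dedicated budget. That interpretation is a guess; the real engine semantics aren't documented:

```python
# Hypothetical interpretation of the two Snowdrop config values above;
# treat this as a guess at the semantics, not confirmed engine behavior.

def streaming_pool_mb(total_vram_mb,
                      dedicated_budget_mb=512,  # "streamer dedicated budget"
                      memory_fraction=0.75):    # "streamer memory fraction"
    # dedicated slice plus a fraction of whatever VRAM remains
    remaining = total_vram_mb - dedicated_budget_mb
    return dedicated_budget_mb + memory_fraction * remaining
```

On an 8 GB card that guess would give roughly 512 + 0.75 * 7680 = 6272 MB for streaming, with the rest left for render targets and the other costs mentioned above.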
     
  9. metagamer

    metagamer Ancient Guru

    Messages:
    1,812
    Likes Received:
    689
    GPU:
    Palit GameRock 2080
True. I remember when the 2080 came out, the initial reviews had the 1080 Ti up in many games. Now you see the 2080 beat the 1080 Ti pretty much everywhere.

That's not necessarily a bad thing. The 2080 was always going to gain some performance with new drivers; there was still some optimisation to do. And I don't blame Nvidia for focusing on the new generation of cards. The 1080 Ti performs the same as ever; it doesn't get worse. What we see is the new cards improving.

I'm ok with that, I don't keep my cards longer than 18 months in general.
     
    jbscotchman likes this.
  10. BReal85

    BReal85 Master Guru

    Messages:
    433
    Likes Received:
    141
    GPU:
    Sapph RX 570 4G ITX
    "destroyed" = 2 to 4 fps in 4k and 1440P. Nice try lol.
     

  11. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,072
    Likes Received:
    1,635
    GPU:
    GTX 1080 Ti
    Hey hey hey.

    Watch it. No flamebaiting today!
     
  12. Denial

    Denial Ancient Guru

    Messages:
    13,150
    Likes Received:
    2,647
    GPU:
    EVGA RTX 3080
I don't know what reviews you were looking at. In Guru3D's 2080 review, the 2080 was faster in ~80% of games at various resolutions (sometimes significantly) - TPU's overall average has the 2080 up by 8%. That's not to say that drivers haven't/won't increase 2080 performance further - just that the 2080 was always, on average, faster than the 1080 Ti.

The person you're quoting is saying the Radeon VII is the same or faster in D2, and you're saying it's because Nvidia stops optimizing drivers for old games. D2 is a brand-new game. The Radeon VII was ~5% slower at launch (on average) than a 1080 Ti. Is it really so hard to believe that Division 2, optimized by AMD, manages to recoup that 5% loss - rather than Nvidia purposely gimping its older cards?

Why do people immediately rush to conspiracy?
     
    Last edited: Mar 18, 2019
    Embra likes this.
  13. TestDriver

    TestDriver Member Guru

    Messages:
    118
    Likes Received:
    15
    GPU:
    1080 Ti, C49HG90
Finally a game that makes me want to upgrade my very aged 3570K - I'm getting subpar performance from my GPU compared to this article.

    Looking forward to the Ryzen 3000 series, then I will finally upgrade :)
     
  14. circeseye

    circeseye Master Guru

    Messages:
    249
    Likes Received:
    4
    GPU:
    Sapphire NITRO+590
I'm on an AMD 8350 with an RX 590... 1440p, high settings, DX12, and it doesn't drop below 60fps (vsync is on).
     
    Undying likes this.
  15. TestDriver

    TestDriver Member Guru

    Messages:
    118
    Likes Received:
    15
    GPU:
    1080 Ti, C49HG90
I'm getting 75fps in the benchmark at 1080p @ ultra.

But I have seen in tests that this game loves more cores, so it could be that your 8 cores really make the difference.

This is the first game I have tested where you can really see that my CPU is not up to par; in other games the difference isn't that big.
     

  16. nizzen

    nizzen Maha Guru

    Messages:
    1,208
    Likes Received:
    306
    GPU:
    3x2080ti/5700x/1060
  17. waltc3

    waltc3 Maha Guru

    Messages:
    1,151
    Likes Received:
    357
    GPU:
    AMD 50th Ann 5700XT
    I got the game free of charge with my RX-590 purchase back in December...and as the game is multiplayer-only with no single-player campaign, I would never have bought it myself. I may see about gifting someone who enjoys these games with my copy--if I can figure out how to do it through Steam.
     
    Undying likes this.
  18. Pixrazor

    Pixrazor New Member

    Messages:
    8
    Likes Received:
    1
    GPU:
    Sapphire R9 Fury N
    @waltc3 man!
    I will gladly play it if you are giving it away ^^
     
  19. The Goose

    The Goose Ancient Guru

    Messages:
    2,556
    Likes Received:
    118
    GPU:
    MSIrtx2080 superXS
    I had no issues in either the private or public test, apart from a couple of connection drops in the private one.
     
  20. jbscotchman

    jbscotchman Ancient Guru

    Messages:
    4,868
    Likes Received:
    3,627
    GPU:
    MSI 1660 Ti Ventus
    Very gentlemanly of you. I would ask, but I really don't play MP much at all nowadays. Congrats to whoever gets it.
     
