The Division performance and impressions.

Discussion in 'Videocards - AMD Radeon Drivers Section' started by oGow89, Jan 29, 2016.

  1. billydydcoen

    billydydcoen Guest

    Messages:
    70
    Likes Received:
    0
    GPU:
    RTX 2060 OC TUF
    Can I play it at medium settings with good fps on the rig in my profile?


    edit: @1080p
     
    Last edited: Mar 9, 2016
  2. fr33jack

    fr33jack Guest

    Messages:
    1,153
    Likes Received:
    4
    GPU:
    1050Ti @1.9/ 9.0GHz
    @billydydcoen, I'd say a minimum/medium mix with some heavy effects turned off completely. You'll be getting 30-40 fps on average, with occasional dips to 20 or so; frequent in the Dark Zone PvP area, I'd say.
     
    Last edited: Mar 9, 2016
  3. OnnA

    OnnA Ancient Guru

    Messages:
    17,958
    Likes Received:
    6,813
    GPU:
    TiTan RTX Ampere UV
    Total War 2 and Attila :)

    Also, I played the Kingdom Come: Deliverance beta, and my R9 280X got 28 FPS lows while a GTX 980 in the same location got 30 FPS :3eyes:
    That was a shock for me. No comment :nerd:

    Overall the game is based on CryEngine and is not optimised at all IMO; to play it you need at least 16GB of DDR3!!
    I also needed to OC my CPU from 3.8GHz to 4.0GHz for smoother play :)
    But when you see that forest - glorious visuals

    Kingdom Come Deliverance v0.5 MSI R9 390
     
    Last edited: Mar 9, 2016
  4. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Yeah, the beta for Kingdom Come only has some rudimentary optimizations in this first public release build. I'm kinda glad they had to delay the expected summer release (to autumn, I believe it was), as I don't see how the game could have been ready in time in its current state unless the in-house build is way ahead.

    Looks great though, with all those advanced features like the "SVOGI" global illumination, but it's very demanding too, even when you scale back the settings.


    EDIT: As for The Division, once the servers stopped dying and the lag spikes disappeared, it runs pretty nicely, despite AMD's probably rushed initial optimization for the game and the NV co-operation via Ubisoft. According to that GeForce.com guide, this is another of those games that try to balance VRAM usage (75% of the GPU total) and dynamically stream and load content to keep things balanced. That means it takes a while in more demanding areas before content like textures and LOD has fully streamed in, but for how it looks (despite being a "bit" scaled back since the reveal, like all games kinda are these days), I'd say it runs acceptably, and there are quite a few options to tinker with too, although it's not quite perfect.
    (An FOV slider would have been a nice addition, and a quick key for toggling the HUD elements off for screenshots would have been appreciated too, but for me at least those aren't critical; it's more important that they stabilize the servers.)
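    Something like this is how I'd picture that 75% budget working, though to be clear this is just a made-up sketch on my part (hypothetical names and sizes, not Snowdrop's actual code):

    ```cpp
    // Hypothetical sketch of a fixed-fraction VRAM streaming budget like the
    // one the GeForce.com guide describes (75% of total VRAM). Illustrative
    // only; names and numbers are invented, not the engine's real code.
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    struct Asset {
        uint64_t bytes;           // size of the texture/LOD data
        bool     resident = false;
    };

    int main() {
        const uint64_t totalVram = 4ull * 1024 * 1024 * 1024; // e.g. a 4GB card
        const uint64_t budget    = totalVram * 3 / 4;         // the 75% cap
        uint64_t used = 0;

        // Assets assumed pre-sorted by priority (nearest / highest LOD first).
        std::vector<Asset> assets = {
            {1ull << 30}, {1ull << 29}, {1ull << 30}, {1ull << 29}, {1ull << 30},
        };

        // Stream in until the cap is hit; everything past it waits in system
        // RAM, which is why textures/LOD take a while to appear in busy areas.
        for (auto& a : assets) {
            if (used + a.bytes <= budget) {
                a.resident = true;
                used += a.bytes;
            }
        }
        std::printf("resident: %llu of %llu budget bytes\n",
                    (unsigned long long)used, (unsigned long long)budget);
        return 0;
    }
    ```

    The leftover quarter presumably covers the framebuffer, render targets and whatever else spikes, so the streamer never fights the renderer for memory.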
     
    Last edited: Mar 9, 2016

  5. Plug2k

    Plug2k Ancient Guru

    Messages:
    1,561
    Likes Received:
    34
    GPU:
    3090
    Anyone have any idea how to access your season pass gear and stuff within the game? I can't locate my hazmat gear or anything else. Do you get it from a vendor? If so, which vendor? There are loads, lol.
     
  6. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    The game might not present constant situations, but the results from most websites concur that the data is more or less as presented. Even if you have variations, if multiple tests more or less show a certain level of performance, then that's what most people will be getting. I actually find the Guru3D benches very interesting.

    [Guru3D The Division benchmark chart]

    Unless it was a benchmark glitch, I have noticed three very interesting things. First of all, the GTX 770. Before anyone brings up its 2GB of VRAM, please look at the GTX 950. The 770 is 10% slower than the 7870. WTF. There is something wrong there. The 780Ti numbers also look off to me. It can't be at the 380X/280X level unless Kepler has been seriously abandoned by NVIDIA, or it really was THAT bad as a future-proof architecture. The last weirdness is the 980Ti/Titan. Despite NVIDIA's marketing, I'm starting to wonder if Maxwell is actually memory-bandwidth starved, as the 980Ti/Titan are the only good-performing Maxwell parts here, seemingly on the back of their 384-bit bus.

    That was indeed the smartest thing they did. Now every major engine has to squeeze performance out of GCN to be competitive. It looks like the first results of that have been arriving over the last year, I would say, and things will get even better as time passes. Having a single driver architecture target (more or less) also helps with optimization. All indicators also show that the next generation of consoles will be AMD too, and I can't see them straying from x86/GCN now that they are there. Iterative steps are always much more cost-efficient, and the hardware itself seems fine.

    I believe that Mantle was developed using things they had to do for the console APIs, which are very similar to Mantle/DX12/Vulkan: they are low-level x86/GCN APIs. No matter the differences between the Xbox One and PS4, the target hardware is not that different down below. They took that expertise, there was a huge existing problem on the PC with the cruft around DX11, and they moved things forward. Of course their hardware has a head start. They even semi-admitted in their Reddit AMA that DX12 is heavily influenced by Mantle, which shouldn't be a surprise to anyone with a calendar and a brain.

    I would only expect a return on that stock if someone buys them out. I can't believe that they "cost" something like one and a half billion at market prices. That says so much about the stock market, I guess. Microsoft paid a billion for Minecraft, and you can get AMD, with all its engineers and patents, for one and a half. I would buy them if I could :p

    If the numbers from the websites are correct, it seems completely doable.

    The Division looks NOTHING like its reveal. I can understand why, but the fake reveals really need to stop. Or they should stamp "target render" or something on them in the engine. Up to now, only EA games using Frostbite seem to hold up. That thing about the VRAM actually sounds stupid to me. The reason memory exists is for it to be full. What does "75% full" mean? That you might get VRAM starved while 1GB of VRAM sits unused, for example? A proper algorithm would use ALL of it and swap from RAM as needed.
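    What I mean is something more like a plain LRU cache over the whole card: fill it, and only evict when the next allocation genuinely won't fit. A made-up sketch (mine, not any engine's actual code):

    ```cpp
    // Hypothetical "use ALL of it" alternative: fill VRAM to capacity and
    // evict the least-recently-used assets only when an allocation won't
    // fit. Illustrative only; not claiming any engine works this way.
    #include <cstdint>
    #include <cstdio>
    #include <list>
    #include <unordered_map>
    #include <utility>

    class VramLru {
        uint64_t capacity_;
        uint64_t used_ = 0;
        std::list<std::pair<int, uint64_t>> lru_;   // front = most recently used
        std::unordered_map<int, std::list<std::pair<int, uint64_t>>::iterator> map_;
    public:
        explicit VramLru(uint64_t cap) : capacity_(cap) {}

        // Mark an asset as used this frame, loading it if it isn't resident.
        void touch(int id, uint64_t bytes) {
            auto it = map_.find(id);
            if (it != map_.end()) {                 // already in VRAM
                lru_.splice(lru_.begin(), lru_, it->second);
                return;
            }
            while (used_ + bytes > capacity_ && !lru_.empty()) {
                used_ -= lru_.back().second;        // swap coldest back to RAM
                map_.erase(lru_.back().first);
                lru_.pop_back();
            }
            lru_.emplace_front(id, bytes);
            map_[id] = lru_.begin();
            used_ += bytes;
        }

        uint64_t used() const { return used_; }
    };

    int main() {
        VramLru cache(3ull << 30);                  // 3GB card, all of it usable
        for (int id = 0; id < 8; ++id)
            cache.touch(id, 1ull << 29);            // eight 512MB assets
        std::printf("VRAM in use: %llu bytes\n",    // stays pinned at capacity
                    (unsigned long long)cache.used());
        return 0;
    }
    ```

    Full memory isn't a problem in itself; the card only "runs out" when a single frame needs more than it physically has.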
     
  7. Undying

    Undying Ancient Guru

    Messages:
    25,477
    Likes Received:
    12,882
    GPU:
    XFX RX6800XT 16GB
    I asked HH to include a Tahiti GPU in the review; I thought he had a 280X laying around, but apparently he doesn't.

    Considering the 280X and 380X were 1-5% apart in all other games, we can probably put the 280X between the 380 and 380X; that's a couple of fps slower than the 780Ti, lol.
     
    Last edited: Mar 9, 2016
  8. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    That even depends on the situation. When tessellation and that extra gig are needed, the 380X is faster, but on the other hand the 280X has dem 300GB/sec of raw memory bandwidth. Either NVIDIA has stopped giving a **** about Kepler, or Kepler wasn't as forward-thinking as GCN. Or simply nobody optimizes for it anymore.
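    (If I have the specs right, the raw numbers work out to: 280X = 384-bit bus × 6.0 Gbps effective GDDR5 ÷ 8 bits per byte = 288 GB/s, versus 380X = 256-bit × 5.7 Gbps ÷ 8 ≈ 182 GB/s. So the old card has roughly 58% more bandwidth to brute-force with.)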
     
  9. SimBy

    SimBy Guest

    Messages:
    189
    Likes Received:
    0
    GPU:
    R9 290
    The 780Ti is one of the biggest disappointments; it's been consistently scoring low in recent games. This card was almost 700 EUR in Europe. For comparison, a custom R9 290 was 400 where I live. It went from being around 20% faster to 20% slower.
     
  10. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Don't say that more than once. It summons the guy from the NVIDIA subforum with the 780Ti, who will paste the same 2014 benchmarks that prove that his card is the fastest in the universe :D
     

  11. oGow89

    oGow89 Guest

    Messages:
    1,213
    Likes Received:
    0
    GPU:
    Gigabyte RX 5700xt
    Don't know where you live, but I got mine for 250€, new and custom, with the highest factory overclock. At that point the GTX 780 Ti cost nearly three times what mine did. As for the GTX 970, it was almost 400 euros, and the 980 was near 800 euros. Funny how things turned out. I was annoyed back then that I'd saved some money on my card; now I watch the benchmarks and giggle on the inside.
     
  12. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    I suggest GCN 1.0 products for maximum giggling of the insides :D
     
  13. zhalonia

    zhalonia Active Member

    Messages:
    99
    Likes Received:
    11
    GPU:
    r9 290 4gb
    My R9 290 - here's my story: I was using the 16.2 driver from AMD, but in The Division, after a while, I got a crash; everything froze and I had to hard-reset my PC. Upon restart I noticed my fan speed was locked at the same speed as when The Division crashed. I was able to turn it down via MSI Afterburner, but my card's automatic fan speed did not respond anymore, not in a single game. So I removed the driver and installed the newer 16.2.1; everything worked fine, but then I got the same result again! Now I'm patiently waiting for 16.3, because I'm pretty scared for my PC right now. I could still play if I turn the fan speed up manually, but what about the crash? What if I get it again? It might hurt my baby :( Help is always appreciated, of course! ;) Bye!
     
    Last edited: Mar 9, 2016
  14. SimBy

    SimBy Guest

    Messages:
    189
    Likes Received:
    0
    GPU:
    R9 290
    250€ is a steal. Unfortunately where I live the cheapest one was around 395€. And that was almost 2 years ago.
     
  15. Zweite93

    Zweite93 Guest

    Messages:
    31
    Likes Received:
    0
    GPU:
    XFX R9 290X Core Edition
    280x - 37~45
    380x - 36~43

    WUT?
     

  16. Undying

    Undying Ancient Guru

    Messages:
    25,477
    Likes Received:
    12,882
    GPU:
    XFX RX6800XT 16GB
    The 380X is a bandwidth-starved card. It isn't a surprise that it's not faster than the 280X.

    In this game or any other, they perform similarly.
     
  17. nevernamed

    nevernamed Master Guru

    Messages:
    234
    Likes Received:
    4
    GPU:
    EVGA 1080 TI FTW3
    The game seems to run better for me than the last beta. The last beta had all sorts of weird FPS drops at weird times, and frametimes were a bit of a mess. In the brief time I spent with it last night, it seemed to be running quite solid.
     
  18. zhalonia

    zhalonia Active Member

    Messages:
    99
    Likes Received:
    11
    GPU:
    r9 290 4gb
    *Update* I was able to fix my fan speed, but it's still weird that a crash can turn off automatic fan control... I also reported the bug on the Ubisoft forums so people don't burn their card(s). Win10, 16.2 & 16.2.1 Crimson drivers.
     
  19. Embra

    Embra Ancient Guru

    Messages:
    1,601
    Likes Received:
    956
    GPU:
    Red Devil 6950 XT
    Your PSU could be the problem. Can you give the model number?
     
  20. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    If those numbers from that Russian site are correct, I have no words. The Nano being as fast as 780ti SLI (with scaling), the 7870 faster than the 960, the 280x faster than the 780ti/970, the 680 50% slower than the 7970... WTF.
     
