
Radeon RX 480 for 1080p60 will be plenty for The Division 2

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 13, 2019.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    35,082
    Likes Received:
    4,270
    GPU:
    AMD | NVIDIA
    Undying and airbud7 like this.
  2. airbud7

    airbud7 Ancient Guru

    Messages:
    6,776
    Likes Received:
    3,216
    GPU:
    pny gtx 1060 xlr8
Well, that's good news for everyone... I have an RX 480 (4GB though :() in my HTPC... it's still just as fast as my 1060 6GB, and sometimes faster/more fluid.
     
    Keitosha and BlackZero like this.
  3. spajdrik

    spajdrik Ancient Guru

    Messages:
    1,698
    Likes Received:
    181
    GPU:
    Sapphire RX580 8GB
    I'm glad they fixed DX12 crashes I had during closed/open beta. And it's noticeably faster than DX11.
     
    airbud7 likes this.
  4. Mpampis

    Mpampis Member

    Messages:
    32
    Likes Received:
    24
    GPU:
    MSI RX480 8GB
I played (or tried to play) the open beta 10 days ago.
Most times I couldn't even get past the menu. The game would crash, and I did send a few of the reports to the devs.
A few hours before the end of the open beta, I lowered the settings and was able to play very smoothly. I'm on a Ryzen 7 1700 with an MSI RX480 Gaming X, playing on a 2560x1080 ultrawide FreeSync monitor.
I've had problems while gaming before, so it's probably my system that caused the crashes.
I liked what I saw.
     
    airbud7 likes this.

  5. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    10,213
    Likes Received:
    2,463
    GPU:
    1080Ti @h2o
Not bad, although I'm not too impressed with the min/optimal/max specs, and that a 480 can drive 1080p medium settings at 60fps isn't surprising either.
The 480's a good card, but what's the big deal? I don't get it, tbh.

I'm more surprised they think a 1660 Ti will run this game at 1440p/60/high details... to believe that, I'd want to wait until I see benchmarks.
     
    BlackZero and airbud7 like this.
  6. no_1_dave

    no_1_dave Master Guru

    Messages:
    216
    Likes Received:
    3
    GPU:
    1080 Ti
There is a big difference between the Radeon VII and the 2080 Ti.
If they're advising a Radeon VII, surely they should be mentioning the 1080 Ti or 2080, not the 2080 Ti...
     
  7. learners permit

    learners permit Master Guru

    Messages:
    295
    Likes Received:
    1
    GPU:
    Xfire 290X
Yes, and a big FU to multi-GPU users with 4K monitors who like more than 30 FPS! Thanks a million!
     
  8. kilyan

    kilyan Master Guru

    Messages:
    565
    Likes Received:
    4
    GPU:
    evga 1080 sc gaming
Unfortunately not completely; the white screen bug is still there, and the brightness randomly goes crazy.
     
  9. HWgeek

    HWgeek Master Guru

    Messages:
    365
    Likes Received:
    254
    GPU:
Gigabyte 6200 Turbo Force @500/600 8x1p
Look at the VRAM requirement; it looks like 8GB is the limiting factor for 4K max with the RTX 2080.
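For a rough sense of where 4K VRAM goes, here's an illustrative back-of-the-envelope in Python. The render-target formats and counts are assumptions for a generic deferred renderer, not The Division 2's actual pipeline; on top of this, most of an 8GB budget goes to textures and streaming pools.

```python
# Rough, illustrative estimate of per-frame render-target memory at 4K.
# Formats and target counts below are assumptions, not the game's real pipeline.
WIDTH, HEIGHT = 3840, 2160

def target_mb(bytes_per_pixel):
    """Size of one full-resolution render target in MiB."""
    return WIDTH * HEIGHT * bytes_per_pixel / 2**20

gbuffer = 4 * target_mb(8)   # four RGBA16F G-buffer targets
depth   = target_mb(4)       # 32-bit depth/stencil
hdr     = target_mb(8)       # RGBA16F HDR scene target
total   = gbuffer + depth + hdr

print(f"~{total:.0f} MiB of render targets at 4K")  # ~348 MiB
```

Render targets alone are only a few hundred MiB, which is why the 8GB ceiling at "4K max" is really about texture quality and streaming, not the framebuffer itself.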
     
  10. gx-x

    gx-x Maha Guru

    Messages:
    1,218
    Likes Received:
    76
    GPU:
    MSI 1060 6G Armor
Good news is, the game is so slow-paced that 30fps is plenty. That's about what I got with a 570 on high details before I uninstalled the junk.
     

  11. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    14,361
    Likes Received:
    896
    GPU:
    Nvidia Geforce GTX 960M
I've got a spare one that I've been tweaking; Polaris is actually very fun to tweak! Currently have it stable at 1400/2200.
     
    Keitosha, Undying, airbud7 and 2 others like this.
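A quick Python check of what that 1400/2200 overclock means for memory bandwidth. This assumes "2200" is the GDDR5 clock as displayed in Radeon settings (quad-pumped, so 8.8 Gbps effective) on the RX 480's 256-bit bus; both are assumptions, as the post doesn't say.

```python
# Back-of-the-envelope memory bandwidth for a 2200 MHz GDDR5 overclock
# on a 256-bit bus (RX 480/580-class Polaris). Stock is 2000 MHz = 256 GB/s.
mem_clock_mhz = 2200
effective_gbps = mem_clock_mhz * 4 / 1000   # GDDR5 transfers 4 bits/clock/pin
bus_width_bits = 256

bandwidth_gbs = effective_gbps * bus_width_bits / 8
print(f"~{bandwidth_gbs:.1f} GB/s")  # ~281.6 GB/s
```

That's about a 10% bandwidth bump over stock, which is roughly in line with the core going from ~1266 to 1400 MHz.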
  12. waltc3

    waltc3 Master Guru

    Messages:
    876
    Likes Received:
    211
    GPU:
    XFX 590 8GB XFire
Since multi-GPU support (formerly Crossfire/SLI) has been officially integrated into D3D12, as opposed to being supported only by IHV custom driver add-ons outside the API in D3D11 and earlier, why are so-called "enthusiast" gaming sites ignoring it these days? If anything, it should be getting more attention, not less, now that it's no longer the red-headed stepchild it used to be. When blockbuster games like Shadow of the Tomb Raider support it--and have even back-ported D3D12 multi-GPU support to Rise of the Tomb Raider--why is Guru3D completely ignoring it? I'm asking because I'm surprised how well Crossfire and multi-GPU support work these days--I had no idea how easy it is until I tried it with my RX 590/480 8GB setup back in December. Also, the AMD drivers still carry the old custom Crossfire profiles in every driver release. Works really well @ 3840x2160, btw (which is what I was hoping for when I bought it).

The D3D12 Shadow of the Tomb Raider in-game benchmark, for instance, gets an ~85% scaling increase in framerate @ 3840x2160 over the RX 590 by itself, with 95% of the eye candy on (from a 33fps average with the 590 alone to an ~61fps average with the 590/480 multi-GPU setup, with peaks > 100fps)--which I think is fantastic! It's an especially nice option for people who already own an RX 480 8GB or a 580 8GB. I wouldn't recommend people go out and buy two 590s at once, of course, but hundreds of thousands of people already own a 480/580 8GB purchased a year or more ago, and for them the RX 590 + 480/580 8GB multi-GPU option is a no-brainer--if performance and value for the dollar motivate them. I'm very happy with the setup, atm. It's so easy to do because you can now turn Crossfire support on or off directly in an individual game profile--no more rebooting and so on--and the requirement for matching MHz frequencies is a thing of the past (RX 590 @ 1.6GHz, RX 480 @ 1.305GHz), which I was also glad to see! It's as near transparent as it can be--certainly an order of magnitude better than when I last tried Crossfire many years ago with twin 4850s!
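A quick sanity check on those framerates in Python: going from 33 fps on the 590 alone to ~61 fps with both cards is about an 85% uplift, i.e. the second GPU contributes roughly 85% of a full card's worth of performance.

```python
# Multi-GPU scaling implied by the quoted SotTR benchmark numbers:
# RX 590 alone averages 33 fps; 590 + 480 together average ~61 fps.
single_fps = 33.0
multi_fps = 61.0

scaling_pct = (multi_fps / single_fps - 1) * 100
print(f"Scaling increase: ~{scaling_pct:.0f}%")  # ~85%
```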

It would really be nice to see this information posted about new D3D12 game releases--I'm expecting to get The Division 2 any day now as the last game due me for the RX 590 purchase (I already have DMC5 and RE2). It'd be nice to know if Div 2 is a multi-GPU title like Tomb Raider, though! Someone has already said that framerate isn't important in this game--which is fine--but I just like knowing these things, y'know? It's just good information to have.
     
    airbud7 likes this.
  13. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    12,151
    Likes Received:
    359
    GPU:
    Sapp. RX Vega 64 LC
LOL @ Win 7 only being listed for 1080p60, when Metro Exodus said Win 7 was only good for 1080p60 too, with low settings, when it runs exactly the same in DX12 as it does in DX11--and I'm running it @ 3440x1440 at Ultra on Win 7 and getting around 50-70fps :p

    EDIT:

If you go to the actual game site to look at the specs, you get Win 7 and DX11 for all. :D

     
    Last edited: Mar 13, 2019
    airbud7 likes this.
  14. gx-x

    gx-x Maha Guru

    Messages:
    1,218
    Likes Received:
    76
    GPU:
    MSI 1060 6G Armor
Do you have surface optimization ON in Radeon Settings, Rich? Or did you turn it off to get rid of the FP12 shaders and go back to FP16 as the default (FP16 is also the default for nV cards)?
     
  15. airbud7

    airbud7 Ancient Guru

    Messages:
    6,776
    Likes Received:
    3,216
    GPU:
    pny gtx 1060 xlr8
Thanks! ...good information, and I agree.
     

  16. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    12,151
    Likes Received:
    359
    GPU:
    Sapp. RX Vega 64 LC
I've got it on.
     
    gx-x likes this.
  17. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    10,213
    Likes Received:
    2,463
    GPU:
    1080Ti @h2o

mGPU has become even more of a stepchild. The thing was, as long as it was in the driver, the manufacturer could make sure it was supported properly, or at least help the devs with it. Some of the work (read: money) needed to make it work was put in by the GPU manufacturer (Nvidia's SLI implementation, for instance, as well as CFX from AMD).

But now that work has to be done by the devs. And they are under much more pressure to get things working in a much tighter timeframe than ever before. Hence... they just don't do mGPU. It's as simple as that: it's the dev's decision whether to press that kind of money into adopting mGPU, and that money goes into marketing budgets instead--we know the technical side of games is merely an annoyance when it comes to a game's revenue. The more money you put into development, the less profit you make; hence, save lots of engineering hours for virtually the same sales numbers. Financially, it's a no-brainer to ignore mGPU.

When it comes to your example with mGPU in SotTR, the issue is: why should AMD/Nvidia want you to buy two lower-end cards when they really want you to buy one high-end card, where the margin for them is higher? That doesn't make any sense for them. That's why, beginning with Pascal, I already had the impression that Nvidia was deliberately limiting SLI on lower cards (as well as SLI performance/scaling) to push people toward something like a 2080 Ti instead of 2060 SLI. Not to mention that they started by simply removing the connectors in the first place... that's a deliberate move on Nvidia's side.
On AMD's side they're trying to help keep CFX alive (and they have an advantage working only via the PCIe bus rather than a cable, but that's just my opinion), but you see the decline there too, because AMD simply doesn't have the money to hire a lot of engineers and send them out to devs to help implement mGPU in their games. Devs of games like AotS could only do this because they had enough money to work on it and make it a showcase for benchmarks (around 2016 everybody was crazy about the AotS benchmark; nobody played the game at all). Devs of benchmarks themselves have the time to do this, but they're not even interested in it!

All in all, with DX12 mGPU they took away any chance we had of AMD/Nvidia doing the work and pushed it onto a side that has even less freedom to develop such "technological infrastructure". Devs usually only program for a single GPU anyway, since consoles only have one... with the death of DX11 we will see the death of mGPU, I tell you. It's sad, but it never was what I personally wished DX12 mGPU would have been in the first place (no pooled VRAM, not addressable as one GPU for performance purposes, etc.)
     
  18. kilyan

    kilyan Master Guru

    Messages:
    565
    Likes Received:
    4
    GPU:
    evga 1080 sc gaming
I preordered the game today; it seems to be doing well. I just fear the day-one patch, but it's far from becoming a mess like Anthem--at least I hope so...
     
  19. waltc3

    waltc3 Master Guru

    Messages:
    876
    Likes Received:
    211
    GPU:
    XFX 590 8GB XFire

Multi-GPU support in SotTR is *excellent*, btw. I have zero problems with it. Very impressive! I don't think you realize what moving it into the D3D API means--multi-GPU support should always have been in the API, because who better to implement it in their game engines than the developers who build the engines? Bolting it on in the GPU drivers by AMD/nVidia after the fact was always a kludge and a hack--which sometimes worked and sometimes didn't. Workarounds were implemented to get the OS to see two GPUs as one, for instance. That's history. Now all of that is in the OS in D3D/DX--much better! Utilizing multi-GPU in their engines is far simpler for developers than RTX support, for instance... I should think that is obvious! And it has a far bigger impact on gameplay than RTX, etc. One reason we aren't seeing much of it right now is that lots of developers are still getting by on their older engines, to which a few D3D12 features have been bolted on more or less for marketing purposes. I believe newer game engines will see multi-GPU support go widespread--but we'll see.

Well, it's obvious that nVidia doesn't want anyone buying 1060s for SLI, or the cheaper RTX GPUs, isn't it? AMD supports multi-GPU with the RX 480/580/590, so I'd say that's a fair indicator that AMD wants to sell you another card--if not the most expensive of the lot, then a less expensive card to pair with the AMD card you already own. Look, if developers don't have the money to support multi-GPU in their engines, then they surely won't have the money to support RTX, eh? As I mentioned, RTX is far more onerous and complex to implement, and the benefits are paltry--questionable at best. Multi-GPU support, on the other hand, gives developers immediate and tangible benefits in their games because it greatly ratchets up the framerates for their customers. If they must choose between the two, multi-GPU support will win every time, imo.

AMD's X-fire over the PCIe 3.x bus works extremely well--and there's nothing stopping nVidia from doing the same. But as usual, nVidia wants to stick it to its customers and manipulate them into paying more for less, basically. Everything is "proprietary" and comes at a cost with nVidia--look at how pathetic nVidia's Freesync support currently is, when AMD simply gives it away! Look at the non-support for SLI--which nVidia also insists on licensing--in the lower end of even its freakishly expensive RTX GPUs! I'll be honest and say that I, personally, couldn't care less what nVidia wants me to do, because I'm not doing it...;) Ever.
     
    fantaskarsef likes this.
  20. waltc3

    waltc3 Master Guru

    Messages:
    876
    Likes Received:
    211
    GPU:
    XFX 590 8GB XFire
Right now I'm in the last 10GB of the preload from Ubisoft--got my third free game...;) Guess they'll open it up tomorrow!
     
