Review: Star Wars Battlefront II PC graphics performance analysis

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 15, 2017.

  1. Mysteryboi

    Mysteryboi Member Guru

    Messages:
    137
    Likes Received:
    21
    GPU:
    EVGA GTX TITAN SC
Came for the comments and wasn't disappointed; even the conclusion summed it up nicely. Good job, Hilbert.

Boycott EA / Activision: rubbish, greedy companies that ruin game franchises and developer studios.
     
    vonSternberg likes this.
  2. kroks

    kroks Active Member

    Messages:
    76
    Likes Received:
    12
    GPU:
    RTX 2070 SUPER
    haha drm
    stop buying ea games!!!
     
  3. Vipu2

    Vipu2 Guest

    Messages:
    553
    Likes Received:
    8
    GPU:
    1070ti
I think the DRM is the smallest problem this game has.
     
    Silva likes this.
  4. Keesberenburg

    Keesberenburg Master Guru

    Messages:
    886
    Likes Received:
    45
    GPU:
    EVGA GTX 980 TI sc
Why can't Nvidia get the same performance with DX12? Instead it's even lower, while in other DX12 games Nvidia gets the same or better performance. It smells fishy.
     

  5. Nima V

    Nima V Active Member

    Messages:
    58
    Likes Received:
    9
    GPU:
    GTX 760 2GB
Another game that performs better with DirectX 11. I wonder why some developers still insist on spending time and money implementing low-level APIs despite all the problems they bring to PC games.
     
  6. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,738
    GPU:
    3080 Aorus Xtreme
    Fury-X still performs pretty well in this game, even at higher resolutions.
     
  7. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
With DX11 Nvidia controls like 90% of the work being done on the GPU. This allows them to tweak the dispatch queues and everything else to maximize the full use of their pipeline. With DX12 the majority of that responsibility shifts to the developer. Nvidia's entire architecture is essentially built around that dispatch/scheduler too: the magic that's allowed them to keep a performance/power lead over AMD with a significantly smaller hardware scheduler happens entirely in the software/driver stack.

Some devs are going to deep dive into Nvidia's architecture and try to optimize performance; other devs are going to phone it in, hit some internal threshold (example: "We want to hit a 1080 Ti @ 4K @ 60 fps average on our internal benchmark") and then stop optimizing. It's not like further optimization is going to significantly increase sales, a majority of their customers don't even have access to DX12, and the time required for optimization typically grows exponentially. You can spend 100-200 man-hours getting 20 fps out of an architecture, but the next 20 fps might require thousands of hours.

AMD, on the other hand, is in every Xbox, so most devs are far more familiar with their architecture. Being in the consoles also requires them to spend more time optimizing in general, and since the code paths for AMD/Nvidia split at some point, AMD obviously gets way more attention, which bleeds over to their GPUs in the PC space.

    The full benefits of DX12 won't come until SM6.0 is out and DX11 is completely eliminated from the development scene. By that time you won't even recognize the benefits because you won't have DX11 variants of the games to compare them to.
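
For anyone wondering what "the majority shifts to the developer" means in concrete terms, here's a bare-bones D3D12 submission sketch (just the raw API flow, not DICE's code or anything from a shipping engine; error checking is omitted): the application itself creates the command queue, records and closes a command list, submits it, and fences the GPU, all the bookkeeping a D3D11 driver used to do behind the immediate context.

Code:
// Minimal D3D12 "record, submit, wait" flow. Compile against d3d12.lib.
// Illustrative only: no swap chain, no pipeline state, no error handling.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

int main() {
    // The app, not the driver, owns the device, queue, allocator and list.
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    D3D12_COMMAND_QUEUE_DESC qdesc = {};
    qdesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&qdesc, IID_PPV_ARGS(&queue));

    ComPtr<ID3D12CommandAllocator> alloc;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT, IID_PPV_ARGS(&alloc));

    ComPtr<ID3D12GraphicsCommandList> list;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT, alloc.Get(), nullptr,
                              IID_PPV_ARGS(&list));

    // Draws, dispatches and resource barriers would be recorded here. Deciding
    // what goes into which list, on which thread, is now the engine's job.
    list->Close();

    ID3D12CommandList* lists[] = { list.Get() };
    queue->ExecuteCommandLists(1, lists);

    // Explicit CPU/GPU synchronization via a fence; in D3D11 the driver hid this.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);
    queue->Signal(fence.Get(), 1);
    fence->SetEventOnCompletion(1, done);
    WaitForSingleObject(done, INFINITE);
    CloseHandle(done);
    return 0;
}

None of that is Nvidia- or AMD-specific, which is exactly the point: the driver no longer gets to reorder or batch this work for you, so how well it maps onto a given architecture depends on the engine.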
     
    Last edited: Nov 21, 2017
    geogan likes this.
  8. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,513
    Likes Received:
    2,355
    GPU:
    Nvidia 4070 FE
Only Intel bosses believe that development should stop and things should remain the same forever. I don't understand people who want us to still be using DX11 in 2030, just like Intel wanted us to still be using quad cores in 2030. What would be the point of DX12 if it were the same as DX11, only with some superficial changes?

The current GPU architectural pathway has already reached the end of the road anyway. Gigantic monolithic designs aren't viable past a certain point. Things need to change, both in hardware and in software. It'll just take time. Have a little faith.
     
    Silva likes this.
  9. AKDragonPC

    AKDragonPC Member

    Messages:
    34
    Likes Received:
    2
    GPU:
    1080 ti FTW3 Hybrid
    4K benchmarks with a 980ti OC?
     
  10. Solfaur

    Solfaur Ancient Guru

    Messages:
    8,007
    Likes Received:
    1,528
    GPU:
    GB 3080Ti Gaming OC
After some more testing I came to the conclusion that Ultra shadows look a lot better than HFTS/PCSS for some reason. They also allow me to bump the resolution scaling to 150% at 1440p with 60 FPS vsynced, everything else maxed. I do, however, get some drops into the 50s in some multiplayer scenarios, but it looks so damn good that I can live with it so far. If I dropped the scaling to 135-140% it would likely hold 60 FPS all the time.

Before this I played maxed out with HFTS shadows and just 100% scaling, and it looked a LOT worse. o_O
     

  11. Clawedge

    Clawedge Guest

    Messages:
    2,599
    Likes Received:
    928
    GPU:
    Radeon 570
There is little difference between 4, 6 and 8 cores. Would this mean the game may only be dual-threaded?
     
  12. Agonist

    Agonist Ancient Guru

    Messages:
    4,284
    Likes Received:
    1,312
    GPU:
    XFX 7900xtx Black
No, it's just that single player doesn't rely on the CPU that much at all. Online, you would see the difference far more. BF1 thrashes my CPU online, but barely uses half of it in single player. And the beta for this used all 8 threads quite evenly, usually sitting around 85% the whole time. The one great thing about every DICE Frostbite game since Bad Company 2 on PC is that they have been massively multi-threaded.
     
  13. Keesberenburg

    Keesberenburg Master Guru

    Messages:
    886
    Likes Received:
    45
    GPU:
    EVGA GTX 980 TI sc
Direct Disaster 12. If Nvidia gets lower performance with DX12 enabled, then DX12 is not ready at this moment and it's unfair.
     
  14. geogan

    geogan Maha Guru

    Messages:
    1,267
    Likes Received:
    468
    GPU:
    4080 Gaming OC
Considering what DX12 was originally meant to do (remove the draw-call bottleneck and allow orders of magnitude more draw calls than DX11), and the demos for it at the time, I am extremely unimpressed by its performance improvement over DX11 in almost everything over the last few years. Why are game engines not taking advantage of the massive increase in draw calls? I don't get it.
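
Part of the answer is that the extra draw-call headroom only shows up if the engine is actually CPU-bound on submission. Below is a rough CPU-only sketch (the per-call cost and call count are invented numbers, not measurements of Frostbite or any real driver) that simulates recording draw calls on a single thread versus splitting the recording across worker threads, which is the style of parallel submission DX12 permits:

Code:
// Toy model of draw-call submission cost. The 5 microseconds per call and the
// 20,000-call count are assumptions for illustration, not measured values.
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

constexpr int kDrawCalls  = 20000; // hypothetical draw calls per frame
constexpr int kCostMicros = 5;     // hypothetical CPU cost per call

void record(int calls) {
    for (int i = 0; i < calls; ++i) {
        // Burn roughly kCostMicros of CPU time per simulated draw call.
        auto end = std::chrono::steady_clock::now() + std::chrono::microseconds(kCostMicros);
        while (std::chrono::steady_clock::now() < end) { /* spin */ }
    }
}

int main() {
    // "DX11-style": one thread feeds every call to the driver.
    auto t0 = std::chrono::steady_clock::now();
    record(kDrawCalls);
    auto single = std::chrono::steady_clock::now() - t0;

    // "DX12-style": recording split across worker threads.
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    t0 = std::chrono::steady_clock::now();
    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w)
        pool.emplace_back(record, kDrawCalls / static_cast<int>(workers));
    for (auto& t : pool) t.join();
    auto parallel = std::chrono::steady_clock::now() - t0;

    printf("1 thread: %lld ms, %u threads: %lld ms\n",
           (long long)std::chrono::duration_cast<std::chrono::milliseconds>(single).count(),
           workers,
           (long long)std::chrono::duration_cast<std::chrono::milliseconds>(parallel).count());
    return 0;
}

If a frame only issues a few thousand cheap draw calls, or the bottleneck sits on the GPU, the multi-threaded path barely moves the frame time, which is roughly what these benchmarks suggest is happening.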
     
  15. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,691
    Likes Received:
    2,672
    GPU:
    Aorus 3090 Xtreme
    Because .. . .. .. ALIENS !!!
     
    Silva likes this.

  16. ChisChas

    ChisChas Master Guru

    Messages:
    316
    Likes Received:
    73
    GPU:
    MSI Suprim X 4090
I see the latest Nvidia drivers provide SLI compatibility. Hilbert, can you test two 1080 Ti's to demonstrate how well this game scales with two cards?
The 4K results just prove there's no point in buying a 4K monitor at the moment: even if there were a 100Hz or better 4K monitor, Nvidia's best card couldn't drive it at that refresh rate. I have a perfectly OK Dell U2711 monitor, but I bought it 7 years ago and feel frustrated that poor card performance is 'blocking' me from upgrading.
I know this would be more work for you, but would 3440 x 1440 (or thereabouts) monitor results be more interesting for us? Isn't a 100Hz ultrawide a more viable gaming monitor than a 60Hz 4K one? Who would buy a 60Hz 4K monitor when the next-gen Nvidia top-of-the-line cards should finally provide enough power for a 4K screen at a 100Hz refresh rate? Well, at least the Ti version hopefully will. Please don't tell me you want a 4K 165Hz monitor, as you'll be dead and buried by the time Nvidia gives us a card that can do that.
     
  17. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,389
    Likes Received:
    18,560
    GPU:
    AMD | NVIDIA
I had that planned, but at the allowed pace of 4 hardware changes per day I am slowly losing my patience. I still need to test four cards and then three setups with different processors, and I'm already locked out of the game until tomorrow.
     
  18. Nima V

    Nima V Active Member

    Messages:
    58
    Likes Received:
    9
    GPU:
    GTX 760 2GB
Don't get me wrong. I don't want PC games to remain on DX11 forever, I just can't see the point of DX12. Improvement in performance? Better visuals? Less work or cost for developers to implement? What benefit has DX12 actually had for PC games?

In my opinion this useless API has had enough time to show what it can do, and now it's time for a new API. Maybe Microsoft should think about an improved version of DX11.
     
  19. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
He pretty much means that DX12 and the like are long-term solutions.

Repeating myself from another thread, but the API has already shown what it can do in synthetic benchmarks.
Just like everything in PC gaming (uncapped frame rates, multicore support, MGPU, ultrawide, etc.), it's entirely up to the developer to take advantage of it.
     
    Agonist and Kaarme like this.
  20. Cave Waverider

    Cave Waverider Ancient Guru

    Messages:
    1,879
    Likes Received:
    663
    GPU:
    ASUS RTX 4090 TUF
    @Hilbert Hagedoorn Great tests, as always.
Have you tested with the Titan Xp or the Titan X (Pascal)? The charts say Titan X (Pascal), while you mentioned the Titan Xp in the above quote. Which one did you actually test with, and do the driver improvements apply to both or just one?
     
    Last edited: Nov 18, 2017
