AMD Radeon Shows Strong Performance with Battlefield V Closed Alpha

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 5, 2018.

  1. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,416
    Likes Received:
    4,678
    GPU:
    2080Ti @h2o
  2. Astyanax

    Astyanax Ancient Guru

    Messages:
    11,305
    Likes Received:
    4,242
    GPU:
    GTX 1080ti
    Truth, most of the guys that have the low-level know-how are working on scientific stuff or replicating video game consoles in emulation.
     
  3. Astyanax

    Astyanax Ancient Guru

    Messages:
    11,305
    Likes Received:
    4,242
    GPU:
    GTX 1080ti
    I just want to put this here to demonstrate what it really looks like to play this game on a 580 vs. a 1060 right now.

    [Video: Battlefield V alpha gameplay, RX 580 vs. GTX 1060]

    When you're skipping frames... all comparisons are void.
     
    warlord likes this.
  4. warlord

    warlord Ancient Guru

    Messages:
    2,761
    Likes Received:
    927
    GPU:
    Null
    It is unplayable. WTF. I would throw a punch at my PC if I were forced to play this ****** mess. :eek:


    Advanced Microstutter Devices?
     
    Last edited: Jul 6, 2018

  5. Denial

    Denial Ancient Guru

    Messages:
    13,507
    Likes Received:
    3,036
    GPU:
    EVGA RTX 3080
    It's an alpha... the entire point is to catch stuff like that and fix it before release.

    Edit: Should also point out the YouTube author replied to a comment saying the stutter is only present in the video capture, but that AMD on DX12 has crashing and other issues.

    So idk, but again: alpha.
     
    Last edited: Jul 6, 2018
    fantaskarsef and warlord like this.
  6. warlord

    warlord Ancient Guru

    Messages:
    2,761
    Likes Received:
    927
    GPU:
    Null
    Sure, I get it; it's just that, as far as I'm concerned, a lower framerate is a thousand times more tolerable than frame skipping at this level. Slow motion you can play with, but a stuttery mess, nope.
     
  7. Astyanax

    Astyanax Ancient Guru

    Messages:
    11,305
    Likes Received:
    4,242
    GPU:
    GTX 1080ti
    Yep, just saying it's a mistake to get involved with this tit-for-tat crap that goes on between AMD and Nvidia fanbois.
     
    Stormyandcold likes this.
  8. RooiKreef

    RooiKreef Master Guru

    Messages:
    410
    Likes Received:
    51
    GPU:
    MSI RTX3080 Ventus
    I suppose we will have to wait and see what the final outcome will be...
    It does, however, boggle my mind how DX12 runs slower than DX11, when the main purpose of DX12 was to be more efficient.
     
  9. Pimpiklem

    Pimpiklem Member Guru

    Messages:
    162
    Likes Received:
    49
    GPU:
    amd
    Seems as if primitive discard, draw-stream binning, and back-face culling aren't turned on yet, showing the true speed of the cards without those optimizations.
    That is my guess.
    Normality will settle in after driver updates are applied, I have no doubt.
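
    Worth noting where those three live: primitive discard and draw-stream binning sit on the driver/hardware side, but back-face culling is ordinarily an application-side rasterizer state the game sets itself. A minimal, illustrative D3D12-style sketch of that one piece (the values are just the common defaults, nothing taken from this alpha):

    ```cpp
    #include <d3d12.h>

    // Illustrative only: back-face culling as the app normally requests it.
    D3D12_RASTERIZER_DESC MakeRasterizerDesc()
    {
        D3D12_RASTERIZER_DESC rs = {};
        rs.FillMode = D3D12_FILL_MODE_SOLID;
        rs.CullMode = D3D12_CULL_MODE_BACK;  // discard triangles facing away
        rs.FrontCounterClockwise = FALSE;    // winding order defines "front"
        rs.DepthClipEnable = TRUE;
        // This desc is plugged into
        // D3D12_GRAPHICS_PIPELINE_STATE_DESC::RasterizerState at PSO creation.
        return rs;
    }
    ```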
     
    Last edited: Jul 6, 2018
  10. leszy

    leszy Master Guru

    Messages:
    326
    Likes Received:
    17
    GPU:
    Sapphire V64 LC
    Poor console game developers - they have been working with a low-level API for only a few decades ;)

    What about ARK, FFXV? What about the regular delay in publishing the DX12 version of a game, which appears a month or two after the first reviews, when reviews of the product no longer interest anyone? What about the campaign aimed at persuading people that an API that talks directly to the hardware is unnecessary, because a decade-old API with huge restrictions puts in a better light a company that is still unable to create hardware that can effectively use the capabilities of the new API?

    Are you suggesting that NV has the ability to completely block the production of DX12 games? Because only then does your argument about DX12 debugging tools make sense.
     

  11. Denial

    Denial Ancient Guru

    Messages:
    13,507
    Likes Received:
    3,036
    GPU:
    EVGA RTX 3080
    The console devs are given targeted SDKs & documentation for a specific hardware feature level. That's radically different from PC development, where the SDKs are fragmented, documentation is near non-existent (looking at you, Khronos Group), and you're targeting 3+ generations of GPU hardware from three different vendors with radically different system configurations. That is why even the best developers with top-tier engine teams (DICE) have so many random issues with DX12 across various hardware configurations.

    How is Nvidia incapable of utilizing the API? AMD cards see a significant boost under DX12 because their pipeline is underutilized in DX11. There are gaps in execution on AMD hardware that cannot be filled by parallelized tasks in DX11 - DX12 remedies this by allowing the developer to fill those gaps, thus higher performance. Nvidia has no gaps because their "software scheduler", which everyone crapped all over for years, actually turns out to be really good at scheduling and keeping the pipeline filled. That's not Nvidia lacking capability under DX12, that's AMD gaining under DX12 because they lacked decent scheduling under 11.
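
    To make "fill those gaps" concrete: the usual mechanism is async compute - a second, compute-only queue whose work the GPU can overlap with graphics. A minimal sketch of the setup in D3D12 (illustrative only; `device` is assumed to be a valid ID3D12Device*, and what an engine actually submits to each queue is its own business):

    ```cpp
    #include <d3d12.h>
    #include <wrl/client.h>

    using Microsoft::WRL::ComPtr;

    void CreateQueues(ID3D12Device* device,
                      ComPtr<ID3D12CommandQueue>& gfxQueue,
                      ComPtr<ID3D12CommandQueue>& computeQueue)
    {
        // The "direct" queue accepts graphics, compute, and copy work.
        D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
        gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
        device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

        // A separate compute-only queue: work submitted here can execute
        // while the direct queue is stalled on raster-bound work - exactly
        // the idle-gap filling described above.
        D3D12_COMMAND_QUEUE_DESC compDesc = {};
        compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
        device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));

        // Where the two streams must meet (e.g. compute results feeding a
        // later draw), an ID3D12Fence signaled on one queue and waited on
        // by the other handles the synchronization.
    }
    ```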


    I didn't suggest that - you implied it with this line: "so often we hear that a company "for unknown reasons" gave up publishing the ready DX12 version of the game . I'm afraid that this will soon become a standard source of easy money for game developers - all you need to do is publish information about the DX12 game production, and you can expect a quick cash injection from NV;)" which says to me "Nvidia pays developers to not use DX12", which is ridiculous because they publish multiple DX12 development tools, including their entire GameWorks library, and have shipped multiple DX12 titles.

    The main purpose of DX12 was to give developers lower-level access to the hardware, which doesn't automatically mean more efficiency. It would be like if you could suddenly fully configure the ECU on your car. The average joe would look at it, go "oh, what does this knob do", and suddenly the engine stalls out and explodes. A NASCAR engineer would spend a couple of hours tweaking it and get slightly better mpg or horsepower or something.

    It's the same thing with a game engine: the closer you get to the hardware, the harder it becomes to tweak. Every piece of hardware is different - not just across architectures; even a 1080 Ti vs. a 1080 has varying degrees of execution bottlenecks/stalls/etc. that can be tweaked and optimized for. DX12 allows you to go that low and do that - but it requires lots of time, hardware-specific optimization, potential for regressions in performance, potential for increased visual artifacts, etc. Which is why Nvidia started putting out all these DX12 debugger/profiling tools, so that devs can see what's going on at the GPU level and figure out why certain shaders behave differently on specific GPUs and whatnot.
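
    As a taste of what "seeing what's going on at the GPU level" means, the lowest-common-denominator technique is a pair of timestamp queries around a suspect stretch of work. This is plain D3D12, not any vendor's tool; the sketch assumes a valid `device`, `cmdList`, `queue`, and a CPU-readable readback buffer already exist:

    ```cpp
    #include <d3d12.h>
    #include <wrl/client.h>

    using Microsoft::WRL::ComPtr;

    // Bracket a suspect stretch of GPU work with two timestamps.
    void TimeGpuWork(ID3D12Device* device,
                     ID3D12GraphicsCommandList* cmdList,
                     ID3D12CommandQueue* queue,
                     ID3D12Resource* readbackBuf) // readback heap, >= 16 bytes
    {
        ComPtr<ID3D12QueryHeap> heap;
        D3D12_QUERY_HEAP_DESC desc = {};
        desc.Type = D3D12_QUERY_HEAP_TYPE_TIMESTAMP;
        desc.Count = 2;
        device->CreateQueryHeap(&desc, IID_PPV_ARGS(&heap));
        // NB: in real code the heap must outlive GPU execution of the list;
        // it is local here only for brevity.

        cmdList->EndQuery(heap.Get(), D3D12_QUERY_TYPE_TIMESTAMP, 0);
        // ... the draws/dispatches suspected of stalling go here ...
        cmdList->EndQuery(heap.Get(), D3D12_QUERY_TYPE_TIMESTAMP, 1);

        // Copy both 8-byte tick values into the readback buffer; once the
        // frame completes, (ticks[1] - ticks[0]) / freq is elapsed seconds.
        cmdList->ResolveQueryData(heap.Get(), D3D12_QUERY_TYPE_TIMESTAMP,
                                  0, 2, readbackBuf, 0);

        UINT64 freq = 0;
        queue->GetTimestampFrequency(&freq); // ticks per second on this queue
    }
    ```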

    But all that stuff takes time to learn - from engineering the debugging/profiling tools, to learning the ins and outs of specific architectures, to learning how those behaviors change when the game running on top of the engine is different, etc. It's not "here's a DX12 checkbox, hit it and magically gain 20% performance", it's "here are a billion knobs; change one wrong and the game is broken on 2 different architectures, change 500 million correctly and you get like a 5% performance increase on a single card because now you're not stalling for half a microsecond in that one shader".

    It's easy to see why game devs would shy away from all that, just use DX11, and let the Nvidia/AMD driver teams figure out how to properly execute at the low level for their respective hardware.
     
    Last edited: Jul 6, 2018
  12. cowie

    cowie Ancient Guru

    Messages:
    13,273
    Likes Received:
    345
    GPU:
    GTX
    You should see Vega, it's not much better,

    but I guess it just runs better than the 1060 somewhere.
     
    airbud7 and warlord like this.
  13. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,471
    Likes Received:
    118
    GPU:
    GTX Titan Sli
    He says in the beginning that the game is fully playable and the shown framerate is valid; it's just the video capture that's stuttering for some reason.
     
  14. larsiano

    larsiano Member

    Messages:
    27
    Likes Received:
    4
    GPU:
    Asus RTX 2080 Ti OC
    Not true, BF1 clearly shows an "optimized for AMD" logo at each startup. BF2 had this too, and the other titles are not worth mentioning.
     
  15. RealNC

    RealNC Ancient Guru

    Messages:
    3,575
    Likes Received:
    1,740
    GPU:
    EVGA GTX 980 Ti FTW
    The title should be "NVidia shows poor performance with Battlefield V Closed Alpha".

    AMD numbers seem normal. NVidia numbers look like crap.
     

  16. leszy

    leszy Master Guru

    Messages:
    326
    Likes Received:
    17
    GPU:
    Sapphire V64 LC
    I'm sorry, but what you write does not make sense. DX12 is the main API on the Xbox One X. This means that probably all major gaming studios are currently creating games for DX12.
    "The Scorpio Engine SoC features the firmware and 'special hardware' to 'integrate support for DX12' and maximize the performance of Microsoft's API. The Xbox One X's performance optimizations extend to the CPU, as well."
    https://www.tomshardware.com/news/xbox-scorpio-engine-soc-details,35282.html
    What's more, I'm sure you know it. Consequently, I do not understand what you are trying to achieve in this discussion.



    Games lose performance when running the DX12 version on NV cards, and yet you write that NV cards can effectively use the capabilities of DX12 - this is contrary to basic logic. I have no idea why you think the performance of AMD cards in DX11 has anything to do with it. The performance of AMD cards in DX11 is lower because the two companies' architectures are very different, and since NV dominates the market, it is obvious that game code is optimized for NV drivers during porting; but this has nothing to do with the loss of NV card performance in DX12 games. DX12 code is neutral with respect to the card manufacturer, and the impact of the driver is very small. Of course, you can create a driver that takes over part of the tasks of the original game code, and Nvidia is doing this quite effectively for now, but it is only an ersatz, and the price is the decrease in performance.
    I hope that in the next generation of NV cards (hopefully the nearest one) Nvidia's architecture will be improved enough that discussions of this type become pointless. Certainly we will all gain from it, regardless of which company we prefer. It must be remembered that the main obstacle to the scale of DX12 adoption is really not the card makers, but the still-large percentage of computers that could not run such games. Nobody will create games that issue 100,000 draw calls if half of the potential players could not use them. In today's DX12 games, the hard limits set by DX11 are still adhered to.
     
  17. Denial

    Denial Ancient Guru

    Messages:
    13,507
    Likes Received:
    3,036
    GPU:
    EVGA RTX 3080
    How does this change what I said? The Xbox One X is one piece of hardware with one configuration, and the XDK can wrap D3D11 titles. How do you think all the old Xbox One games from before the DX12 update currently work? The difficulty with PC is that there is no single XDK giving you a framework to build your game engine in. There is no "one hardware configuration" - there are millions of them.

    No, it's not contrary to basic logic - I explained this already. When a game runs on DX12, the responsibility for what occurs between the game and the hardware is on the developer, not Nvidia. On DX11 that responsibility is handled by the vendor, aka Nvidia/AMD. It goes back to my ECU/car analogy - Nvidia is the NASCAR engineer. Their GPU driver is simply better at managing its hardware than the game devs are at utilizing DX12. For AMD it's the opposite, because AMD's DX11 driver is inefficient at scheduling and keeping the pipeline filled due to its lack of multithreaded command lists - not just because "Nvidia dominates the market", although that does play a role. Bottom line: an RX 580 is nearly a teraflop faster and has more memory than a GTX 1060. It should be winning in every DX11 title but doesn't, due to inefficiency in the driver. DX12 allows for multithreading on AMD, and better still, it allows the gaps created by inefficient scheduling to be filled thanks to async compute. Nvidia, on the other hand, has no gaps on DX11 - their cards run much closer to 100% of their hardware "potential", and as such the burden on a developer in DX12 to essentially achieve what Nvidia has in their driver is significantly higher, and thus performance is often lower. That, and the majority of DX12 games are just wrapped, which provides basically no improvement and creates CPU overhead - which affects Nvidia worse because their GPU driver is more CPU-reliant.
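
    For the "multithreaded command lists" part, here is roughly what DX12 permits that DX11's single immediate context made awkward: several threads each record their own command list, and the queue swallows them in one submission. A bare-bones illustrative sketch (assumes a valid `device` and `queue`; the actual draw recording is elided):

    ```cpp
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <thread>
    #include <vector>

    using Microsoft::WRL::ComPtr;

    void RecordInParallel(ID3D12Device* device, ID3D12CommandQueue* queue)
    {
        const int kThreads = 4;
        std::vector<ComPtr<ID3D12CommandAllocator>> allocs(kThreads);
        std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kThreads);
        std::vector<std::thread> workers;

        for (int i = 0; i < kThreads; ++i) {
            device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                           IID_PPV_ARGS(&allocs[i]));
            device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                      allocs[i].Get(), nullptr,
                                      IID_PPV_ARGS(&lists[i]));
            // One worker per list: each thread records its own slice of the
            // frame's draws. DX11 would serialize all of this through the
            // single immediate context.
            workers.emplace_back([&lists, i] {
                // ... record draws into lists[i] here ...
                lists[i]->Close();
            });
        }
        for (auto& w : workers) w.join();

        // A single submission; the GPU consumes the lists back-to-back.
        std::vector<ID3D12CommandList*> raw;
        for (const auto& l : lists) raw.push_back(l.Get());
        queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
    }
    ```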

    I think as developer tools improve, engines improve, debugging/GPU profiling tools improve, etc., more and more games will start being optimized properly for DX12 (and not just wrapped)... but as it currently stands, it's very hard to do on PC, very few studios will give developers time to do that kind of work, and the ones that do (even the best ones, like DICE) constantly screw up the implementation and have all kinds of compatibility/performance issues. Johan Andersson from DICE worked with AMD on developing Mantle and low-level APIs, still works on Frostbite 3, and is probably one of the most qualified people to develop a DX12 game - and they have trouble doing it. Just consider that for a second... it's clearly more difficult to do than you're claiming.
     
    airbud7, fantaskarsef and Maddness like this.
  18. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,679
    Likes Received:
    352
    GPU:
    MSI GTX1070 GamingX
    That's the best summary of your posts on this subject.

    You're reaching, ignoring history, and making assumptions. There have been enough examples where PC implementations of DX12 were buggy on AMD that their overall impact on performance was negated. We also have examples where DX12 performed worse on AMD than DX11, despite DX12 seemingly having higher frame rates. Below, I've even thrown in an example where AMD's DX11 and DX12 performance was the same.

    Check these:
    -----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
    Look at the minimum fps here.
    https://www.extremetech.com/gaming/...son-shows-uneven-results-limited-improvements
    -----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
    Deus Ex: Mankind Divided (AMD sponsored). I quote "Switching over to DXMD's DirectX 12 renderer doesn't improve performance on any of our cards, and it actually makes life much worse for the Radeons. The R9 Fury X turns in an average FPS result that might make you think its performance is on par with the GTX 1070 once again, but don't be fooled—that card's 99th-percentile frame time number is no better than even the GTX 1060's. Playing DXMD on the Fury X and RX 480 was a hitchy, stuttery experience, and our frame-time plots confirm that impression.

    In the green corner, the GTX 1070 leads the 99th-percentile frame-time pack by a wide margin, and that translates into noticeably smoother gameplay than any other card here can provide while running under DirectX 12."
    https://techreport.com/review/30639...x-12-performance-in-deus-ex-mankind-divided/3
    -----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
    Total War: Warhammer DX12 boost for AMD still can't match Nvidia's DX11 performance
    https://www.pcgamer.com/total-war-w...md-still-cant-match-nvidias-dx11-performance/
    -----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
    Vermintide 2. I quote a user running a Ryzen and AMD GPU "Problem with DX12 atm is it introduced massive amounts of stuttering in Vermintide 2 and makes the game crash sometimes for me. In DX11 this gets fixed, however DX12 did provide somewhat better fps in more complex areas, in DX12 i never dropped below 60 ever almost while in DX11 i had a few parts in a few missions where I'd drop to low 40s for very brief periods."
    https://www.reddit.com/r/Amd/comments/8561a2/warhammer_vermintide_2_dx11_vs_dx12_1440p_extreme/
    -----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
    Quantum Break. A game that was initially DX12-only but later released on Steam along with a DX11 path. Here's a great example where any assumption that a game would be best on DX12 goes out the window. Nvidia runs QB better than AMD cards can in DX12. On top of that, AMD cards run the same under both DX11 and DX12, breaking the AMD API assumptions.

    -----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
    Maybe those titles above aren't big enough "AAA" games for you.

    In that case, how about Star Wars: Battlefront 2?

    I quote "Thankfully you can play in DX11." (lol)

     
  19. leszy

    leszy Master Guru

    Messages:
    326
    Likes Received:
    17
    GPU:
    Sapphire V64 LC
    Take a look at the Universal Windows Platform documentation.
    https://docs.microsoft.com/en-us/windows/uwp/index
    Do not forget that Win32 and Win64 are also MS products, just like UWP.
     
  20. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,366
    GPU:
    6900XT+AW@240Hz
    Please revisit this post and fix it. I can tell you something about DX12. When BF had DX11/DX12/Mantle, the latter two had an issue or two, sometimes a crash, and often a bit lower fps before the game matured. But they delivered a much smoother experience.
    Vermintide 2 plays very well. If a user with a Vega 64 and an i7-7700K thinks there is an issue with DX12, he might take a step back from "Ultra" details and turn off Nvidia's effects, because on a Fury X + Ryzen 7 2700X I had a flawless experience.

    Why are you even quoting dummies on Reddit? If they used RTSS for fps limiting, they would not have stutter. It is the first advice you get here, but the last thing you see in all those comparison tests when they intend to show a difference. The truth? The end result is the same when you use an fps limiter.
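
    For what an fps limiter actually does, the mechanism is just fixed-interval frame pacing; a toy sketch of the idea (this is the general technique, not RTSS's actual code, and the 60 fps budget is an arbitrary example):

    ```cpp
    #include <chrono>
    #include <thread>

    int main()
    {
        using clock = std::chrono::steady_clock;
        const auto frameBudget = std::chrono::microseconds(16667); // ~60 fps cap
        auto next = clock::now();

        for (int frame = 0; frame < 600; ++frame) { // ~10 seconds of frames
            // RenderFrame(); // hypothetical stand-in for the game's frame work

            // Sleep to the next slot on a fixed grid: frames that finish early
            // wait, so frame-to-frame time stays even instead of oscillating -
            // which is why a capped game feels smoother than an uncapped one.
            next += frameBudget;
            std::this_thread::sleep_until(next);
        }
    }
    ```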

    When you are linking an article, maybe it is better to check the sources and what they say. Minimum fps is apparently better on the DX12 side there.
    They wanted to push some stupid agenda against DX12 and make it look like it is not needed unless you have a Bulldozer or some other low-IPC, high-core-count chip. But they are dumb. You'll always have games which play well on modern HW. Sitting on an old, hard-to-thread API will result in limited visuals and shortcuts.
    If you want the evolution of visuals, forget that DX11 ever existed. Forget that someone uses a 99th-percentile test. If you sat him in front of a screen where one system measured worse and the other much better, he would not be able to tell the difference. Or he would claim that DX12 with a high-core-count CPU is better than that 99th-percentile champion with DX11 and a 4-core CPU.

    Then, DX12 is not backwards compatible with DX11. If you make an effect specifically for DX12, you'll need another for DX11. At that point, you no longer have comparable APIs. But hey, I am sure there is someone who went and compared DX7 vs DX8 because "no shaders" and "shaders" looked the same to him.
     
