Asynchronous Compute

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Carfax, Feb 25, 2016.

  1. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,771
    Likes Received:
    382
    GPU:
    MSI GTX1070 GamingX
    My GTX980m is actually slightly slower than a real GTX970, but better than a GTX960; you could say it's really a "GTX965". I don't actually know what my fps was during the alpha because I never bothered to look. 30-40fps is just my guess from looking at the alpha chart, although this will probably be better with the proper beta.

    I'll update when beta is live.
     
  2. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,414
    Likes Received:
    2,092
    GPU:
    Zotac GTX980Ti OC
    People judge graphics card performance based on leaked alpha/beta demos/benchmarks? Now I've seen it all. :grin:
     
  3. Undying

    Undying Ancient Guru

    Messages:
    18,372
    Likes Received:
    6,973
    GPU:
    RTX 2080S AMP
    You have to start somewhere. That's why alpha/beta stages exist: to actually see how a game runs early.

    Of course things change in the final product, but not always. ;)
     
  4. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,256
    Likes Received:
    565
    GPU:
    RTX 3080
    It's still the same benchmark of a hacked alpha version, which has an automatic settings mode for every card, so you can't even be sure that you're running the same load on AMD and NV.

    No, not "every new game released" performs better on AMD. The number of such games has increased because of console optimizations favoring GCN in general, but still: for 2015 there were actually more games favoring NV hardware, and for 2016 there are already 2 games favoring NV versus 4 favoring AMD, 2 of which aren't that big of a favor really (XCOM 2 and The Division run only slightly better on AMD hardware, something which may easily change with patches/drivers down the line). So it's pretty much a tie at the moment.

    AMD didn't always have better hardware on paper. Kepler vs the HD7000 line was hit and miss for AMD, with some Kepler cards beating AMD's hardware rather heavily on paper as well (670 vs 7870, for example; the 770 was better than the 7950 on paper; both the 780 and 780Ti were obviously better than the 7970 on paper, etc.). Most of you who speculate on this tend to forget that NV beat AMD twice in time to market: for both the 700 series and the 900 series there was half a year when you had to compare them to the 7000 series and the 200 series respectively, with NV winning heavily against the AMD lineup top to bottom in both specs and performance.

    It's funny how you people start with AMD's advantage in "every new game" in DX11 and then proceed to explain it with DX12 async compute.

    The reason to delay the driver is pretty obvious from the results AMD gets with async compute, with some of their cards getting significantly slower with it enabled. Async compute isn't an on/off switch; it's a freaking complex thing, and you have to make sure that by enabling it you actually get a speed-up and not a speed loss.
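The speed-up-or-slowdown trade-off described above can be sketched with a toy timing model (illustrative only: the function names and all the numbers below are made up for this post, not taken from any real driver or API):

```python
def sync_frame_time(gfx_ms, compute_ms):
    """Serial execution: the compute pass runs after the graphics pass."""
    return gfx_ms + compute_ms

def async_frame_time(gfx_ms, compute_ms, overlap, overhead_ms):
    """Async execution: a fraction `overlap` of the compute work hides
    behind the graphics pass, at a fixed scheduling cost `overhead_ms`."""
    exposed = compute_ms * (1.0 - overlap)  # compute work still serialized
    return gfx_ms + exposed + overhead_ms

serial = sync_frame_time(12.0, 4.0)  # 16.0 ms per frame

# Good case: most compute hides behind graphics -> net speed-up
good = async_frame_time(12.0, 4.0, overlap=0.9, overhead_ms=0.5)  # ~12.9 ms

# Bad case: poor overlap plus scheduling overhead -> net slowdown
bad = async_frame_time(12.0, 4.0, overlap=0.1, overhead_ms=1.5)  # ~17.1 ms

print(serial, good, bad)
```

Which is exactly why a driver would have to verify, per workload, that the overlap actually pays for the scheduling cost before enabling the path.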

    Pretty much always.
     
    Last edited: Mar 19, 2016

  5. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,771
    Likes Received:
    382
    GPU:
    MSI GTX1070 GamingX
    What is the point of alpha/beta benchmarks then? Please explain. Doesn't this also make the AotS benchmark pointless? Right?

    Bear in mind, the Doom alpha was a real thing that you could be part of.

    No one is saying that alpha and beta numbers are final performance numbers.
     
  6. CrazyGenio

    CrazyGenio Master Guru

    Messages:
    453
    Likes Received:
    38
    GPU:
    rtx 3090ti
  7. narukun

    narukun Master Guru

    Messages:
    217
    Likes Received:
    24
    GPU:
    EVGA GTX 970 1561/7700
    Because the Hitman beta is the same, if not worse, in terms of performance after the release. Yeah, one game, then another (Doom), etc. I hope I'm wrong, because I bought and trust Nvidia.
     
    Last edited: Mar 19, 2016
  8. moab600

    moab600 Ancient Guru

    Messages:
    6,414
    Likes Received:
    298
    GPU:
    Galax 3080 SC
    Nvidia will probably beat AMD in Doom, or be beaten by a small margin. If Nvidia loses just like in the alpha in that big a game, it's doom for them.
     
  9. Undying

    Undying Ancient Guru

    Messages:
    18,372
    Likes Received:
    6,973
    GPU:
    RTX 2080S AMP
    Nvidia needs to improve by like 50% to actually be competitive. You think it's gonna be that magically easy?

    [image]
     
  10. Redemption80

    Redemption80 Ancient Guru

    Messages:
    18,495
    Likes Received:
    266
    GPU:
    GALAX 970/ASUS 970
    Any reason why we are talking about DX12 async compute for Doom when it's an OpenGL game, so it obviously won't be using it?

    Unless I'm missing something and the game is now DX12.

    Looking at Doom, it's clear something has to be fixed with the game. Didn't a patch for GOW just claw back a lot of performance for AMD cards?
    A lot more than 50%, from what I hear.
     

  11. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,414
    Likes Received:
    2,092
    GPU:
    Zotac GTX980Ti OC
    That game is still in beta, otherwise they would release the full game.

    And yeah, alpha/beta stuff doesn't mean anything; Star Swarm is a perfect example of how Oxide's AotS works.

    Doom will run fine too. NV will fix it, or id will, believe me.

    That's why I said that before: it's pointless to draw any conclusions from some buggy alpha/beta stuff... Especially AMD users, who seem to be happy for nothing. Lol
     
  12. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,485
    Likes Received:
    120
    GPU:
    GTX Titan Sli
    OpenGL provides the option of using extensions; perhaps id uses one that's similar to DX12 async compute.
     
  13. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,256
    Likes Received:
    565
    GPU:
    RTX 3080
    [benchmark charts]
    960: 22->23
    380X: 30->34
    970: 31->32
    290: 35->42
    980: 37->38
    290X: 39->46
    980Ti: 49->47
    FuryX: 50->60

    Another source (beta -> release):
    [benchmark charts]

    960: 20.6->24.5
    390: 36.1->46.8
    980: 35.7->42.8
    980Ti (different cards but pretty close): 49.0->61.0
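For what it's worth, turning the quoted averages into percentage changes (plain arithmetic on the numbers above, nothing else):

```python
def pct_change(before, after):
    """Percent change from the first run to the second, one decimal."""
    return round(100.0 * (after - before) / before, 1)

# First set: the 980Ti is the only regression; the FuryX gains the most
print(pct_change(49, 47))      # 980Ti: -4.1
print(pct_change(50, 60))      # FuryX: 20.0

# Second set (beta -> release): every listed card gains
print(pct_change(20.6, 24.5))  # 960:   18.9
print(pct_change(36.1, 46.8))  # 390:   29.6
print(pct_change(35.7, 42.8))  # 980:   19.9
print(pct_change(49.0, 61.0))  # 980Ti: 24.5
```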

    Yep, clearly "worse".
    I wonder if you even bother to check the data, or if you just go straight with AMD's press release lines.
    If you look at the results, it's pretty obvious that they specifically omitted optimizing for Maxwell hardware. And what a coincidence that this is an AMD Gaming Evolved title!

    id needs to improve, not Nvidia. The game clearly was not optimized for NV's hardware in the alpha. +50% and even more is a typical thing in performance optimization.
     
    Last edited: Mar 19, 2016
  14. CrazyBaldhead

    CrazyBaldhead Master Guru

    Messages:
    300
    Likes Received:
    14
    GPU:
    GTX 1070
    Nvidia isn't competitive in 2016? What a jokester. No surprise coming from someone who doesn't even game and just goes around hating on Nvidia nonstop for some absurd reason.
     
  15. narukun

    narukun Master Guru

    Messages:
    217
    Likes Received:
    24
    GPU:
    EVGA GTX 970 1561/7700
    I did, and man... the only card that got a nice boost is the 980Ti, but it's the only card worth having in Nvidia's lineup right now.
     

  16. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,256
    Likes Received:
    565
    GPU:
    RTX 3080
    All NV's cards got a boost judging from the second benchmark. AMD's cards got a bigger one, but that's to be expected from an AMD-sponsored title; I don't see what's new about this.

    Blah, confused the 980Ti with the 780Ti. Time to go to sleep.
     
    Last edited: Mar 19, 2016
  17. narukun

    narukun Master Guru

    Messages:
    217
    Likes Received:
    24
    GPU:
    EVGA GTX 970 1561/7700
    I understand your point, but you're using the 980Ti. If you try to understand how it feels to pay $340 and then see that I could have bought the R9 390 instead and gotten a lot better performance for the same price, you won't feel good.
     
    Last edited: Mar 20, 2016
  18. EdKiefer

    EdKiefer Ancient Guru

    Messages:
    2,755
    Likes Received:
    301
    GPU:
    MSI 970 Gaming 4G
    Even the 970 got a decent boost percentage-wise in that last set of graphs (31.5 -> 36.5), just over 15%.
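Checking that arithmetic (the 31.5 -> 36.5 figures are from the graph quoted above):

```python
# (36.5 - 31.5) / 31.5 ~= 15.9%, so "just over 15%" checks out
gain = round(100.0 * (36.5 - 31.5) / 31.5, 1)
print(gain)  # 15.9
```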
     
  19. Ieldra

    Ieldra Banned

    Messages:
    3,490
    Likes Received:
    0
    GPU:
    GTX 980Ti G1 1500/8000
    Pay $340 for a 390 and get a lot better performance than a 980Ti?

    Sure. Go to the AMD website and download some more RAM while you're at it.

    Does it matter that you easily get +25% performance over stock on pretty much any Maxwell card? Nope.

    Does it matter that Hitman is just as heavily AMD-leaning as Crysis was Nvidia-leaning? Nope.

    Does it matter that there's NO DATA to back up your point? Does it matter that an overclocked 980Ti will outperform a Fury X in Ashes, despite async being broken and lowering performance? Nope.

    What matters is that you portray AMD as the benevolent saints of PC hardware.

    Count the number of titles in which a 390X beats this 970 by more than 5%:

    http://www.guru3d.com/articles-pages/zotac-geforce-gtx-970-amp-extreme-core-review,1.html
     
    Last edited: Mar 20, 2016
  20. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,256
    Likes Received:
    565
    GPU:
    RTX 3080
    What? Before the 980Ti I had a 970, and a 770 4GB prior to that. That 770 is still working just fine in a different PC. If I were looking for a card for $300 right now, I'd get myself a 390, as it's obviously a better card than the 970 at the moment. But again, all of you saying this now fail to account for the time when the 700 and 900 series came to market.

    The 700 series launched half a year before AMD was able to launch the 290/X cards, and almost a year before the update of the 7900 cards to the 280/X. The 900 series launched 9 months before that 390 card of yours came to market. That's 9 months of the 900 series beating AMD's 200 series in pretty much every aspect. Then AMD's new series came and it more or less caught up. THIS IS A NORMAL THING TO HAPPEN ON THE MARKET. Each new generation of video cards one-ups the previous one, because otherwise there is no point in its existence. I don't understand why it's suddenly such a big issue that the Radeon 300s are better than the GeForce 900s, which came out nearly a year earlier. This was expected and this is how things should be; otherwise the 300 series would be a total failure.

    If you think you're better off with AMD, go ahead; it's your choice. I'll be switching to Pascal in the meantime, because I'm not in the mood to wait three years until my Radeon beats the three-year-old GeForce in some new game while producing 25 fps there. I'd much rather play on a cooler, quieter, and faster card right now.
     
