Asynchronous Compute

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Carfax, Feb 25, 2016.

  1. But it does. Nvidia does not have to push the envelope because they simply have not been forced to, due to AMD's weak offerings.

    Not that there is much to judge from at the moment, but Maxwell was not released with DX12 in mind.

    And if AMD had been able to offer a proper response to Nvidia's hardware at the time of release, and not a couple of years down the line, things would be different.
    Consumers have traditionally been judging GPUs from both companies on their merits at the time of their release and that is what has driven Nvidia's plans for the future.
     
  2. CrazyGenio

    CrazyGenio Master Guru

    Messages:
    455
    Likes Received:
    39
    GPU:
    rtx 3090
    Nvidia never offers better products; they just offer them first and faster, no matter if they are incomplete (GTX 970), just to fill the market before the competition. A lot of people are mindless monkeys who just want to waste their money on the newest top-end hardware every year, fooled by useless features like MFAA and temporal DSR exclusivity, or whatever the ****, just to catch hyped fools, the same way Apple and Samsung do every year when a new iPhone or gadget comes out.

    AMD is stupid too, because they let Nvidia fill the market. Knowing a lot of people were angry about the 3.5GB fiasco, I would have launched the R9 300 series the same day that news broke, but no, they just waited something like 7 months and then launched the same cards as the R9 200 series, just with 8GB and 3 or 4 fps more than their counterparts.

    Another example is announcing a product more powerful and cheaper than the Titan X, just with HBM, right after the launch of the 980 Ti. The 980 Ti was getting all the attention, while the Fury X is a card bought only out of curiosity, just like it was for 980 users.

    They were. If you look at the box of your card, it says "Ready for DirectX 12", and they were claiming that they worked closely with MS to get the best optimizations and performance.

    They even added whatever the **** FL 12_1 is, while all current AMD hardware only has 12_0.
     
    Last edited: Mar 16, 2016
  3. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    OK, hypothetically: 3 yrs down the line with the latest game the Fury X gets 30fps and 980Ti gets 28fps at low/medium settings at 1080p; is that "future-proofing" even meaningful anymore?

    Personally, I don't think it is.
     
  4. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    If the Fury X gets 45fps and the 980 Ti gets 27 fourteen months from now, it would be kind of a great deal, especially if the 390 was getting 28. I don't believe it will be that bad, since the top offering from NVIDIA actually has some hardware in it and didn't go completely cheap with 256-bit buses and pathetic 2048-shader configurations. But the 980? The 970? They are already not recommended; what do you believe their fate will be in, let's say, a year? Don't forget, games target consoles, so no matter the hardware you have on PC, you are getting a multiple of the PS4/Xbone experience, so that 980 SHOULD hold.

    Is anyone here trying to get my point? The "future" for GCN is literally now. There will be no more "patience" required. All game engines are tweaked to sh*ts for GCN on the consoles. This is just the first wave of PC ports hitting. By the middle of this year it will be a disaster for Maxwell (as it already is for Kepler), but people in here will believe it's somehow OK because Pascal is out, so everyone who shoved $400-$1000 at a GeForce card can go f*ck themselves because "that's how the market works".

    That's how the market works with a company offering the cheapest possible hardware at the highest possible price, while gouging everything they can from software developers. Do you know what their nickname is in developer houses? The Software Mafia. Google it if you don't believe me. How people can support this kind of anti-consumer behavior with "HURR DURR CAPITALISM TAKE MY MONEY" kind of "thinking" is beyond me. And as for the actual topic: no, Async Compute and DX12 will offer next to nothing for Maxwell.
     
    Last edited: Mar 16, 2016

  5. Barry J

    Barry J Ancient Guru

    Messages:
    2,803
    Likes Received:
    152
    GPU:
    RTX2080 TRIO Super
    AMD really needs DX12 and async compute. Nvidia does not need DX12, as DX11 works really well for them and their GPUs are fully utilised.

    AMD needs DX12 so they can utilise all of the GPU; it is a shame AMD didn't sort out their DX11 driver to give AMD users better performance.

    DX12 will give AMD users more performance due to poor DX11 drivers.

    DX12 will give Nvidia users almost nothing, as the GPUs are already being fully used.

    DX12 async compute is coming, but so is new hardware designed to use it better, from both AMD and Nvidia.

    Fury X is a disappointment, but the 390X is awesome and seems to be getting stronger.
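    The utilisation argument above can be sketched as a toy model (purely illustrative numbers I'm assuming, not real GPU scheduling): async compute only helps to the extent that the graphics workload leaves shader capacity idle for compute work to fill.

```python
# Toy model (not real GPU scheduling): async compute lets compute work
# fill idle shader cycles left over by the graphics workload.
# All numbers below are illustrative assumptions, not measurements.

def frame_time(graphics_ms, compute_ms, utilization, async_compute):
    """Estimated frame time in milliseconds.

    utilization: fraction of shader capacity the graphics workload
    keeps busy on its own (0..1).
    """
    if not async_compute:
        # Serialized: compute runs after graphics finishes.
        return graphics_ms + compute_ms
    # Overlapped: compute soaks up the idle bubbles during graphics;
    # any leftover compute extends the frame.
    idle_ms = graphics_ms * (1.0 - utilization)
    return graphics_ms + max(0.0, compute_ms - idle_ms)

# A GPU that is poorly utilised without async compute gains a lot...
print(frame_time(6.0, 2.0, 0.70, async_compute=False))          # 8.0 ms
print(round(frame_time(6.0, 2.0, 0.70, async_compute=True), 2))  # 6.2 ms
# ...while an already well-utilised GPU gains almost nothing.
print(round(frame_time(6.0, 2.0, 0.95, async_compute=True), 2))  # 7.7 ms
```

    Under this model, the better a driver already keeps the GPU fed in DX11, the less headroom async compute has to reclaim, which is exactly the claim being made about Nvidia vs AMD here.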
     
  6. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    The thing is, there is nothing to suggest that in a year's time there will be any daylight between the Fury X and the 980 Ti.

    It's nothing but wishful thinking from AMD fanboys.
     
  7. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    This could have been copy-pasted 15 months ago about the 290X and the 780 Ti.
     
  8. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    We get it, you hate Nvidia.
     
  9. Agonist

    Agonist Ancient Guru

    Messages:
    4,284
    Likes Received:
    1,312
    GPU:
    XFX 7900xtx Black
    But what he said is true.

    Nvidia cards don't age well; AMD's do.
    But you have to wait for AMD cards to kick ass, whereas Nvidia cards are fast as hell right out of the gate.

    This is from personal experience as well.

    I had GTX 670 SLI and GTX 970 SLI before my 290X CrossFire.

    I'm one of the rare ones that isn't a fanboy and loves both sides, but I do have a soft spot for Radeon GPUs because, in my opinion, they offer better value over time and are usually the underdog.

    The difference in DX12 in Rise of the Tomb Raider was huge when I paired a GTX 670 4GB and an R9 280 3GB with an Athlon II X4 630 @ 3.5GHz at 1440x900.

    The FPS gain with the R9 280 was huge at 21 fps in DX12 vs DX11.
    There was only a 10 fps difference on the GTX 670 in DX12 vs DX11.
    We all know how good DX11 driver performance is with Nvidia vs AMD.
    Both of these cards have been quite close and equal in performance over the years, but I have seen the 280 getting faster while the 670 has fallen off.
     
  10. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,886
    Likes Received:
    1,015
    GPU:
    RTX 4090
    You made me try to remember when the last time was that I spent $700 on a videocard... Was it a Voodoo 2?.. No, that was cheaper... You know, I think it has never happened! Even the current 980 Ti G1 I have, I got for $650.

    If you're trying to say that GCN card owners don't need to buy new videocards as often as NV card owners, then you're most likely wrong, because I know for a fact that you can still game on a 680 right now: it is able to run any new game on the market (hell, even a GTX 460 can, theoretically), and it won't be so much worse than the 7970, which was released at around the same time, as to be something completely unusable compared to the glorious 60 fps of the Radeon.

    As I've said already, in both cases you're looking at a subpar experience. Here are the results for the 680 and 7970 from the last batch of GameGPU benchmarks at 1080p (680/7970):

    1. Hitman: 24/30 (both are borderline unplayable)
    2. Need for Speed: 38/47 (same subpar experience basically; bad engine for NV)
    3. The Division: 43/62 (bad engine for Keplers as we've already figured out)
    4. Gears of War UE: 47/50 (even)
    5. Primal: 35/40 (basically unplayable; bad engine for all NV cards)
    6. Firewatch: 62/58 (even)
    7. Street Fighter V: 60/60 (even, as fps is locked on 60)
    8. Dying Light: The Following: 50/54 (even)
    9. XCOM-2: 41/44 (same subpar experience)
    10. RotTR: 21/27 (unplayable on both)
    11. Homeworld DoK: 72/74 (even)
    12. Dragons Dogma Dark Arisen: 93/95 (even)
    13. JC3: 40/47 (the same subpar experience)
    14. AC Syndicate: 35/38 (unplayable on both basically)
    15. SWBF: 48/61 (bad engine for all NV h/w)
    16. Rainbow Six Siege: 63/78 (bad engine for all NV h/w)
    17. DiRT Rally: 60/68 (same experience basically)
    18. Fallout 4: 38/39 (almost unplayable on both)
    19. CoDBO3: 38/54 (bad engine for Keplers)
    20. MGS5: 45/35 (subpar on both)
    21. Mad Max: 56/47 (bad engine for GCN)
    22. Batman Arkham Knight: 47/58 (a nice win due to +1GB of VRAM probably)
    23. The Witcher 3: 26/24 (both unplayable)
    24. PCARS: 35/31 (both unplayable)
    25. GTA5: 36/43 (both are almost unplayable)
    26. BFHardline: 58/69 (a nice third win in a FB3 title)

    So that's basically one year back. What do we have as a result?
    - 26 games in total
    - 8 (30.8%) are basically unplayable on both (<35 fps on average)
    - 7 (26.9%) provide the same subpar experience on both cards (36-59 fps on average)
    - 6 (23.1%) are equally playable on both (>60 fps)
    - 5 (19.2%) provide a measurably better experience on the 7970 than on the 680

    Out of those 5 games, 3 use the latest edition of the Frostbite 3 engine, which doesn't run that well in comparison on any NV h/w, so that's not really a Kepler architecture issue.
    1 is The Division, where Keplers aren't doing too hot.
    And the other 1 is CoDBO3, where Keplers are dying as well.
    So that's two games over the last year (7.7%) where the 680 specifically struggles compared to the 7970.
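    For what it's worth, the percentages in the tally above check out; the category counts are taken straight from the list of 26 games, and only the arithmetic is reproduced here:

```python
# Reproduce the percentages from the benchmark tally. The per-category
# game counts come from the post itself; only the arithmetic is checked.
TOTAL_GAMES = 26
categories = {
    "unplayable on both (<35 fps)": 8,
    "same subpar experience (36-59 fps)": 7,
    "equally playable (>60 fps)": 6,
    "measurably better on 7970": 5,
}

for name, count in categories.items():
    share = round(100 * count / TOTAL_GAMES, 1)
    print(f"{count} ({share}%) {name}")
```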

    And who cares that only 1/5 of last year's games actually provide any meaningful benefit to 7970 owners over the 680, when a ****ing ONE THIRD of all games from the previous year are basically unplayable on both cards equally, right? Unless you plan on playing just 2/3 of the games coming out this year, I think you'd better go and trade that 7970 of yours for some 390 or Fury card. A 680 owner is looking at the exact same numbers, really, with the only exception being that his card will provide a subpar experience in half of the playable titles, while a 7970 owner will have a subpar experience in 1/4 of all titles. This, however, should be easily fixable by lowering graphics settings on the 680. Yeah, a real "tragedy" for Kepler owners.

    Both the Fury X and the 980 Ti will go the way of the dodo once 14/16nm GPUs hit the market. Part of the reason for the longevity of AMD's cards lies in the fact that there has been no production process change since the launch of the 7970 and 680.
     
    Last edited: Mar 17, 2016

  11. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    You're just pointing out (as others have) how relatively poorly AMD did with DX11, not how Nvidia cards don't age well. I'm not sure why this never sinks in.
     
  12. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    No dude. Performance of course won't be as high as when these cards first released, but Nvidia has consistently provided better "day one" experiences.

    GoW: fewer problems on Nvidia
    Hitman: smoother on Nvidia

    This trend is going to continue despite these games being made on GCN systems. Nvidia's drivers are better, and even DX12 cannot change this. If you only look at fps scores, sure, AMD is doing fine. When you look at how the games actually run, especially on initial release... I'd have to give it to Nvidia.

    When you look at overall API support... same again, it's always been Nvidia.

    Look, Mantle turned into Vulkan... bam!!! Nvidia is on it already with beta driver support! Some things won't change.
     
  13. dgrigo

    dgrigo Guest

    Messages:
    17
    Likes Received:
    0
    GPU:
    TitanX
    I bought a Titan X after my 290X, and gave the 290X to my son.
    Seriously, Nvidia is a company that manipulates devs and users.
    But everyone on either side of the camp eats what both companies serve them,
    easily accepting what they promise or say, and the bad thing is that no tech site (magazine) ever questions them.
    They even go totally silent when they are told to do so.
    In my life I have changed something like 20 cards or more; of them, 3 were AMD (ATI), 3 Nvidia Quadros, 1 Evans & Sutherland, 2 3Dlabs, and the rest were Nvidia GeForces.
    I am far from being a fanboy of either vendor, but seriously, Nvidia is the company I hate most because of their practices and closed software tech.
    I agree Nvidia cards age very fast; GameWorks is there only to manipulate frame rates, software to manipulate everything from compute (CUDA) to OpenGL...
    They were the first to split the same processor into different segments just because of OpenGL, and when we could change the ID of the processor by moving one resistor on a GeForce and make it a true Quadro, they started producing separate chips. You wanted double precision? No! Nvidia is a greedy company, and they will pay for their greed as new technology gets invented.
    Their promise... "we will do async compute on the software/driver side"... where is it?
    ARK was about to release DX12 in August, but no... :p
    Microsoft kills Fable and Lionhead.. why? :p
    They want to be fast and implement GameWorks in DX12 so they can then come and say: see, we are faster than AMD.
    I think tech sites need to do more aggressive work and properly inform readers about what's going on.
    After last year's hype, the Microsoft/Nvidia presentation and the claim that they had already been working with Microsoft for two years was another big lie...

    Anyhow, I am sure nobody cares, as I don't care either.
    I'm just giving some food for thought.

    Will I buy Nvidia again? Sure I will, but I sure hate them for their practices.
    Will I buy AMD again? Sure I will, and I feel much better about them.

    PS: if this is the performance we get from DX12, nobody needs it; Nvidia and Microsoft failed, and nobody wants it on Windows 7 :p

    Edited for PS
     
    Last edited: Mar 17, 2016
  14. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    So you hate their practices, but have no problem buying a top-tier card?... lol. Many words come to mind which I will not post here.

    I sense you think the Lionhead closure had something to do with Nvidia? Good story, bro.
     
  15. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    I really don't get why people keep blaming Nvidia for ARK's delay when the ARK devs specifically said that both companies had driver issues. We know AMD has problems with the whole 60Hz/DirectFlip thing, not to mention performance issues/crashes in both Hitman and Gears of War. So it's not that far-fetched that the ARK devs ran into these problems, plus a slew of Nvidia ones, and those are the fixes they are waiting for.
     

  16. dgrigo

    dgrigo Guest

    Messages:
    17
    Likes Received:
    0
    GPU:
    TitanX
    You can say what you want; I already said I don't care. As for Fable Legends, I think Microsoft had enough publicity about how fast DX12 is with their partner, and for them (Microsoft) the spread of the OS is more important at the moment than bad publicity.

    Seriously, you can say what you want. As I said, everyone has his opinion, and it is respected as well.

    What's the problem with hate? You can hate your boss; will you leave the job? :p
     
    Last edited: Mar 17, 2016
  17. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    I don't even get what you're trying to say..

    Microsoft closed an entire studio because the API they are actively trying to push into everything was too fast? What?
     
  18. dgrigo

    dgrigo Guest

    Messages:
    17
    Likes Received:
    0
    GPU:
    TitanX
    My bad, I was being sarcastic in those lines..
    I'm saying that Microsoft didn't want bad publicity about DX12, so they closed the studio as fast as they could, just before the game got released.
    Bad publicity would have been if DX12 was not as fast as they had been promising at the start.
    The studios they closed were of course already on the list for their actions (if I am not wrong, they are European studios), but if the game's DX12 launch had been full of problems, that would have been bad publicity, and I am sure the whole Windows 10 carrot for most people to upgrade is games that run only on Windows 10 (DX12).
    Something that has bothered me since August 2015 is the total silence from everywhere about DX12, after all the hype from 2014 to August 2015.

    Anyhow, I said my personal opinion and why I hate Nvidia... whoever doesn't like it, there's no need to bash me; I've done nothing wrong but state an opinion.
     
    Last edited: Mar 17, 2016
  19. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    So, the game being crap had nothing to do with it?

    You're opening up a whole can of worms with your conspiracy theory here and implying Nvidia had something to do with it. Get real, bro.

    Whereas you see shady business practices from Nvidia, I've always thought it's good that they worked with everyone. Do you know how many devs have publicly stated that Nvidia helped them and AMD/ATi didn't even give a ****? And now, with DX12, suddenly AMD is the golden underdog? Don't make me laugh. Nvidia has a much better track record than AMD in terms of getting games to work. Don't even talk about OpenGL.

    Nvidia bought out Ageia many years ago, and GameWorks is basically the fruit of that continued labour. It's used across all systems and adds VISUAL quality to games. Where is the AMD equivalent? Where are the AMD industry-adopted effects?

    You must realise that AMD is in financial trouble. They must pull out all the stops and "get their **** together". Sneaking AC into DX12 was a big coup for them, and now their future rests on devs adopting their platform as primary dev systems.

    What do you think happens, though, when Nvidia has the full DX12 feature set? I think AMD will run out of things to champion...
     
  20. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    I posted Guru3D benchmarks that show much bigger differences than these, with even the 370 reaching 770 levels of performance. You cite whatever GameGPU benchmarks, with no links. And here we are. You still haven't said whether that 680 was a good choice or not.
     
