AMD: NVIDIA PhysX Will Be Irrelevant

Discussion in 'Frontpage news' started by Guru3D News, Oct 3, 2009.

  1. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    For DX11 to take off, a lot of things have to happen.

    XP desktops have to become the minority - I still think that's a while off.
    ATI and Nvidia both have to support it 100% - That could go either way at the moment.
    PC games need to become nearly as profitable as console games - Not likely.
    Or ATI/Nvidia need to pay the developers to add DX11 features - A possibility, and ATI have got the ball rolling with Dirt2, but will they keep doing it?

    If not, we will have to wait until the PS4/Xbox720 or whatever it will be called.

    sykozis, I'm pretty sure any port of PhysX to ATI would more than likely be slower, but if the games keep building up, and with no other GPU physics competitor, ATI will surely feel the pressure.
    The Nvidia drivers have now been hacked, so you can use an ATI and an NV card together for PhysX, but it's hardly ideal for ATI that every PC could potentially have an NV card of some sort in it.
     
  2. buddyfriendo

    buddyfriendo Guest

    Messages:
    3,404
    Likes Received:
    5
    GPU:
    2070 Super
    Are you saying I can now buy myself a 9600GT for PhysX and use the latest drivers with this "hack"? If so, got a link?

    And would you guys say the 9600GT is the bare minimum for PhysX, or can I go lower than that, since it'll be processing nothing more than PhysX?
     
    Last edited: Oct 4, 2009
  3. Exodite

    Exodite Guest

    Messages:
    2,087
    Likes Received:
    276
    GPU:
    Sapphire Vega 56
    Fixed.

    You pay for something, you expect to get it; your original analogy fails at the basic level, as my graphics card is incapable of jealousy.

    Yet, anyway.
    Considering the quality of their PR decisions of late, that doesn't require any significant effort.
    There's no shortage of people who feel that nVidia is entirely justified in disabling HW PhysX when an ATI GPU is detected, due to some monetary argument I can't seem to understand logically, but at the same time there are others arguing that nVidia would help ATI get HW PhysX working on their GPUs out of the goodness of their hearts? o_O
     
  4. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    http://forums.guru3d.com/showthread.php?p=3297432#post3297432

    96 SPs is the bare minimum you can go with; preferably higher, though.

    A cheap used 8800GT would be ideal.

    It's not the card that feels the jealousy, it's the card maker, i.e. Nvidia.
    Come on, keep up!!

    My argument is that they want to win over the less tech-minded who won't work out hacks, or who are sitting on the fence over which card to get.
    It makes perfect business sense: risk a few bucks to make a lot more. It's only budget cards that the ATI crowd need for PhysX anyway, at least at the moment.

    No one said NV would help ATI with HW PhysX out of the goodness of their heart; they would obviously do it for a fee, this isn't a charity.
    If ATI saw sales of Nvidia cards going up around the time of games with GPU PhysX, and their own cards going down, then it would definitely get them thinking.
     
    Last edited: Oct 4, 2009

  5. ElementalDragon

    ElementalDragon Ancient Guru

    Messages:
    9,351
    Likes Received:
    30
    GPU:
    NVidia RTX 4090 FE
    Exodite: There is no arguing. It's FACT! Quite a while back, NVidia offered ATI the option of using PhysX..... as to whether it would have been free or not, I don't remember.... and ATI said no. Fact.

    I didn't read the entire thread..... only most of the first page and some of the 3rd... but I find some people's comments quite hysterical.

    Like those saying that they "know" PhysX will die off with DX11. Highly unlikely, seeing as not only are there several games coming out for PC using the PhysX engine.... but PhysX is also quite popular on game consoles as well. Or saying that "open standards" would be better. Uh huh.... and who exactly would MAINTAIN the system behind that "open standard"? Microsoft? We see how long it takes them to fix an issue with just about anything. You'd notice a glitch in the "standard".... and have to wait a month or a few until MS patched it, and pray they wouldn't screw something else up in the process.

    The one thing that really hit home though (and i mean on the laughter scale)....

    PCs are built around a set of standards compatible with all hardware? I could see that in a sense.... since, say, Windows and Linux work on pretty much any hardware. But you can't run a certain piece of hardware with a driver that isn't suited for it, nor can you run that piece of hardware if the drivers aren't compatible with a new OS. Just like you can't use certain pieces of hardware with other pieces of hardware. AMD and Intel both have their own socket types.... and hell, Intel seems to be releasing a new socket damn near every 4-6 months now. Is there really a need for a different socket for the i5 than there already was for the i7? It seems like every time a new CPU comes out, the previous standard is no longer the standard. LGA 775 had DDR2.... but some boards supported DDR3. LGA 1366 or whatever supports DDR3, same as the new socket for the i5s. Video cards eventually come out with different connection types, or a different number of pins per connection. It used to be that one or two 6-pin connections were enough.... but then "oh no, we need a 6-pin and an 8-pin connection".

    PCs set the bar for standards.... in that they probably have more "standards" than anything else in existence.
     
  6. salanos

    salanos Maha Guru

    Messages:
    1,301
    Likes Received:
    0
    GPU:
    GeForce GTX980 4GB (Ref.)
    I suddenly wonder: when will we see true 3D liquid physics that reacts to footsteps and rumbles, and ripples dynamically around objects inside it, instead of liquid bodies with predefined movement and a 2D sprite for a splash?
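    The usual building block for water that reactive is a heightfield ripple solver rather than a full fluid sim. Below is a minimal sketch in CUDA; the kernel name and parameters are hypothetical, and it implements the textbook discretized 2D wave equation, not any shipping engine's solver.

    ```cuda
    // Hypothetical heightfield ripple kernel: the textbook discretized 2D wave
    // equation, not code from PhysX or any shipping engine. Three height
    // buffers (previous, current, next) are ping-ponged each frame.
    __global__ void rippleStep(const float* prev, const float* curr, float* next,
                               int w, int h, float damping)
    {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x < 1 || y < 1 || x >= w - 1 || y >= h - 1) return; // skip borders

        int i = y * w + x;
        // Each cell accelerates toward the average of its four neighbours,
        // which is what makes disturbances spread outward as rings.
        float sum = curr[i - 1] + curr[i + 1] + curr[i - w] + curr[i + w];
        next[i] = (sum * 0.5f - prev[i]) * damping;             // damping < 1
    }
    ```

    A footstep or thrown object would simply subtract a displacement from the current buffer at its contact point; the expanding ripples and splash heights then fall out of the update itself instead of coming from a canned 2D sprite.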
     
  7. Fguillotine

    Fguillotine Member

    Messages:
    31
    Likes Received:
    0
    GPU:
    Xfx Gtx 285
    Try Cryostasis with patch 1.1. The dynamic water is really amazing. I wish it could have been included in Batman AA...
     
  8. Stukov

    Stukov Ancient Guru

    Messages:
    4,899
    Likes Received:
    0
    GPU:
    6970/4870X2 (both dead)
    Nvidia disabling PhysX when ATI card is detected is like Ford turning off your engine or headlights because you installed a non-Ford part in your car.

    I thought about buying a cheap Nvidia card to add PhysX support, since I have Windows 7 now; not that I need it or desire it greatly, but I thought it would be nice to have for a few extra bucks.

    Now that Nvidia has gone and pulled this, not only will I not be buying an extra card to try PhysX out, I simply won't buy another Nvidia card, period. When you buy a video card from Nvidia, they support that card, not the other parts in your system. What's next, the Nvidia graphics card shutting down if an AMD CPU is detected?

    Disabling PhysX when an ATI card is detected doesn't harm ATI users, it harms Nvidia's customers, because they still bought the Nvidia card in their system. There are laws against this kind of behavior when dealing with auto*******; I know that if I had purchased a card and they disabled a function because of a competitor's card, I would most certainly be contacting a lawyer and exploring my legal options.

    Edit* uh, why is automobi1es (as in cars) starred out?
     
  9. Fguillotine

    Fguillotine Member

    Messages:
    31
    Likes Received:
    0
    GPU:
    Xfx Gtx 285
    Maybe ATI should do their own drivers to run PhysX on ATI cards... I guess you want 3D Vision glasses working on ATI cards with Nvidia drivers too... :confused:
     
  10. Stukov

    Stukov Ancient Guru

    Messages:
    4,899
    Likes Received:
    0
    GPU:
    6970/4870X2 (both dead)
    Except there's a problem: the two aren't equal.

    PhysX works on any installed Nvidia graphics card, including a secondary adapter. Hell, it even works on the CPU.

    3D Vision glasses only work on Nvidia cards as the primary adapter.

    One works until Nvidia purposely disables it in the driver; the other only works with an Nvidia card as the primary graphics adapter (because an Nvidia card must be used to render the graphics). Your comparison fails because the two scenarios aren't equivalent. Not even close.
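    For reference, that hardware/software split was baked into the SDK itself: in the PhysX 2.x-era API, hardware vs. software simulation is a per-scene flag, chosen independently of which adapter renders the frame. A rough sketch from memory follows; treat the exact identifiers as approximate.

    ```cuda
    // Rough sketch of PhysX 2.x-era scene creation, from memory -- the exact
    // identifiers may be off. The point: HW vs. SW simulation is a per-scene
    // flag, unrelated to which adapter is doing the rendering.
    #include "NxPhysics.h"

    NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);

    NxSceneDesc sceneDesc;
    sceneDesc.simType = NX_SIMULATION_HW;        // try GPU (or PPU) first
    NxScene* scene = sdk->createScene(sceneDesc);

    if (!scene)                                   // no capable hardware found
    {
        sceneDesc.simType = NX_SIMULATION_SW;     // fall back to the CPU path
        scene = sdk->createScene(sceneDesc);
    }
    ```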
     

  11. Exodite

    Exodite Guest

    Messages:
    2,087
    Likes Received:
    276
    GPU:
    Sapphire Vega 56
    Exactly.

    Unfortunately for nVidia, the consumer market isn't inclined to respect the "feelings" of a for-profit company quite as much as those of a human partner.

    Thus, a sucky and nonsensical analogy.
    The thing that amuses me about this, in a sad sense mind you, is that you - and others with you - seem to be under the illusion that this implementation would have been cost-free for AMD, and would have resulted in a HW PhysX implementation for their GPUs that was competitive with nVidia's own offering.

    nVidia is removing support for HW PhysX if an AMD card is detected and blaming it on compatibility, stability and performance issues. You expect AMD to not only embrace similar circumstances but to actually pay their main competitor for the pleasure?

    And for nVidia to provide a fair implementation for competing hardware, at that.
     
  12. UnrealGaming

    UnrealGaming Ancient Guru

    Messages:
    3,454
    Likes Received:
    495
    GPU:
    -
    Actually, I don't have an ATI card, and after seeing what an amazing piece of technology the GT300 is, I doubt I will (at least not anytime soon). So I really don't care if driver "X" disables PhysX when an ATI card is detected. Maybe I would if I'd got a 5870, which I almost did (and then I saw the GT300 specs...).

    Well, Nvidia paid billions for PhysX technology. No one sane would expect ATI to get PhysX for free. And if they wanted their users, too (read: everyone), to have GPU-accelerated PhysX, they would certainly make a deal with NV. And since ATI and NV are, more or less, the only GPU makers, PhysX would pretty much become an "open" standard. And then, later on, it could be easily ported to OpenCL (CUDA and OpenCL are very similar anyway...).
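    To put the "easily ported" part in perspective: CUDA and OpenCL device code really is near-identical, so the kernel side of a port is mostly mechanical renaming. A toy example with hypothetical names; the OpenCL twin is shown in the trailing comment.

    ```cuda
    // Toy CUDA kernel with its OpenCL equivalent below -- hypothetical names,
    // just to show how mechanically the device code maps across the two APIs.
    __global__ void integrate(float* pos, const float* vel, float dt, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x; // OpenCL: get_global_id(0)
        if (i < n)
            pos[i] += vel[i] * dt;                     // identical body in both
    }

    /* OpenCL equivalent:
       __kernel void integrate(__global float* pos, __global const float* vel,
                               float dt, int n)
       {
           int i = get_global_id(0);
           if (i < n) pos[i] += vel[i] * dt;
       }
    */
    ```

    The host-side plumbing (contexts, command queues, buffer management) differs far more than the kernels do, which is where the real porting effort would go.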
    On the other hand, ATI could be going for their own thing: GPU-accelerated Havok, which they have demonstrated a few times. But, in their own words, "current GPUs don't have enough power to do both physics and graphics at acceptable framerates". And now that they have a 2.72 TFLOPS GPU... maybe they'll go for it. It would actually be crazy to start building a completely different physics engine when we already have the amazing Havok and PhysX.
     
  13. Covert

    Covert Maha Guru

    Messages:
    1,187
    Likes Received:
    0
    GPU:
    Asus HD 5770 1000/1300
  14. Cybermancer

    Cybermancer Don Quixote

    Messages:
    13,795
    Likes Received:
    0
    GPU:
    BFG GTX260OC (192 SP)
    m o b i l e s is filtered by our spam filter
     
  15. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    http://www.youtube.com/watch?v=r17UOMZJbGs&feature=player_embedded

    The majority of the consumer market doesn't give a sh1t what a company does as long as they get the product.

    Nvidia paid money for PhysX, so we are all assuming they would charge ATI too; they ain't some charity, it's a given that ATI would be charged money, and it doesn't have to be mentioned in every post.

    And yes, I do expect ATI to embrace and pay for it; they have come up with nothing to rival it, just all this talk about what "may" happen in the future, usually nonsense about gameplay physics while playing down eye-candy physics.
    Excuse me, Mr ATI, eye candy is your livelihood.
     

  16. GC_PaNzerFIN

    GC_PaNzerFIN Maha Guru

    Messages:
    1,045
    Likes Received:
    0
    GPU:
    EVGA GTX 580 + AC Xtreme
    Man, I heard about ATI's GPU physics back in the old days.

    Where is it? I still have my X800, but those empty physics promises still haven't been delivered. :/


    :cussing:
     
  17. gavomatic57

    gavomatic57 Member

    Messages:
    35
    Likes Received:
    0
    GPU:
    Palit Geforce GTX 275
    "ATI - The way it's meant to be promised"???
     
  18. Kaleid

    Kaleid Ancient Guru

    Messages:
    2,826
    Likes Received:
    353
    GPU:
    7900xt
    PhysX, and the fact that you can only get it through one vendor, is one of the reasons why PC gaming is heading closer to irrelevancy.

    Very, very few people care about having two graphics cards in the first place, and the number who do so for physics is even lower.

    It should have been an open standard from the very beginning, with no license cost, through DirectX. Or run on multicore CPUs, which everyone will have in the end.

    I'm not at all interested in PhysX and expect the format to die. I want more effort put into AI instead.
     
  19. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    Poor sales compared to the Wii/360/PS3, and developers not wanting to support the platform, are why PC gaming is heading closer to irrelevancy; no other reason.

    Microsoft withdrew the PC version of Alan Wake so they could get the 360 one out quicker.
    EA thinks the PC is only worthy of PS2 ports of some sports games, and shouldn't get all of them.
    Eidos and Codemasters had to be paid to add PC-specific features to Batman and Dirt2.

    And you could add a list of another dozen developers who have released console ports that look the same as the 360 version, but have a resolution option.

    One card supporting a standard is better than zero cards supporting it, in my opinion.
     
  20. Stukov

    Stukov Ancient Guru

    Messages:
    4,899
    Likes Received:
    0
    GPU:
    6970/4870X2 (both dead)
    Yeah, but why? I don't get it.
     
