What games actually use the multi-threaded, SSE-enabled PhysX 3.0?

Discussion in 'Videocards - AMD Radeon Drivers Section' started by hulawafu77, Mar 28, 2013.

  1. hulawafu77

    hulawafu77 Guest

    Messages:
    191
    Likes Received:
    0
    GPU:
    7970M
    Remember in 2010 when Batman was released, and it became excruciatingly obvious that the PhysX code was so old it was still running x87 instructions, and Nvidia said it was too much work to compile it for SSE? (A terrible excuse.) They also claimed that game developers did it on purpose, since they only care about consoles and not PC, and specifically compiled the x87 path instead... I find it hard to believe that game developers would choose to screw over PC on purpose.
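    (For illustration: whether scalar float code ends up as x87 or SSE is purely a build-time decision. A sketch with GCC-style flags; the integrate function is a made-up example, not actual PhysX code.)

    Code:
        // The same scalar math compiles to x87 or SSE depending only on compiler flags,
        // e.g. with 32-bit GCC:
        //   g++ -m32 -mfpmath=387        physics.cpp   // legacy x87 FPU instructions
        //   g++ -m32 -msse2 -mfpmath=sse physics.cpp   // scalar SSE instructions
        float integrate(float pos, float vel, float dt)
        {
            return pos + vel * dt;   // same source either way; only the generated code differs
        }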

    Well, Nvidia said to wait for PhysX 3.0: since the previous code was so old, they rewrote it. I couldn't help but notice that in 2012, Borderlands 2 still used the old x87 version of PhysX. And every other game I've played that uses PhysX has run like crap too: Planetside 2, and now Warframe as well.

    I'm curious: what games actually use the PhysX version that Nvidia claims was not purposely crippled on PC? Where are these games? Where are these games in which, as Nvidia claims, PhysX is CPU-friendly?
     
  2. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,409
    Likes Received:
    3,077
    GPU:
    PNY RTX4090
    Why would Nvidia open up their technology (which they purchased for a nice price) to other hardware manufacturers?

    They want to sell as many GPUs as they can, and having it run only on their hardware, and run properly only on their hardware, will sell more GPUs.

    Stick to turning off PhysX altogether; it's a pointless addition that brings nothing new or groundbreaking to the table.

    Until there is a unified physics engine that can run on ANY CPU and ANY GPU, we will have to make do with what we have.
     
  3. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    NVidia didn't purposely cripple PhysX at all; the x87 code came from Ageia. NVidia bought Ageia and its intellectual property (which included PhysX) in 2008. Ageia intended PhysX to run on a dedicated physics processing unit, with the CPU path as a fallback compiled as x87 code.

    I cannot find any list that shows which PhysX version each PhysX-supporting game uses.
     
  4. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    We already have Bullet Physics... why do we need a second engine to do the same thing? Bullet will run on any Intel CPU, any AMD CPU or GPU, and any NVidia GPU... and its GPU support is OpenCL-based.
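    For anyone curious, the CPU side of Bullet takes only a few lines of standard C++ to set up. A minimal sketch (the class names are Bullet's usual ones; the gravity value and step count here are arbitrary):

    Code:
        #include <btBulletDynamicsCommon.h>

        int main()
        {
            // Standard Bullet rigid-body world setup (plain CPU path)
            btDefaultCollisionConfiguration config;
            btCollisionDispatcher dispatcher(&config);
            btDbvtBroadphase broadphase;
            btSequentialImpulseConstraintSolver solver;
            btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);

            world.setGravity(btVector3(0, -9.81f, 0));

            // Step the empty world at 60 Hz for ten seconds of simulated time
            for (int i = 0; i < 600; ++i)
                world.stepSimulation(1.0f / 60.0f);

            return 0;
        }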
     

  5. hulawafu77

    hulawafu77 Guest

    Messages:
    191
    Likes Received:
    0
    GPU:
    7970M
    And what games use Bullet Physics, other than the useless 3DMark benchmark?

    And yes, it would seem Nvidia did. Since everyone dropped x87 and deprecated it, modern compilers use SSE automatically; Nvidia had to deliberately keep setting flags to make the CPU path use x87. That was the big issue: not Ageia, but Nvidia. And Nvidia did rewrite PhysX for their useless CUDA, so it would run on their boards rather than on a separate board as Ageia's did. It would seem Nvidia had a lot more to do with un-optimizing PhysX than Nvidia lets on.
     
  6. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    Bullet is used in the RAGE engine (GTA4), etc., and some other more indie-like games. Kinda sad it's not more widely used...

    As for PhysX 3, well, it's still crippled and needs more optimization. All that fancy tech talk didn't do much:
    https://developer.nvidia.com/physx-sdk-v31
    Maybe 10-15% faster compared to PhysX 2.


    Games with PhysX 3:

    Warframe
    The Secret World
    Arma 3?
    PlanetSide 2?
    Hawken?

    Any UE4 game in the future
     
  7. hulawafu77

    hulawafu77 Guest

    Messages:
    191
    Likes Received:
    0
    GPU:
    7970M
    Thanks Tj. Yeah, I noticed that Warframe has PhysX 3 files. It's really amazing how little PhysX does in that game and how much damage it does to performance. Solo it's fine, but once you have three other players using their skills with lots of adds, the framerate tanks on my machine: it'll be running around 150 FPS, then it drops to 30 FPS. It's infuriating how bad PhysX is. I don't have this problem in Havok games.
     
  8. trocio2

    trocio2 Guest

    Messages:
    484
    Likes Received:
    0
    GPU:
    GT 630 1GB DDR3 GK208 Kep
    I hate PhysX.
     
  9. Lowki

    Lowki Master Guru

    Messages:
    631
    Likes Received:
    14
    GPU:
    RX 7800 Xt
    Nvidia might like it to run on other hardware if they got paid royalties for every game that uses PhysX.
     
  10. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    Here it doesn't drop that much; idk, from an average of ~90 fps to 45-50ish. But when the fps drops, GPU usage also drops to ~50% while CPU usage stays the same... It's like the code doesn't know how to properly utilize GPU resources, and instead of maximizing GPU usage it stalls it with unwanted calls.

    Same thing with UE3 streaming :infinity:
     

  11. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Use of the PhysX 3.xx SDK means nothing, as Planetside 2 has proven.
    Maybe the PhysX 3.xx SDK is optimized to run flawlessly on a CPU, but Planetside 2 does not let you enable CPU PhysX for any particle-related effects.
    It's only used for vehicle movement, and to be honest that is one part of the game which is not impressive.

    And to add a bit more ugliness: my little i5 can handle 20k fluid particles at 120 iterations per second.
    Planetside 2 in huge battles uses just a few thousand, and the engine is limited to 30/60 iterations per second depending on the type of effect. So there is no reason why even the old PhysX, if it were multi-threaded, could not handle complete CPU PhysX delivery for Planetside 2.
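    To put those numbers side by side: 20,000 particles at 120 iterations per second is roughly 2.4 million particle updates per second, while a few thousand particles at 60 iterations per second is on the order of 200,000, well under a tenth of what that one mid-range CPU already handles.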

    And that's the problem! Before, the excuse for not offering CPU PhysX was that no multithreading was implemented. (What kind of developer would decide to implement a feature that does not work? It took a lot of pushing.)
    Now that the demand for out-of-the-box multithreading has been fulfilled, AAA developers skip CPU PhysX completely?

    Even though I want to support Arma III, I will not buy it, because it uses PhysX. nVidia lost me over that software a long time ago, and when the info about the 3.xx SDK came out I was really optimistic, with high hopes.
    Those hopes are now in the dump, where they should have been from the start.

    Anyone who played TR3 could notice that even without a dedicated physics engine like Havok/Bullet there was a partially destructible world, and it gave a 10x more realistic impression than the PhysX particle generator, which just pops material out of nothing without removing pieces from anything.
     
  12. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    Well, PlanetSide 2 now uses APEX Turbulence, which uses PhysX 3(?) as a backbone. Idk, I didn't play it after they implemented these APEX particles.

    Actually, I deleted the game because I hated its crappy performance, and from what I've read it's still crappy even on much better systems than mine.
     
  13. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Yes, after a long time they re-enabled the PhysX effects, but only on the GPU. NO way to run them on the CPU.
    Not on a system with an AMD GPU, anyway. Maybe on an nVidia system it is possible to run them on the CPU :D
    Platform-locked stuff, not really for me.
     
  14. Mr Terry Turnip

    Mr Terry Turnip Guest

    Messages:
    423
    Likes Received:
    0
    GPU:
    MSI 1060 Gaming X
    Nvidia are manipulative rip-off artists, not an honest business who gives a rat's tit about their consumers.

    EVERYTHING just boils down to that.

    And the many people who (always, it's the same people) jump all over me to defend Nvidia are quite simply blind to it.


    Everything is a gimmick/trick/excuse to BS more money out of us, and most of it is done under the table.

    Quite frankly, if I were not constantly laughed off this board I would fully expect an (Nvidia) sniper to take me out from the window opposite, then bury me in PhysX rubble.

    Luckily Nvidia can clearly see I am UNABLE to damage their reputation amongst these hardcore fanatics, so Nvidia have no need to take action.

    So yeah, how's everybody doing today? :banana:
     
    Last edited: Mar 28, 2013
  15. Spets

    Spets Guest

    Messages:
    3,500
    Likes Received:
    670
    GPU:
    RTX 4090
    The misinformation and hate bandwagon in this thread is hilarious.
     

  16. NiColaoS

    NiColaoS Master Guru

    Messages:
    719
    Likes Received:
    75
    GPU:
    3060 12GB
    The problem is, many games do not even run without PhysX, even on Low. So you do need PhysX installed, or at least the appropriate DLLs, which the game provides. Those devs get paid well to be lazy and avoid implementing physics without the use of PhysX. They should provide both options, with and without... but...
     
  17. Reclusive781

    Reclusive781 Ancient Guru

    Messages:
    2,601
    Likes Received:
    1,040
    GPU:
    RX 6700(non-xt)
    Oh please explain.
     
  18. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    Both AMD and Intel processors will still execute x87 instructions; the problem is that the x87 PhysX code path runs in a single thread.

    The SDK was written by Ageia to use the x87 instruction set specifically. nVidia had nothing to do with PhysX using x87. nVidia only made the modifications necessary for PhysX to run on their GPUs through CUDA. PhysX was never, at any time, even remotely "optimized" in the least bit for running on a CPU.

    As for the games using Bullet... it's a pretty pathetic list...

    Source

    These are ALL of the games I could find listed as using Bullet Physics, and the list comprises console, PC and other titles.
     
    Last edited: Mar 29, 2013
  19. meatloaf2

    meatloaf2 Guest

    Messages:
    16
    Likes Received:
    0
    GPU:
    MSI Twin Frozr 3 7850 2GB
    I am not a fan of proprietary tech that runs like crap on my system (AMD). I honestly don't mind games like Arkham City that have "extra" PhysX stuff; I just turn it off.

    However, there are games where PhysX is the primary physics engine and cannot be turned off. That is incredibly annoying, and I think it falls more on the developer than on NVIDIA. I'm sure there are cases where NVIDIA pushes the use of PhysX, but in the end it is the developers who are at fault here, unless NVIDIA has more control than I think they do.

    All I can say is that NVIDIA needs to put fewer CUDA cores in its GPUs and focus on OpenCL support, like AMD. I know CUDA has its place in the production environment for rendering and such, but I think it should stay there. Have professional cards for that, and take CUDA completely out of consumer cards. I'm sick of gimmicks that make games unplayable for me.

    And get rid of PhysX...
     
  20. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Just to add GRAW:
    http://www.youtube.com/watch?v=InTZ6MhgzeE
    This is from the time when PhysX was Ageia's. It was about a few particles here and there and a lot about real-world physics/destruction.

    If you still have this game, just try it on your CPU; it will be single-threaded x87 but it will work perfectly on modern HW.
    And the experience is much better than today's PhysX games, where PhysX is limited to cloth simulation and particle effects from shooting walls.

    At the time nV got it, there was support for multi-threaded CPU x87, which would have provided sufficient resources to do all that stuff in real time.
    But would you buy something just for marketing purposes and then allow developers to turn it against you and make it run perfectly on Intel/AMD products?

    And as the OP wrote, SDK 3.xx should automatically be able to multi-thread and use SSE instructions. We will just have to wait for games that show how true that statement from nV was.
    (And nV stated that developers may decide to turn off multi-threading at will... what a funny warning.)
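    For reference, in the 3.x SDK the CPU threading is literally a parameter the developer picks when creating a scene. A rough sketch of the usual setup (the thread count of 4 is arbitrary, and createScene is just an illustrative wrapper, not actual game code):

    Code:
        #include <PxPhysicsAPI.h>
        using namespace physx;

        // Sketch of PhysX 3.x scene creation; the number of worker threads handed
        // to the default CPU dispatcher is entirely up to the developer.
        PxScene* createScene(PxPhysics& physics)
        {
            PxSceneDesc desc(physics.getTolerancesScale());
            desc.gravity = PxVec3(0.0f, -9.81f, 0.0f);

            // 4 worker threads here; passing 0 runs everything on the calling thread.
            desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4);
            desc.filterShader  = PxDefaultSimulationFilterShader;

            return physics.createScene(desc);
        }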
     
