Will ATI respond to Nvidia's Experience?

Discussion in 'Videocards - AMD Radeon Drivers Section' started by vejn, Dec 14, 2012.

  1. vejn

    vejn Maha Guru

    Messages:
    1,002
    Likes Received:
    0
    GPU:
    MSI 7870 TF3
    Will there be some future ATI-only component enabled in the drivers, like PhysX?
    It would be nice if CCC had optimization settings like those in Nvidia Experience.
     
  2. The Mac

    The Mac Ancient Guru

    Messages:
    4,408
    Likes Received:
    0
    GPU:
    Sapphire R9-290 Vapor-X
    Fat chance, PhysX is wholly owned by Nvidia.
     
  3. Legendary_Agent

    Legendary_Agent Master Guru

    Messages:
    888
    Likes Received:
    0
    GPU:
    Asus HD7970 DirectCU II
    Frankly, I don't understand the fuss about PhysX. Everything it does can be done smoothly on a CPU when coded right; I don't see it as anything out of this world at all.
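    To illustrate the point: most of what PhysX adds (debris, particles, cloth points) comes down to integrating a lot of independent bodies each frame, which a CPU handles easily when the data is laid out well. A minimal sketch in plain C++ (hypothetical; not PhysX or any real engine's code):
    Code:
    // Structure-of-arrays particle update: cache- and SIMD-friendly.
    #include <cstdio>
    #include <vector>

    struct Particles {
        std::vector<float> px, py, pz;   // positions
        std::vector<float> vx, vy, vz;   // velocities
    };

    void step(Particles& p, float dt) {
        const float g = -9.81f;                      // gravity
        for (size_t i = 0; i < p.px.size(); ++i) {
            p.vy[i] += g * dt;                       // accelerate
            p.px[i] += p.vx[i] * dt;                 // simple Euler integration
            p.py[i] += p.vy[i] * dt;
            p.pz[i] += p.vz[i] * dt;
        }
    }

    int main() {
        const size_t n = 100000;                     // 100k particles
        Particles p;
        for (auto* v : {&p.px, &p.py, &p.pz, &p.vx, &p.vy, &p.vz})
            v->assign(n, 0.0f);
        for (int frame = 0; frame < 60; ++frame)     // one second at 60 fps
            step(p, 1.0f / 60.0f);
        std::printf("y after 1 s of free fall: %f\n", p.py[0]);
    }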
     
  4. nhlkoho

    nhlkoho Ancient Guru

    Messages:
    7,726
    Likes Received:
    347
    GPU:
    RTX 2080ti FE
    I haven't really played with Nvidia Experience too much, because when I installed it and ran a scan for my games, it found 2 out of the 70+ games installed on my machine.
     

  5. Black_ice_Spain

    Black_ice_Spain Ancient Guru

    Messages:
    4,555
    Likes Received:
    0
    GPU:
    970GTX
    PhysX is nothing special; it's just a matter of licensing, and not a great library IMO (except when developers get paid to include it and only it...). The real tech is CUDA, and ATI can also be programmed with OpenCL (see the sketch below), so....

    I think in PhysX's early days someone managed to run it on AMD hardware, but he was probably "contacted" by Nvidia (for good or for bad, I don't know).
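    Since OpenCL came up, here is roughly what "programming ATI with OpenCL" looks like in practice: the same style of kernel you would write in CUDA, built at runtime through the vendor-neutral OpenCL C API. A hypothetical minimal sketch (error handling and resource cleanup mostly omitted):
    Code:
    // One physics-style kernel, runnable on AMD or Nvidia GPUs alike.
    #include <CL/cl.h>
    #include <cstdio>
    #include <vector>

    static const char* kSrc = R"(
    __kernel void integrate(__global float* py, __global float* vy, float dt) {
        size_t i = get_global_id(0);
        vy[i] -= 9.81f * dt;      // gravity
        py[i] += vy[i] * dt;      // Euler step
    })";

    int main() {
        cl_platform_id plat; clGetPlatformIDs(1, &plat, nullptr);
        cl_device_id dev;    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, nullptr);
        cl_int err;
        cl_context ctx = clCreateContext(nullptr, 1, &dev, nullptr, nullptr, &err);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);
        cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, &err);
        clBuildProgram(prog, 1, &dev, nullptr, nullptr, nullptr);
        cl_kernel k = clCreateKernel(prog, "integrate", &err);

        const size_t n = 65536;
        std::vector<float> py(n, 0.0f), vy(n, 0.0f);
        cl_mem dpy = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                    n * sizeof(float), py.data(), &err);
        cl_mem dvy = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                    n * sizeof(float), vy.data(), &err);
        float dt = 1.0f / 60.0f;
        clSetKernelArg(k, 0, sizeof(cl_mem), &dpy);
        clSetKernelArg(k, 1, sizeof(cl_mem), &dvy);
        clSetKernelArg(k, 2, sizeof(float), &dt);
        clEnqueueNDRangeKernel(q, k, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
        clEnqueueReadBuffer(q, dpy, CL_TRUE, 0, n * sizeof(float), py.data(),
                            0, nullptr, nullptr);
        std::printf("y after one step: %f\n", py[0]);
    }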
     
  6. vejn

    vejn Maha Guru

    Messages:
    1,002
    Likes Received:
    0
    GPU:
    MSI 7870 TF3
    Can all PhysX games run on the CPU with an ATI card?
     
  7. atimaniac

    atimaniac Master Guru

    Messages:
    261
    Likes Received:
    0
    GPU:
    Palit 1060 3GB
    Yes, you only need to install the PhysX software, nothing else.
     
  8. Legendary_Agent

    Legendary_Agent Master Guru

    Messages:
    888
    Likes Received:
    0
    GPU:
    Asus HD7970 DirectCU II
    Yes, but there are some problems that make it stutter. I don't know why, but it's certainly not because my CPU can't handle a simple polygon animation.

    In fact, other PhysX games like NFS run entirely on the CPU. Frankly, I still prefer Havok: a much smoother experience overall, and it's done on the CPU too.

    In a world built on the personal computer standard, I find it disgusting to limit a feature to a single company rather than make it available to every company free of charge. Things like Ageia PhysX do nothing to contribute to that standard.
     
  9. nhlkoho

    nhlkoho Ancient Guru

    Messages:
    7,726
    Likes Received:
    347
    GPU:
    RTX 2080ti FE

    Nvidia offered it to AMD. AMD declined.
     
  10. alexrose1uk

    alexrose1uk Active Member

    Messages:
    78
    Likes Received:
    0
    GPU:
    R9 290 PCS+ 4GB@1150/1500
    I seem to remember it was for silly money, hence the talks fell through, especially as AMD was looking into more open OpenCL projects such as Bullet rather than something proprietary, available only under a restrictive license from its main competitor.

    Looking at Nvidia's history of not being great to work with, I'm not surprised they were declined.
     

  11. nhlkoho

    nhlkoho Ancient Guru

    Messages:
    7,726
    Likes Received:
    347
    GPU:
    RTX 2080ti FE
    No, I think Nvidia was offering it for free and ATI declined. PhysX is a free API for developers, so AMD would probably only have had to pay a small licensing fee to allow the tech on AMD cards; I doubt it would have been a large sum of money. Nvidia would then have a huge advantage in the physics arena, and I'm sure AMD doesn't want to give them that.
     
  12. Lane

    Lane Ancient Guru

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
    If you don't know what effect an optimization has, just leave it on "application controlled".

    I don't see the point of using premade settings like Nvidia Experience. It does nothing more than let the game choose the settings for you (most games already detect your hardware and use settings optimized for it, rarely accurately anyway).

    PhysX? There are two problems with any physics engine, proprietary or not:

    - Game developers already use the one built into the engine they work with... So unless you pay them really well, they will not adopt another one, as it brings more coding/licensing problems than benefits.

    - Bullet OpenCL exists, but even though it is used a lot in the cinema and professional 3D industry, it isn't used much in games. And even where it is, you won't necessarily know, since developers can use it however they want (include some of its code in their engine, or the whole thing).

    Havok had actually been rewritten by ATI to use GPU shaders, but Intel then bought Havok and blocked ATI's work on it.

    - About the whole "Nvidia offered it" story: there was never a real proposition. It was one guy from Nvidia who tossed out during an interview: "we believe in PhysX and we could even offer it to AMD if they want" (legend has it the CEO of Nvidia phoned some AMD guys to tell them they would)..
    For the record, the creator of Ageia/PhysX, who was in charge of PhysX and CUDA at Nvidia, is now working for AMD (the way Nvidia used PhysX and CUDA is the main reason he left Nvidia).

    In reality you can't run PhysX without using CUDA, so whatever the hardware, it won't run easily: the instructions Nvidia uses for PhysX do not work on AMD hardware (they are not standard instructions). Besides, there's no interest for AMD in using a technology whose development they don't control.

    In particular, PhysX is extremely slow when it isn't coded to run on the hardware in question.

    I don't think AMD wants to see Nvidia PhysX in every game when they can't control, or even license, the code and hardware features needed to make it run on their hardware.

    To give AMD the possibility of using PhysX, it would have to be completely rewritten at the source to use standardized instructions, libraries, and applications.

    When you see Nvidia using PhysX as a marketing selling point, you don't imagine they would want AMD users to get the best experience with it, had AMD the possibility to have it, and thereby destroy their marketing argument? It's not candy land.

    And on the AMD side, do you want them to promote a proprietary library/code/software for their opponent? No, of course not (without even having the right to touch the code or optimize it for their hardware).
    If Nvidia wanted to offer PhysX, they would do it by opening the code and library and pushing it into OpenCL, etc.

    Even if AMD got access to PhysX in games right now, how would it run? Would it be used by Nvidia as an argument to promote their hardware, like they have done in the past with some games?
    - Would AMD be able to code the driver, or would AMD need to give Nvidia all the info on their current and future hardware?
    - Would Nvidia play fair, or not? Would performance be ruined on AMD hardware?
    - Wouldn't Nvidia decide to put special instructions/code in it to differentiate "PhysX by Nvidia" from "PhysX by others"?
    - Would Nvidia ship the latest update of the library engine for AMD hardware, or bring the latest optimizations only to Nvidia drivers? Who would provide driver support, AMD or Nvidia? How would it tie into Nvidia's per-game driver optimizations, which are not possible in AMD's driver?
    - Would game developers optimize PhysX for CUDA shaders, or for AMD shaders too?
    - Would AMD need to make hardware changes in future generations for better support? And without access to the code, how would they do that?
    - PhysX on Nvidia cards doesn't simply use the library; there are licenses to obtain to make it run on GPU hardware. Would Nvidia sell/license them to AMD? (I doubt it.)

    - And wouldn't Nvidia simply decide, once 60% of games use GPU PhysX, to remove access for AMD GPUs?
     
    Last edited: Dec 14, 2012
  13. seb4771

    seb4771 Active Member

    Messages:
    81
    Likes Received:
    0
    GPU:
    R9 280x dcu II 1070/1600
    Hi all,

    Just a note: Borderlands 2 runs "well" with PhysX on AMD at MEDIUM, but it needs a good GPU + CPU.

    On my computer I play at roughly 35 fps with medium PhysX (i5 2500k @ 3.5 GHz + 8 GB + 5970 @ 850/1150). I choose to play with PhysX because the effects are very beautiful compared to playing without it.

    PhysX is not very "important" for playing; like DX11 tessellation effects, it adds new immersion to a game but needs a LOT more GPU/CPU resources.

    If an ATI auto-configuration program existed some day, there would be no chance for me to play Borderlands 2 with PhysX :(
     
  14. Lane

    Lane Ancient Guru

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
    The problem in Borderlands 2 is that even if the fps are acceptable at medium settings, they are still much lower than what you get running it on the GPU (without even mentioning that the game was already well optimized for Nvidia cards). It's because the CPU work is slowed down by the fact that PhysX is processed on threads that are already in use (simply using free CPU threads/time would be enough to increase performance).

    At medium PhysX you lose 30% of performance vs Nvidia cards.
    At high PhysX you get half the performance of the Nvidia cards (7970 vs 680: 33 fps vs 70 fps at 1920x1080).

    The higher the PhysX setting, the lower the CPU usage on the AMD system (the threads are stalling, so the CPU has to stop requests or keep them waiting). In reality, PhysX should use a dedicated thread on the CPU (basically like the audio in Dirt 3 or BF3, which uses its own core/thread, or like Havok does for physics). A sketch of that idea follows below.
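    A hypothetical illustration of that dedicated-thread idea in plain C++ (std::thread; not how PhysX is actually wired internally): the physics ticks on its own thread at a fixed rate, so the render loop never stalls waiting for it.
    Code:
    #include <atomic>
    #include <chrono>
    #include <cstdio>
    #include <thread>

    std::atomic<bool>  running{true};
    std::atomic<float> height{100.0f};   // shared sim state (one value for brevity)

    void physicsThread() {
        float v = 0.0f;
        const float dt = 1.0f / 60.0f;              // fixed 60 Hz physics tick
        while (running.load()) {
            v -= 9.81f * dt;
            height.store(height.load() + v * dt);   // publish result to renderer
            std::this_thread::sleep_for(std::chrono::milliseconds(16));
        }
    }

    int main() {
        std::thread sim(physicsThread);             // physics owns its own thread
        for (int frame = 0; frame < 30; ++frame) {  // "render" loop never waits
            std::printf("frame %2d, height %.2f\n", frame, height.load());
            std::this_thread::sleep_for(std::chrono::milliseconds(33));
        }
        running.store(false);
        sim.join();
    }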
     
    Last edited: Dec 14, 2012
  15. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    12,343
    Likes Received:
    475
    GPU:
    MSI 2070S X-Trio
    Experience is just a gimmick.
     
    Last edited: Dec 14, 2012

  16. mikeysg

    mikeysg Ancient Guru

    Messages:
    2,456
    Likes Received:
    167
    GPU:
    PC VEGA64 Red Devil
    I installed the nV Experience software and ran it; it did not detect a good majority of my games. It also set PhysX to Low for many of my games, and I have an FX8120 @ 3.7 GHz + 2x GTX670, whaddupwiddat?! Then I updated my driver and "poof", it was wiped from my system, and I couldn't reinstall it as the 10k beta tester quota had been met. Bah, I would rather set PhysX manually anyway. BTW, with my cards I should be able to max out 99% of current games @ 1080p, yet I recall it setting some games at less than max. I guess it was to ensure a better framerate, but if 2x GTX670 cannot do these games at max, I wonder what single-GPU users' settings would look like.

    Edit - Though I do not do it all the time, I do play games in S3D, which was the primary reason I built this SLI rig in the first place...... PhysX was NEVER a consideration. It is an "enhancement" I can do without.
     
    Last edited: Dec 14, 2012
  17. Legendary_Agent

    Legendary_Agent Master Guru

    Messages:
    888
    Likes Received:
    0
    GPU:
    Asus HD7970 DirectCU II
    I'm pretty sure they didn't offer it; they asked AMD for permission for Nvidia to develop the AMD drivers themselves, and we know relations between rival companies aren't skittles and unicorns...

    Offering would mean offering the SDK to that company, and as far as I know Nvidia certainly did not do that.
     
    Last edited: Dec 14, 2012
  18. hulawafu77

    hulawafu77 Member Guru

    Messages:
    191
    Likes Received:
    0
    GPU:
    7970M
    Uh, it wasn't that simple. It had strings attached, and AMD declined. Whatever it was, be sure it was in Nvidia's best interest. C'mon, don't be naive; Nvidia doesn't do anything unless it gives them an edge over their competitors. They are a public corporation, after all.
     
  19. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,860
    Likes Received:
    2,240
    GPU:
    5700XT+AW@240Hz
    I will sum it up shortly: PhysX was just a marketing gimmick for nVidia.
    If they wanted it to run smoothly on any system, they would have taken the OpenCL coding path.

    But then everyone would know that PhysX is far from demanding when faced with a regular quad-core CPU, and that today's generations of nVidia cards are inferior when it comes to compute.
    That would be bad for marketing, therefore nV keeps up the illusion that nV GPU >> CPU.

    The truth is, if I take the full OpenCL power of an i5-2500k @ 4.4 GHz vs an HD 7970 GHz vs a GTX 680, it ends up this way (higher is better):
    Code:
    i5-2500k : HD 7970 : GTX 680
    229k     : 1225k   : 295k
    This is from Luxmark, which does raytracing: a scene with 2M polygons, where the point is to calculate light/shadow accurately using an incredible amount of vector-based light scattering. Vectors are vectors.

    Now consider the biggest overlooked facts of all:
    - This is the peak compute performance of each piece of hardware.
    - 50-75% of a quad-core CPU sits idle in PhysX-based games.
    - Only a minor part of the nV GPU is used for PhysX; the rest goes to rendering the actual game.
    - The minimum number of unified shaders needed to run a PhysX game is 32 (the nV mobile 9600M GT), and it still renders the game while doing PhysX (poorly, but it does).
    - A mid-range nV GPU with a fraction of the shaders still runs PhysX games fluently.

    Taking this, we would have an i5 with 115k of free compute power at worst, while 25% of the GTX 680's 1536 shaders (384) would have a 74k performance rating. (A quick check of that arithmetic is sketched below.)
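    A back-of-the-envelope check of those two figures, using the Luxmark scores quoted above (my own rounding, not a benchmark):
    Code:
    #include <cstdio>

    int main() {
        const double cpu = 229e3, gtx680 = 295e3;      // Luxmark scores above
        double cpuFree  = cpu * 0.50;                  // 50% of the quad core idle
        double gpuSlice = gtx680 * (384.0 / 1536.0);   // 25% of the 680's shaders
        std::printf("free CPU compute : %.1fk\n", cpuFree  / 1e3);  // ~115k
        std::printf("384-shader slice : %.1fk\n", gpuSlice / 1e3);  // ~74k
    }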

    Now, seeing 2 CPU cores come out above 384 nV shader cores at math is plain weird. But the truth is that the 9600M GT could split 16 + 16 cores between PhysX and rendering.
    Another thing is that nV went the rendering way instead of the compute way with the 6xx series.
    The last point is that if nV used even as little as 15% of its resources for PhysX, it would automatically mean a 15% drop in framerate. That would be around 240 shaders, compared to the handful in the 9600M GT.

    Short summary: wait for SDK 3.xx games, where nV has promised full CPU multi-threading even if game coders do not invest any time in optimizing it (roughly the pattern sketched below).
    If those do not work smoothly, then PhysX is still just a marketing gimmick. (Which reminds me, I should check Planetside 2 again to see whether they enabled PhysX, since it is said to be on SDK 3.2.)
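    If that SDK 3.xx promise holds, the effect should look roughly like this hypothetical sketch (plain C++, not the PhysX API): the frame's physics work split into batches and fanned out across all hardware threads instead of fighting the game's main thread.
    Code:
    // Split N bodies into slices and update them on all hardware threads.
    #include <cstdio>
    #include <thread>
    #include <vector>

    int main() {
        const size_t n = 200000;
        std::vector<float> py(n, 0.0f), vy(n, 0.0f);
        const float dt = 1.0f / 60.0f;

        unsigned workers = std::thread::hardware_concurrency();
        if (workers == 0) workers = 4;               // fallback if unknown
        std::vector<std::thread> pool;
        const size_t chunk = n / workers;

        for (unsigned w = 0; w < workers; ++w) {
            size_t begin = w * chunk;
            size_t end   = (w == workers - 1) ? n : begin + chunk;
            pool.emplace_back([&, begin, end] {      // each thread owns one slice
                for (size_t i = begin; i < end; ++i) {
                    vy[i] -= 9.81f * dt;
                    py[i] += vy[i] * dt;
                }
            });
        }
        for (auto& t : pool) t.join();               // one frame's physics done
        std::printf("updated %zu bodies on %u threads\n", n, workers);
    }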
     
  20. chanw4

    chanw4 Ancient Guru

    Messages:
    2,293
    Likes Received:
    0
    GPU:
    ASUS GTX 670 4GB GDDR5
    What I find funny is that the Nvidia crowd keeps repeating this claim without any proof, and doesn't realize all the strings that would have been attached to it if it were true and AMD had accepted.
     
