TressFX vs Physx

Discussion in 'Games, Gaming & Game-demos' started by brutlern, Mar 8, 2013.

  1. brutlern

    brutlern Master Guru

    Messages:
    295
    Likes Received:
    3
    GPU:
    Gigabyte 1070 X Gaming
Just read an article regarding TressFX on another site; in short, "TressFX will be fully playable on GTX680, Nixxes Working On Lara's Hair Collisions".
It seems AMD are letting nvidia run their tech. Why is that, given that nvidia will never, ever allow PhysX to run on AMD cards?

Also, Crystal Dynamics and Nixxes only realized they need to fix the nvidia issues because they don't want to alienate nvidia users from their game, but other devs who decide to use PhysX don't think twice about the potential customers (AMD owners) they are losing.

It's all a bit wonky, and it seems AMD are getting the short end of the stick no matter how you look at it.

    Can anyone shed some light on this situation?
     
    Last edited: Mar 8, 2013
  2. SLI-756

    SLI-756 Guest

    Messages:
    7,604
    Likes Received:
    0
    GPU:
    760 SLI 4gb 1215/ 6800
    I'm a fan of both; green and red cards each have their strong points and weaknesses. The VRAM on the 7000 series is a big win, for example, and PhysX on nvidia cards is a big win too (if you go for it, that is, and if you do, you generally need to shell out more for an nvidia card of course).
     
  3. KingpinZero

    KingpinZero Master Guru

    Messages:
    916
    Likes Received:
    2
    GPU:
    MSI Armor GTX1080
    TressFX uses DirectCompute, which basically makes it work on anything that supports that tech.

    Until now, Kepler cards have been severely crippled in DirectCompute and OpenCL calculations, due to nvidia not caring that much.

    All this TR fuss is making nvidia think twice, and besides Nixxes' outstanding support, they really need to fix this thing once and for all.

    As I see it, the TressFX tech is awesome because it's not tied to any brand, although it's an AMD-developed technology.

    This is proof of mature and responsible marketing/development: they could have locked the feature to AMD-only cards (as nvidia does), but instead it's open. Open to anyone whose hardware is powerful enough to run it.

    I just wish PhysX followed the same idea instead of being exclusive. I guess more devs would use the tech that way, instead of alienating a large portion of users.

    The same thing is happening in the tablet world: Tegra is getting a lot of exclusives when the same things could run on any other hardware.
     
  4. brutlern

    brutlern Master Guru

    Messages:
    295
    Likes Received:
    3
    GPU:
    Gigabyte 1070 X Gaming
    I would also like it if PhysX were open, but it's not going to happen, so what does AMD gain from allowing nvidia to use TressFX?
     

  5. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    ^
    PhysX is open; only HW PhysX isn't. But to tell the truth, AMD users aren't missing much. Don't be fooled by those flashy FX, it's just a gimmick, and on top of that it's still bottlenecked code.


    TressFX is a DirectCompute physics engine for hair rendering; it has no connection with nvidia's PhysX engine.

    Also, it's an open engine, so it runs on all DirectCompute-capable GPUs, well, those that can handle that many Tflops.

    Nvidia doesn't have anything like that in the real world, only in tech demos (Nalu, New Dawn and the hair tech demo from 2010), and those are also based on DirectCompute + tessellation.

    The only game that kinda uses nvidia's soft hair is EVE Online, but idk how it looks; I've only read an article about it once. It's based on APEX for simulation, though the base for its physics calculations is PhysX.
    Edit: kinda like in the 1st video, but it's very elastic. Imo almost too much.
    http://www.nvidia.com/object/apex_clothing.html

    TressFX ain't that much better either... I'm really curious how far they can improve it in the upcoming TressFX Tomb Raider update.
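    For anyone curious what "DirectCompute hair physics" boils down to: the general strand-simulation technique treats each hair as a chain of vertices integrated with a Verlet step, then re-enforces segment-length constraints each frame (TressFX runs this as compute shaders per strand). A rough CPU sketch in Python, with all names and constants made up for illustration, not taken from the actual TressFX source:

    ```python
    # Toy sketch of strand-based hair simulation (Verlet integration +
    # segment-length constraints), the general technique TressFX runs on
    # the GPU via DirectCompute. Constants/names are illustrative only.

    GRAVITY = -9.8
    SEGMENT_LEN = 1.0

    def step_strand(pos, prev_pos, dt=0.016, iterations=4):
        """Advance one hair strand (list of [x, y] vertices) by one frame."""
        # Verlet integration: velocity is implicit in (pos - prev_pos).
        new_pos = []
        for p, q in zip(pos, prev_pos):
            vx, vy = p[0] - q[0], p[1] - q[1]
            new_pos.append([p[0] + vx, p[1] + vy + GRAVITY * dt * dt])
        new_pos[0] = pos[0][:]  # root vertex stays pinned to the scalp

        # Constraint relaxation: pull each segment back to its rest length.
        for _ in range(iterations):
            for i in range(len(new_pos) - 1):
                ax, ay = new_pos[i]
                bx, by = new_pos[i + 1]
                dx, dy = bx - ax, by - ay
                dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
                corr = (dist - SEGMENT_LEN) / dist
                if i == 0:  # root is fixed: move only the free end
                    new_pos[i + 1] = [bx - dx * corr, by - dy * corr]
                else:       # otherwise split the correction between both ends
                    new_pos[i] = [ax + dx * corr * 0.5, ay + dy * corr * 0.5]
                    new_pos[i + 1] = [bx - dx * corr * 0.5, by - dy * corr * 0.5]
        return new_pos, pos
    ```

    Nothing in there is vendor-specific, which is exactly why it can run on any DirectCompute-capable card.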
     
  6. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    Maybe it would turn out worse for AMD if they made it a proprietary tech? Just a guess.

    Right now they've used a different tactic to gun for Nvidia. Instead of making it proprietary, they developed it to run under DirectCompute, which GK104 is known to be weak at. Hopefully more technologies like this will push Nvidia to keep compute performance strong, as it did up to the GTX580 and now with Titan (you can enable better compute performance at the expense of clockspeeds and boost).

    As the technology is open, more developers will be willing to adopt it, relationships with devs improve, and the AMD name gets advertised.

    Right now, it's a WIN-WIN situation for AMD to introduce TressFX the way it is. There are 2 possibilities in the future:

    1) Nvidia ups compute performance again, and the technology runs equally on both (if it is optimized properly and the two brands employ similar architectures)

    2) Nvidia competes on power consumption, temperatures, and high clocks, pushes gaming performance, and separates their workstation cards from their gaming solutions. This would leave Nvidia permanently crippled in techs that use compute.

    Option 2 would be very advantageous for AMD, but probably not of much interest to developers, who prefer a wide base that can access their graphics features.

    Option 1 would be advantageous for us as customers, as developers would be interested in pushing compute technologies that patch up some of the graphics deficiencies we currently face, such as hair animation (TressFX).
     
  7. CSF90

    CSF90 Guest

    Messages:
    208
    Likes Received:
    0
    GPU:
    Gigabye GTX 980Ti OC
    Is Fermi/Kepler's weakness with DirectCompute due to hardware design or driver optimisation? The recent announcement that TressFX will eventually run just as well on NVIDIA hardware would imply the latter?
     
  8. MrRixter

    MrRixter Member Guru

    Messages:
    197
    Likes Received:
    0
    GPU:
    AMD 295x2
    Nvidia offered ATI a PhysX license a long time ago. ATI refused.
     
  9. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    That's only partially correct; there were a number of other things involved, though I don't know the specifics of that deal either.
     
  10. eclap

    eclap Banned

    Messages:
    31,468
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
    Of course nvidia will care about DC performance from now on, granted there are enough good titles using TressFX; they don't want to look stupid in the charts. Makes sense. I think it's a good thing that something remotely tied to realistic physics came out and is available on both camps' cards. Not to sound rude, but PhysX is not a selling point imho; I can count the good games that use it on one hand.
     

  11. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    Hardware design. It's down to the ratio of double-precision (FP64) to single-precision performance. On GK104 Kepler, it's 1/24. On Tahiti, it's 1/4. On Titan, it's a whopping 1/3, equivalent to the Tesla K20X.

    Titan, however, ships with DP performance crippled like GK104's to allow for higher clocks and boost within the TDP. You go into the Nvidia control panel and enable the extra double-precision performance, sacrificing some clockspeed and GPU Boost 2.0, if I'm not mistaken.

    This also explains the Radeons' higher power draw compared to GK104 Kepler: the Radeons do not limit clockspeeds or disable boost to stay within TDP, while having double-precision performance unleashed.
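    To put those ratios in rough numbers (a back-of-the-envelope sketch using commonly quoted ballpark peak FP32 figures, not official measured specs):

    ```python
    # Back-of-the-envelope FP64 throughput from the FP32 peak and the
    # FP64:FP32 ratio. Peak FP32 numbers below are approximate ballpark
    # figures for illustration, not measured values.

    def fp64_gflops(fp32_gflops, ratio):
        """Peak double-precision throughput given the FP64:FP32 ratio."""
        return fp32_gflops * ratio

    cards = {
        "GTX 680 (GK104, 1/24)": fp64_gflops(3090, 1 / 24),  # ~129 GFLOPS
        "HD 7970 (Tahiti, 1/4)": fp64_gflops(3789, 1 / 4),   # ~947 GFLOPS
        "Titan (GK110, 1/3)":    fp64_gflops(4500, 1 / 3),   # ~1500 GFLOPS
    }
    for name, gflops in cards.items():
        print(f"{name}: ~{gflops:.0f} GFLOPS FP64")
    ```

    Even eyeballing those numbers, you can see why a 1/24 ratio leaves GK104 an order of magnitude behind in raw DP compute.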
     
  12. TheReproducer

    TheReproducer Guest

    Messages:
    3,102
    Likes Received:
    0
    GPU:
    Nvidia RTX 3080
    Level with me chief, did I make a big mistake getting a 660ti instead of a 7950?
     
  13. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    The only issue worrying me on the 660Ti is the potential memory bus issue.
    http://www.anandtech.com/show/6159/the-geforce-gtx-660-ti-review/2

    If it doesn't in fact turn out to impact future performance, or you'd be upgrading your 660Ti before the issue becomes real (if it ever does), then no, the 660Ti is a very good card otherwise.

    You'd get the benefit of Nvidia drivers and features as well, I suppose.

    I generally recommend the 7950 because it's faster overall, OCs like crazy, has more VRAM and bandwidth, and has no potential memory issue. It also comes bundled with great AAA titles: previously Hitman, Sleeping Dogs, Far Cry 3, and now 2 of these (don't remember), Crysis 3, Bioshock Infinite, Tomb Raider. In general, the 660Ti has weaker hardware than the 7950.

    The 7950 has some frame latency issues that are in the process of being solved (continuous improvements); the 660Ti doesn't. I wouldn't worry about the current frame latency issues on the 7950, it'll pass.
     
  14. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    Overclock the VRAM on that card and you will have increased performance; only worry about "potential issues" when you are faced with them.
     
  15. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    Yes, throw out the card you have bought and get a newer card. Just like Nvidia would like you to do.

    You'd have me believe OCing VRAM from 6GHz to 7GHz, a 16.66% increase at the most, would help overcome the 33.33% narrower memory bus?
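    The arithmetic is easy to check for yourself (a quick sketch; the bus widths and the 6 GHz effective GDDR5 clock are the stock figures for these cards):

    ```python
    # Memory bandwidth = (bus width in bytes) x (effective data rate).
    # 660Ti: 192-bit bus; 670/680: 256-bit bus; 6 GHz effective stock GDDR5.

    def bandwidth_gbs(bus_bits, data_rate_ghz):
        """Peak memory bandwidth in GB/s."""
        return bus_bits / 8 * data_rate_ghz

    stock_660ti = bandwidth_gbs(192, 6.0)  # 144 GB/s
    oc_660ti    = bandwidth_gbs(192, 7.0)  # 168 GB/s
    stock_670   = bandwidth_gbs(256, 6.0)  # 192 GB/s

    # A ~16.7% VRAM overclock narrows the gap but cannot close it:
    assert oc_660ti < stock_670
    ```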
     
    Last edited: Mar 17, 2013

  16. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    Me? Nah, these cards are running damn near perfectly; why would I want to throw them out?
     
  17. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    I'm not talking about your cards dude. 680s don't have any memory issues.

    Everybody worries about any potential issues they might face.

    If a specific brand had many of its cards dying after 1 year, would you buy those cards? They're doing fine now, why worry?

    Clearly that would be a much more serious issue, but not worrying about potential future issues is not a good idea.
     
  18. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    Which specific brand is dying after one year? I don't think anybody here on the forums would be buying that card.
     
  19. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    He's starting to sound like an AMD Marketing employee.....lol.

    A lot of people were scared away from the HD7800 series due to black screen issues....
     
  20. Mannerheim

    Mannerheim Ancient Guru

    Messages:
    4,915
    Likes Received:
    95
    GPU:
    MSI 6800XT
    PhysX is just a "wow effect" without any real physics, nothing fancy that couldn't be done by today's fast CPUs.

    Mainly it just adds objects to the game.
     