Ethics-wise, has AMD done anything wrong?

Discussion in 'General Hardware' started by Espionage724, Nov 4, 2013.

  1. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    All 3 run pretty even in the ethics department. Like I said, they all do whatever they deem necessary to gain any advantage they can get.
     
  2. Pill Monster

    Pill Monster Banned

    Messages:
    25,211
    Likes Received:
    9
    GPU:
    7950 Vapor-X 1100/1500
    ^I'm not being stubborn, I'm crusading for the newbie who builds his first bang-for-buck BF4 system based on performance tests he saw on TechSpot. That's who I care about... not the websites themselves. ;)

    One of my pet peeves in this industry is misleading information; end users should know... deserve to know the truth as it applies to them.


    And yes, I agree 100% on hardforum... you read my mind... and they never test apples to apples. :nerd:
    I'm not jumping down any throats, btw... so don't think that. You can tell because I'm not swearing.. lol :p
     
    Last edited: Nov 6, 2013
  3. IcE

    IcE Don Snow

    Messages:
    10,693
    Likes Received:
    79
    GPU:
    3070Ti FE
    In the defense of the single player benchmark, a lot of sites don't like to use multiplayer mode because it's very difficult to get every test to be the same. It's not about misinformation, it's about ensuring consistency.
     
  4. Espionage724

    Espionage724 Guest

    So far, the only thing I've seen against AMD was their apparent benchmark cheating (I'd still like a source on this, though). Although that's not good, I'd have to rate it as pretty minor in comparison to bribing OEMs to slander a competitor's product and then dropping support for said products (Tier 0).
     
    Last edited by a moderator: Nov 6, 2013

  5. Pill Monster

    Pill Monster Banned

    Messages:
    25,211
    Likes Received:
    9
    GPU:
    7950 Vapor-X 1100/1500
    The CPU test doesn't need to be consistent if the sole aim is to give players an idea of hardware requirements (as opposed to a processor comparison).


    If TS can't do a reasonable MP test like sweclockers and other respected HW sites manage to, then the CPU bench shouldn't be included at all.
     
    Last edited: Nov 6, 2013
  6. Apatch

    Apatch Guest

    Messages:
    55
    Likes Received:
    0
    GPU:
    Asus Strix GTX1080 OC

    Then read this http://www.phoronix.com/scan.php?page=news_item&px=MTQ4MDE

    Do you remember AMD claiming for several years that OpenCL would be the new PhysX? Yeah, you just need to wait a few years, and keep buying our products. Mantle could end up the same way.

    To me, both firms look like liars; they just want to sell their products.
    Imagine them playing good cop, bad cop ;)

    I wouldn't be surprised if the high prices of mid/high-end graphics cards so far were the result of AMD/NVIDIA collusion. Weak sales have forced prices lower now, but for how long?

    So anyone who thinks one firm is better than the other is just a sheep.
     
  7. nhlkoho

    nhlkoho Guest

    Messages:
    7,754
    Likes Received:
    366
    GPU:
    RTX 2080ti FE
  8. Darkest

    Darkest Guest

    Messages:
    10,097
    Likes Received:
    116
    GPU:
    3060ti Vision OC V2
    I'll wait for a more reputable source before believing that, tbh. Toms Hardware is a joke.
     
  9. nhlkoho

    nhlkoho Guest

    Messages:
    7,754
    Likes Received:
    366
    GPU:
    RTX 2080ti FE
    I never noticed your avatar before and I almost spit my drink all over my keyboard.
     
  10. Darkest

    Darkest Guest

    Messages:
    10,097
    Likes Received:
    116
    GPU:
    3060ti Vision OC V2
    Haha. Ramsay Snow is brilliant.
     

  11. Espionage724

    Espionage724 Guest

    Ah, yeah, I've noticed that HDMI thing too (noted in the first post), although I hear mixed reasoning for it.

    I don't recall the OpenCL thing, but such a claim probably wouldn't be ideal to make, considering developers who used PhysX instead probably got big monetary incentives to do so... Not to mention Unreal Engine 3 (which a lot of games used) has PhysX pretty deeply integrated, which would give developers all the more reason to use it.

    No need for name-calling :) I still find my brand preference for AMD pretty justifiable.
     
  12. nhlkoho

    nhlkoho Guest

    Messages:
    7,754
    Likes Received:
    366
    GPU:
    RTX 2080ti FE
  13. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    No idea if AMD lied about it or just failed, but they did spend years promising OpenCL GPU physics.

    Even if Nvidia did pay devs to use PhysX, there is nothing ethically wrong with that; it never made the games worse for AMD users, just "better" for Nvidia users.
    As for UE3, that used PhysX a good while before Nvidia bought it.

    In my experience, people pick and choose what bad publicity to believe or disbelieve based on their brand preference, and not the other way round.
     
  14. Espionage724

    Espionage724 Guest

    From what I understand, though, OpenCL could be used for GPU physics if someone wanted to write an engine for it (the only such engine I'm aware of is Bullet).

    I'm open to seeing what bad stunts AMD pulled, but so far the stuff I'm hearing isn't anywhere close to what Intel and NVIDIA have done, in my opinion anyway.

    That would depend on whether NVIDIA really chose to keep CPU-accelerated PhysX crippled. Seeing as PhysX is a selling point for their GPUs, I could see them keeping the CPU experience terrible, if not actively making it worse over time. Not to mention their effort in preventing Hybrid PhysX...
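    [Editor's note: the GPU-physics workload being discussed here is "embarrassingly parallel," which is why engines like Bullet target OpenCL and PhysX targets CUDA. A minimal illustrative sketch in plain C, not actual Bullet or PhysX code; in an OpenCL kernel each loop iteration below would be one work-item:]

    ```c
    #include <assert.h>
    #include <stdio.h>

    #define N 4

    /* One explicit-Euler integration step for N particles.
     * In an OpenCL kernel the loop body would run once per
     * work-item, with i = get_global_id(0); every iteration is
     * independent, which is why physics maps well to GPUs. */
    static void integrate(float pos[N], float vel[N],
                          const float acc[N], float dt)
    {
        for (int i = 0; i < N; i++) {
            vel[i] += acc[i] * dt;  /* accelerate */
            pos[i] += vel[i] * dt;  /* then move  */
        }
    }

    int main(void)
    {
        float pos[N] = {0};
        float vel[N] = {0};
        const float acc[N] = {-9.8f, -9.8f, -9.8f, -9.8f}; /* gravity */

        integrate(pos, vel, acc, 0.5f);
        printf("pos[0] = %.2f, vel[0] = %.2f\n", pos[0], vel[0]);
        assert(pos[0] < 0.0f && vel[0] < 0.0f);
        return 0;
    }
    ```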
     
  15. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    The GPU stuff was added on top of the existing CPU effects; not once was CPU-accelerated PhysX crippled.

    What people forget is that the GPU effects were "crippled" on Nvidia GPUs as well; essentially, Nvidia used CUDA to emulate an Ageia PPU.

    They did try to use it to their advantage, but that is just business, nothing unethical about it; AMD are doing the same thing with Mantle now.
     

  16. Darkest

    Darkest Guest

    Messages:
    10,097
    Likes Received:
    116
    GPU:
    3060ti Vision OC V2
    It's a better source for sure.

    Which proves my point regarding Toms Hardware.
     
  17. nhlkoho

    nhlkoho Guest

    Messages:
    7,754
    Likes Received:
    366
    GPU:
    RTX 2080ti FE
    Nvidia didn't cripple CPU PhysX. The original API from Ageia used x87 instructions long before Nvidia bought them.

     
  18. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    Yikes, is this "crippled x87" BS still going on?
     
  19. mbk1969

    mbk1969 Ancient Guru

    Messages:
    15,645
    Likes Received:
    13,647
    GPU:
    GF RTX 4070
    If I remember right, x87 refers to the old floating-point instruction set, which is part of x86 now.

    Edit:

    When was the 80486, and when was PhysX? Why is CPU PhysX labeled x87? Has anybody seen the CPU PhysX sources? Do they actually use x87 floating-point instructions?
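    [Editor's note: the distinction being debated is one of code generation, not source code. A minimal sketch, assuming GCC-style flags: the same scalar C math compiles to legacy x87 FPU instructions on 32-bit x86 by default (`-mfpmath=387`), or to SSE scalar instructions with `-mfpmath=sse -msse2`. The "x87 PhysX" argument is about which of these the old CPU path was built with.]

    ```c
    #include <assert.h>
    #include <stdio.h>

    /* Scalar single-precision math of the kind early PhysX did.
     * On 32-bit x86 this becomes x87 code (fld/fmul/faddp) by
     * default; with -mfpmath=sse -msse2 (or on x86-64, where SSE2
     * is the default) it becomes SSE scalar code (mulss/addss). */
    static float dot3(const float a[3], const float b[3])
    {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    int main(void)
    {
        const float a[3] = {1.0f, 2.0f, 3.0f};
        const float b[3] = {4.0f, 5.0f, 6.0f};
        float d = dot3(a, b); /* 4 + 10 + 18 = 32 */
        assert(d == 32.0f);
        printf("dot = %.1f\n", d);
        return 0;
    }
    ```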

    Edit: OK, I've found the source of that x87 label.
     
    Last edited: Nov 7, 2013
  20. Espionage724

    Espionage724 Guest

    I'm aware of this now (it was pointed out earlier); I was saying that NVIDIA could have left it in x87 as long as they did just to make the GPU experience shine more. But really, that's just conspiracy talk :p (unless someone has an actual source saying otherwise).

    As for how optimized PhysX 3 may be, I somehow doubt it's at the level it could be (gotta keep PhysX a GPU selling point), but who knows.
     