nvidia "locks-in" developers

Discussion in 'Frontpage news' started by kd7, Jun 20, 2014.

  1. MAD-OGRE

    MAD-OGRE Ancient Guru

    Messages:
    2,905
    Likes Received:
    0
    GPU:
    SLI EVGA 780 Classifieds
  2. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    Those links are rumors that never came true. Look at the GK110 690 and the bogus memory bus on those cards. And it's Tom's Hardware, too.
     
  3. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    You made him forget his point, which is...

    It happened on AMD's watch. Along with the €2,850 Titan Z.
     
  4. MAD-OGRE

    MAD-OGRE Ancient Guru

    Messages:
    2,905
    Likes Received:
    0
    GPU:
    SLI EVGA 780 Classifieds
    OK, this is not the one I saw, but it's basically the same thing.

    "During CES 2012 we sat down with NVIDIA and they wouldn’t talk about Kepler on the record, but off the record that they expected to see more from the AMD Radeon HD 7970. From our face-to-face conversation with NVIDIA we walked away with the feeling that they were underwhelmed by what they saw and that Kepler would be able to easily leap frog the Radeon HD 7900 series."


    http://www.legitreviews.com/nvidia-kepler-gk-104-gpu-specifications-leaked-should-amd-worry_12324
     

  5. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    More flexible than this?
    http://physxinfo.com/index.php?p=gam&f=all
    That's 500 games' worth of flexibility!

    This is simply not true. The idea is simple, but G-Sync is so complicated it would make my head boil if I fully understood it. That much I know.

    FreeSync: imagine making a demo that does not even demonstrate the desired effect. That was AMD's hastily made first demo.
    Every G-Sync monitor bought means one enthusiast locked in with NV for the foreseeable future. They need to react first, and so they have.

    In the second demo they got the desired effect, but somehow forgot to include the perf overlay. Alas, it seems that it ran at a constant 45 fps.
    Very convenient, and very little information about the underlying mechanism. And no white paper. But plenty of confusion.

    Make no mistake, they would be all over it if they had killer tech that buries the competition. And is FREE!
    I have no doubt that G-Sync is the better-performing choice.

    The question is how much better, whether FreeSync is still going to be "good enough" (my prediction is just that),
    and, most importantly for AMD, WHEN FreeSync monitors are coming out.

    They pretty much need it now, so I don't exactly take everything they are promising for granted.
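
    For what "the idea is simple" means here, a purely conceptual sketch (my own illustration, not NVIDIA's or AMD's actual implementation; the 60 Hz panel, the 30-144 Hz window and the frame times are made-up numbers): with a fixed-refresh panel and vsync, a finished frame waits for the next refresh boundary, while a variable-refresh panel refreshes when the frame is ready, clamped to what it can physically do.

        #include <algorithm>
        #include <cmath>
        #include <cstdio>

        int main() {
            const double fixed_interval_ms = 1000.0 / 60.0;  // conventional 60 Hz panel
            const double vrr_fastest_ms = 1000.0 / 144.0;    // hypothetical 30-144 Hz VRR panel
            const double vrr_slowest_ms = 1000.0 / 30.0;

            const double frame_times_ms[] = {10.0, 16.0, 20.0, 28.0};  // illustrative render times
            for (double render_ms : frame_times_ms) {
                // Fixed refresh + vsync: the frame is shown at the next refresh boundary,
                // so narrowly missing one boundary costs a whole extra interval.
                double fixed_ms = std::ceil(render_ms / fixed_interval_ms) * fixed_interval_ms;
                // Variable refresh: the panel refreshes as soon as the frame is ready,
                // clamped to the range the panel supports.
                double vrr_ms = std::max(vrr_fastest_ms, std::min(render_ms, vrr_slowest_ms));
                std::printf("render %4.1f ms -> shown at %4.1f ms (fixed vsync) vs %4.1f ms (VRR)\n",
                            render_ms, fixed_ms, vrr_ms);
            }
            return 0;
        }

    That is only the scheduling idea; the panel-side details are exactly the part that, as noted above, remains under-documented.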
     
  6. Confirmative.

    Nvidia has officially marketed its 700 series for
    The official slides for the first time showed the percentage-based performance increase over the 580 in the 780 review. Not something that was advertised the same way with the 680. Although I still think that the 580's true successor is the 780 Ti, which is fully unlocked as well.

    Even some of the reviewers were skeptical after seeing a shot of the PCB of what is supposed to be a high-end part.
     
  7. kd7

    kd7 Guest

    Messages:
    151
    Likes Received:
    4
    GPU:
    7970m
  8. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    PhysX was developed strictly for gaming and can only be modified by NVIDIA.

    Bullet, being an open-source API, can be modified by anyone who decides to use it... THAT is flexibility. With PhysX, devs are forced to use it "as is". With Bullet, the devs can add whatever code they need if it doesn't already exist. Bullet also works well in both gaming and movie creation...
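
    To put the flexibility point in concrete terms, here is a minimal sketch of a Bullet 2.x world with one falling sphere (my own example, assuming the stock btBulletDynamicsCommon.h header; nothing here comes from the thread). Every component is an ordinary C++ class you can subclass or swap out, which is what "add whatever code they need" amounts to in practice:

        #include <btBulletDynamicsCommon.h>
        #include <cstdio>

        int main() {
            // Each piece of the pipeline is a replaceable, subclassable C++ object.
            btDefaultCollisionConfiguration config;
            btCollisionDispatcher dispatcher(&config);
            btDbvtBroadphase broadphase;
            btSequentialImpulseConstraintSolver solver;
            btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
            world.setGravity(btVector3(0, -9.81f, 0));

            // One rigid body: a 1 kg sphere of radius 0.5 m starting 10 m up.
            btSphereShape sphere(0.5f);
            btVector3 inertia(0, 0, 0);
            sphere.calculateLocalInertia(1.0f, inertia);
            btDefaultMotionState motion(btTransform(btQuaternion::getIdentity(),
                                                    btVector3(0, 10, 0)));
            btRigidBody::btRigidBodyConstructionInfo info(1.0f, &motion, &sphere, inertia);
            btRigidBody body(info);
            world.addRigidBody(&body);

            // Step the simulation at 60 Hz for one second and report the height.
            for (int i = 0; i < 60; ++i)
                world.stepSimulation(1.0f / 60.0f);

            btTransform t;
            body.getMotionState()->getWorldTransform(t);
            std::printf("sphere height after 1 s: %.2f m\n", t.getOrigin().getY());

            world.removeRigidBody(&body);  // detach before the stack objects are destroyed
            return 0;
        }

    With closed middleware you call whatever the SDK exposes and stop there; here the solver, broadphase, and dispatcher are all source you can read and patch.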
     
  9. WhiteLightning

    WhiteLightning Don Illuminati Staff Member

    Messages:
    30,795
    Likes Received:
    3,968
    GPU:
    Inno3d RTX4070
  10. pbvider

    pbvider Guest

    Messages:
    989
    Likes Received:
    0
    GPU:
    GTX

  11. (.)(.)

    (.)(.) Banned

    Messages:
    9,089
    Likes Received:
    1
    GPU:
    GTX 970
    Saw that the other day. One of the mods over at Overclock dot net made some interesting points. Post #2

    http://www.overclock.net/t/1496837/...tle-gameworks-freesync-and-more#post_22440172
    Originally Posted by Alatar (Forum Moderator)
     
  12. Goldie

    Goldie Guest

    Messages:
    533
    Likes Received:
    0
    GPU:
    evga 760 4gb sli
    AMD and their schoolboy approach is one of the reasons I'll never touch them.
     
  13. kd7

    kd7 Guest

    Messages:
    151
    Likes Received:
    4
    GPU:
    7970m
    1. You want humongous amounts of tessellation? Go ahead and do it, but don't prevent the other party from doing their own optimisations. With GW you can't, and I think that is pretty clear. (See the sketch at the end of this post for why the tessellation load matters so much.)
    It doesn't matter if you and I say that R. Huddy is a liar; what matters is whether nvidia says that. Will they? R. Huddy said that nvidia puts clauses in its contracts with developers that forbid them from optimising for AMD, and he challenged nvidia to make their contracts public and compare them to AMD's.
    You wanna prove he is a liar? GO AHEAD. That is what matters, not what the fanboys say.

    2. About Mantle, I don't know why all the fuss. Does AMD aim to gain an advantage with it? OF COURSE THEY DO. Is Mantle preventing nvidia from doing their own thing, or doing their own optimisations in DX, or even in Mantle if they chose to use it? NO.


    And by the way, Ryan is generally more on the green team than on the red.
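
    On point 1, a back-of-the-envelope sketch of why the tessellation level matters so much (my own illustrative numbers, nothing from the thread or from GameWorks): uniformly tessellating a triangle patch with edge factor n yields roughly n*n sub-triangles, so the triangle load grows quadratically with the factor.

        #include <cstdio>

        int main() {
            const long long patches = 10000;       // hypothetical number of patches in a scene
            const int factors[] = {1, 4, 16, 64};  // 64 is the D3D11 maximum tessellation factor
            for (int n : factors) {
                long long triangles = patches * n * n;  // ~n^2 sub-triangles per patch
                std::printf("tessellation factor %2d -> ~%lld triangles\n", n, triangles);
            }
            return 0;
        }

    Going from factor 16 to the maximum of 64 multiplies the geometry load by sixteen, which is why being unable to tune that setting for your own hardware is the sticking point.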
     
  14. (.)(.)

    (.)(.) Banned

    Messages:
    9,089
    Likes Received:
    1
    GPU:
    GTX 970
    Taken from my post above...

    Both Red and Green are full of it. The best you can do, really, is not let marketing promises decide your purchases for you: do your research before putting down any cash, and it's best to wait at least six months before buying a new-gen GPU.
     
    Last edited: Jun 21, 2014
  15. kd7

    kd7 Guest

    Messages:
    151
    Likes Received:
    4
    GPU:
    7970m
    You don't get it. If nvidia and intel don't want to go with it, they don't have to. And I do think they have perfectly valid concerns about not jumping in, at least for now.

    But AMD is NOT shoving it down their throats, and that is the big difference with GW. Once nvidia locks in the developer (and it looks like they have, unless nvidia disproves this), EVERYONE has to use it whether they want to or not.
     
    Last edited: Jun 21, 2014

  16. (.)(.)

    (.)(.) Banned

    Messages:
    9,089
    Likes Received:
    1
    GPU:
    GTX 970
    I'm not, and I don't think anyone else here is, arguing that they are being forced to; the point is that they want to but they've been denied crucial information by AMD.

    Look, competition is great, but Nvidia's driver a while back proved that the normal path of optimization can do just as well, if not better, than going to all the effort of making a brand-new API that will need years of refinement and marketing just to get support.
     
  17. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    That's where this is heading. Sometimes threads like these are just veiled AMD vs. Nvidia threads and are started for flame wars.
     
  18. (.)(.)

    (.)(.) Banned

    Messages:
    9,089
    Likes Received:
    1
    GPU:
    GTX 970
    I really don't get it, tbh. I don't like either company; it's just that Intel and Nvidia have always been worth the extra coin, imo. I'd go AMD if they could show me something that makes ditching the 580s worth it, but as it is, not even Nvidia and game developers have done that yet.
     
  19. kd7

    kd7 Guest

    Messages:
    151
    Likes Received:
    4
    GPU:
    7970m

    Well, they are saying that for now it is in beta and later it will be available. Do I blindly trust them? NO.
    If in 2015 it is not yet available, then imho they will have shot themselves in the foot.
    How you choose to optimise is a strategic decision; if nvidia is right, then AMD will pay dearly for it. It could possibly even kill them completely, and that would be fair.
    Competition means let us BOTH do our thing and see what works better in the end.
     
  20. Riffmaster

    Riffmaster Guest

    Messages:
    103
    Likes Received:
    0
    GPU:
    MSI 580 Lightning Extreme
    Could this be Microsoft's fault, because DirectX wasn't cutting it anymore? So the devs asked AMD for something close to the metal (or along those lines), and now comes AMD's answer, Mantle. Nvidia (and Intel, per (.)(.)'s post) doesn't want to be subjected to AMD's whims, so they lock in devs via GameWorks and/or some other contractual stuff.
     
