I'm sure I read the post I'm talking about on this site. I haven't found it yet, but here are a few others pointing out that the GK104 was going to be the 560 replacement. http://www.tomshardware.com/news/Nvidia-Kepler-GPU-GeForce-600-Series,14642.html http://www.tomshardware.com/news/Nvidia-Kepler-GK104-GeForce-GTX-670-680,14691.html
Those links are rumors that never came true. Look at the GK110 and the 690, and the bogus memory bus on those cards. And it's Tom's Hardware, too.
OK, this is not the one I saw, but it's basically the same thing. "During CES 2012 we sat down with NVIDIA and they wouldn’t talk about Kepler on the record, but off the record that they expected to see more from the AMD Radeon HD 7970. From our face-to-face conversation with NVIDIA we walked away with the feeling that they were underwhelmed by what they saw and that Kepler would be able to easily leap frog the Radeon HD 7900 series." http://www.legitreviews.com/nvidia-kepler-gk-104-gpu-specifications-leaked-should-amd-worry_12324
More flexible than this? http://physxinfo.com/index.php?p=gam&f=all That's 500 games' worth of flexible! This is simply not true. The idea is simple, but G-Sync is complicated enough that it would make my head boil if I fully understood it. That much I know.

FreeSync - imagine making a demo that does not even demonstrate the desired effect. That was AMD's hastily made first demo. Every G-Sync monitor bought means one enthusiast locked in with NV for the foreseeable future. They needed to react, and so they have. In the second demo they got the desired effect, but somehow forgot to include a performance overlay. Alas, it seems it ran at a constant 45 fps. Very convenient, and very little information about the underlying mechanism. No white paper, but much confusion. Make no mistake, they would be all over it if they had killer tech that buries the competition. And is FREE!

I have no doubt that G-Sync is the better-performing choice. The question is how much better, whether FreeSync is still going to be "good enough" (my prediction is just that), and most importantly for AMD - WHEN are FreeSync monitors coming out? They pretty much need them now, so I don't exactly take everything they are promising for granted.
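On the "constant 45 fps is very convenient" point: a toy simulation can show why a perfectly steady 45 fps source is the easy case for any variable-refresh scheme. This is entirely my own sketch with made-up numbers; it models neither G-Sync's nor FreeSync's actual hardware, just the basic timing idea that a fixed 60 Hz vsync'd panel presents a 45 fps stream unevenly, while a panel that refreshes on demand presents it perfectly evenly.

```python
# Toy model (my own illustration -- not AMD's or Nvidia's actual implementation):
# compare when frames reach the screen on a fixed 60 Hz vsync'd panel versus an
# adaptive-refresh panel, when the game renders at a perfectly steady 45 fps.
import math
from fractions import Fraction

RENDER = Fraction(1000, 45)    # ms between rendered frames (45 fps)
REFRESH = Fraction(1000, 60)   # ms between refresh ticks (60 Hz panel)

def fixed_vsync_presents(n_frames):
    """Each finished frame waits for the next 60 Hz refresh tick."""
    return [math.ceil(i * RENDER / REFRESH) * REFRESH
            for i in range(1, n_frames + 1)]

def adaptive_presents(n_frames):
    """The panel refreshes the moment each frame is ready."""
    return [i * RENDER for i in range(1, n_frames + 1)]

def deltas(times):
    """Frame-to-frame gaps on screen, in ms, rounded for display."""
    return [round(float(b - a), 1) for a, b in zip(times, times[1:])]

print("fixed 60 Hz:", deltas(fixed_vsync_presents(8)))
# -> [16.7, 16.7, 33.3, 16.7, 16.7, 33.3, 16.7]  (visible judder)
print("adaptive:   ", deltas(adaptive_presents(8)))
# -> [22.2, 22.2, 22.2, 22.2, 22.2, 22.2, 22.2]  (perfectly even)
```

The steady-45-fps case maps each frame onto the panel with a constant gap, which is why it makes for a flattering demo; the fixed-refresh panel alternates 16.7 ms and 33.3 ms gaps on the same input.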
Confirmed. Nvidia officially marketed its 700 series that way: the official slides in the 780 review showed, for the first time, a percentage-based performance increase over the 580. That's not something that was advertised the same way with the 680. Although I still think the 580's true successor is the 780 Ti, which is fully unlocked as well. Even some of the reviewers were skeptical of a shot of the PCB of what is supposed to be a high-end part.
aaaaaaand he's done it again, about 30 min in (see a pattern?) http://www.maximumpc.com/no_bs_podcast_226_-depth_interview_amd_graphics_guru_richard_huddy "nvidia puts a clause in the contracts with developers that says they are not allowed to optimise for AMD"
PhysX was developed strictly for gaming and can only be modified by Nvidia. Bullet, being an open-source API, can be modified by anyone who decides to use it... THAT is flexibility. With PhysX, devs are forced to use it "as is". With Bullet, devs can add whatever code they need if it doesn't already exist. Bullet also works well in both gaming and movie production.
Why aren't more developers using Bullet if it's so great? http://en.wikipedia.org/wiki/Bullet_(software) When I look at the games list, I only see dated games.
Saw that the other day. One of the mods over at Overclock dot net made some interesting points in post #2: http://www.overclock.net/t/1496837/...tle-gameworks-freesync-and-more#post_22440172 Originally posted by Alatar (forum moderator).
1. You want humongous amounts of tessellation? Go ahead and do it, but don't prevent the other party from doing their own optimisations. With GW you can't; I think that is pretty clear. It doesn't matter if you and I say that R. Huddy is a liar; what matters is whether nvidia says so. Will they? R. Huddy said that nvidia puts clauses in its contracts with developers that forbid them from optimising for AMD, and he challenged nvidia to make their contracts public and compare them to AMD's. You want to prove he is a liar? GO AHEAD. That is what matters, not what the fanboys say.

2. About Mantle, I don't know why all the fuss. Does AMD aim to gain an advantage with it? OF COURSE THEY DO. Is Mantle preventing nvidia from doing their own thing, or doing their own optimisations in DX, or even in Mantle if they chose to use it? NO. And by the way, Ryan is generally more on the green team than on the red.
Taken from my post above... Both Red and Green are full of it. The best you can really do is not let marketing promises decide your purchases for you: do your research before putting down any cash, and it's best to wait at least 6 months before buying a new-gen GPU.
You don't get it. If nvidia and intel don't want to go with it, they don't have to, and I do think they have perfectly valid concerns about not jumping in, at least for now. But AMD is NOT shoving it down their throats, and that is the big difference with GW. Once nvidia locks in the developer (and it looks like they have, unless nvidia disproves this), EVERYONE has to use it whether they want to or not.
I'm not, and I don't think anyone else here is, arguing that they are being forced to; the point is that they want to but have been denied crucial information by AMD. Look, competition is great, but Nvidia's driver a while back proved that the normal path of optimization can do just as well, if not better, than going to all the effort of making a brand-new API that will need years of refinement and marketing just to get support.
That's where this is heading; sometimes threads like these are just veiled AMD vs. Nvidia threads started for flame wars.
I really don't get it, tbh. I don't like either company; it's just that Intel and Nvidia have always been worth the extra coin, imo. I'd go AMD if they could show me something that makes ditching the 580s worth it, but as it is, not even Nvidia and game developers have done that yet.
Well, they are saying that for now it is in beta and later it will be available. Do I blindly trust them? NO. If it's still not available in 2015, then imho they will have shot themselves in the foot. How you choose to optimise is a strategic decision; if nvidia is right, then AMD will pay for it dearly - it could possibly even kill them completely, and that would be fair. Competition means: let us BOTH do our thing and see what works better in the end.
Could this be Microsoft's fault because DirectX wasn't cutting it anymore? The devs asked AMD for something closer to the metal (or along those lines), and now comes AMD's answer, Mantle. Nvidia (and Intel, per (.)(.)'s post) doesn't want to be subject to AMD's whims, so they lock devs in via GameWorks and/or some other contractual stuff.