There is no way AMD will stop making GPUs. They need the advancing technology for integration in upcoming APUs, which are their main focus right now.
Yes and no. Nvidia has a knack for shoving their opinions down people's throats with a plunger, and if you resist then they leave you behind for good; it's either their way or not at all. This is a real turnoff to developers who prefer a more "open" or cross-compatible API/ABI. On the other hand, very often nvidia is right about their ideas and they have the results to prove it. To make matters worse, nvidia very often keeps their IP to themselves. Anyway, for those of you who don't want to see AMD go away, this is what happens when you decide to pay for those extra 10FPS. The way I see it, get the cheapest GPU that plays the games you want at the screen resolution you want at 60FPS (or maybe 120 or 144 if that's what your monitor supports). In most cases, AMD falls under this category. If you're going to make yearly upgrades, this often works fine. My GPUs are starting to show their age at this point but they've been holding up for me pretty well since I got them. To be fair though, I don't use much AA.
AMD are finished, I'm telling you; it's the end of the road for them, unfortunately. Such a shame tbh, they were great at one time but no more.
Are you trolling, or really just so much of an intel/nvidia elitist that you honestly think this is the end of them? According to benchmarks, AMD's GPUs are barely behind overall. With the XB1 and PS4, their income has probably stabilized. If they play their cards right with their next-gen CPU architecture, they might start making progress again. As of right now, AMD's greatest weaknesses are performance-per-watt and driver stability. For multi-threaded tasks, their CPUs are decent for the price, but power consumption is where their products fail. As someone who depends on electric heat, I honestly don't care about the higher TDP; at least that means my place will warm up during the winters while I do something productive. Radiators are the epitome of inefficiency.
I totally agree. I dislike AMD's products, but without competition there is way less advancement and prices skyrocket.
So what should AMD do? Should customers be forced to buy their products so they stay afloat? Should the government bail them out to prevent a monopoly? It's nobody's fault but their own that their business is failing.
AMD won't go; they're probably the most undervalued stock on the market atm. However, I can see them dropping the less profitable side of their business. I'm hoping they dump support for the high-end PC chips, I just can't see where that's going anymore. I just want a nice quiet, quality low-end rig; most new games suck the dogs anyhoo.
Of course it's their own fault. I'm not offering a solution, just stating that if they continue to plummet, the consumers won't be the winners, that's for sure.
@kendoka15 and nhlkoho No, it isn't entirely their own fault. Sure, they had some management issues and a crappy marketing staff, but both intel and nvidia are pretty anticompetitive companies. AMD played it clean and they lost because of it, and now they're paying the price. Though ATI never would have gotten to where it is today if AMD hadn't bought them out, I feel like today AMD's CPU dept is holding back their graphics dept. Customers don't have to be forced to keep AMD afloat, but customers aren't obligated to buy the fastest product around either. If you REALLY feel better about yourself by getting the fastest thing around, then go ahead. But the problem is that the average person looks to people like us for opinions. AMD products are perfectly fine for the average user, but I'm sure many intel and nvidia fanboys will never recommend AMD, strictly due to their own biases. Intel and nvidia unquestionably have better products overall, but 9 out of 10 people don't need that premium.
Again, I agree. I can't speak for others, but I've recommended AMD countless times to anyone looking for value (except for the 970, which is pure gold price-wise). I use Nvidia because I wanted the best GPU I could get under 1k. I used to have an ATI card.
I 100% agree with you. But for CPUs, who's the bigger name? Not one of my non-techie friends knows who AMD is, but everyone knows Intel. Intel also has a processor for every price range. GPUs are where it's really going to hurt consumers. I'm not an AMD guy, but I recognize that they have good GPUs. If it weren't for the Xbox One and PS4, I don't think they would be around much longer.
Intel has been the bigger name because they've fought dirty since the 90s. AMD and Cyrix were basically just there to keep Intel in check. Then when Cyrix went under, AMD got a few extra sales, but by the time they had a product that could outperform Intel it was too late; Intel had already bought their way into first place. Then when Apple switched to Intel, AMD went on a continuous downward slope. If Apple had gone with AMD instead, I'm pretty sure Bulldozer never would have happened. Also, AMD's name is INCREDIBLY boring, and they don't have a catchy slogan. Intel (short for "integrated electronics") is a fun, memorable name, and they've had AMAZING slogans: "Sponsors of Tomorrow", and even "Intel Inside" was super catchy. Intel's marketing dept really knows what they're doing. I'm not sure GPUs would hurt consumers that much. Most people don't buy discrete GPUs, and intel's integrated graphics have caught up enough to play modern games at low settings. Nvidia currently is expensive, but not as stupidly expensive as intel. That being said, right now I think the CPU market is in greater distress: Intel's prices are really getting carried away, and VIA isn't even an option to consider. As a linux user, I don't REALLY care, because I can just switch to ARM or PPC if I need to.
All the research is on performance per watt right now, and that's where AMD is quite behind... I dislike that, but it's true. Anyway, Intel needs a competitor, otherwise they'll have to face anti-monopoly laws, so I don't think they'll just kill AMD like that. Intel is ahead of AMD on CPUs (also in money and resources, so that's to be expected), but on GPU architecture AMD is not exactly that far behind... it's "only" a matter of finding someone who can make smaller transistors. Plus they've got the best APUs/CPU-integrated GPUs currently. My next GPU will probably be a 970 tho :/
They're not dead 'till they're dead, and anyone saying straight up "they're dead" doesn't know the industry very well. And I challenge ANYONE who disagrees to fisticuffs. Let's settle this like gentlemen.
I'm not sure about that. That obviously would make a significant difference, but not enough. I think AMD's drivers are what need the most work. Purely by the numbers, AMD's hardware is, and almost always has been, superior to nvidia's, yet they usually fall behind. There are obviously architectural differences between how the two brands make their products, but with enough driver updates, AMD GPUs eventually catch up. A few years down the road, I'm sure the 290X will outperform the 980 in just about every test, assuming AMD keeps up with maintenance. AMD is at their weakest at release date, whereas it wouldn't surprise me if nvidia polishes their drivers for what people are most likely to benchmark first. All this being said, if the drivers improve enough, AMD can use "lesser" hardware, and it will therefore run more efficiently. For the record, I own AMD, intel, nvidia, and ARM-based systems, though I use my all-AMD system the most. As a computer enthusiast, I think it's best to diversify your systems and divide your workload between them based on what each piece of hardware is best at. Nvidia and intel aren't #1 in all applications (though at this point intel is pretty much #1 in everything).
Kind of a blanket statement, don't you think? Power efficiency falls under hardware superiority, and that's been Nvidia's game for a while, especially over the last two generations (though AMD has of course always been the cost-effective alternative, even now lowering prices of the 290 and 290X, according to a front-page article). The custom 290Xs stay cool but consume a TON of watts compared to Nvidia's equivalent. If you provide the numbers you referred to, I'll agree wholeheartedly.