Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 18, 2019.
Availability won't be the issue. Demand will be.
AMD Denies Fab Sell-Off - 07/28/2008 11:23 AM
AMD is denying a report that appeared in the Austin...
I don't know why this made me laugh...but it was coincidentally the first story headlined on the article page above the comment section...
AMD has surely been underwhelming in their pitch for the Vega VII, to say the least. Nvidia's CEO, Jensen Huang, said the card "has lousy performance," and AMD's response is... meh? Well, doesn't that instill confidence in customers to run out and buy one in February? Why AMD shrank down their Vega 10 chip, overclocked it, and is trying to pass it off as a different GPU is beyond me, when they had a chance to bring something new to gamers who had been waiting years for a newly designed chip, not just a die-shrunk version of the last one. Nvidia missed the mark with their RTX joke of an offer: $1200 so you can game with the one title that uses ray tracing. AMD has let the window of opportunity close on them to wow the gaming community. Pass on RTX. Pass on the Vega VII.
AMD can only respond with what they have. It is a 'meh' product. While I appreciate the hardware accelerated features of Turing, the price makes it a 'meh' for me and I'm someone who dumped over 2K on the OG TITAN Kepler when they first came out.
Pointless to do so. It would cost them a lot more to fab up 8GB versions and build the supporting software for them than to just bin MI50s and release them with cut-down FP64. AMD is effectively selling dies that would fail the MI50 bin but work perfectly well for gaming.
Why would you listen to Jensen on AMD performance? He said that purely to disparage the device. It's a stopgap high-end GPU that only exists because Jensen got greedy with pricing. He's annoyed because AMD's cost to retrofit MI50s into consumer cards is minimal; the low R&D outlay brings the overall cost of production down, offsetting the HBM2 costs and allowing the card to be released, and be competitive, at the RTX 2080 price point.
Nvidia loses out more: the new Turing architecture costs more to get to customers, the RTX launch was terrible, and the sudden shift in public opinion really seems to have hit the company's share price hard.
I've said it at least 10 times and I'll say it again: AMD only needs to produce a good card at a great price to beat Nvidia in sales. So many folks are royally pissed at Nvidia at the moment. Really, AMD just needs the price to be right and a card that can game well. No funny stuff like RTX etc. Let's see how the VII compares first and what the real price will be.
And you guys are also missing the point of the VII:
1) As pointed out by others here, it's a good gaming card, not the best, but a good one, and a COMPUTE BEAST. I wonder what all that compute power could be used for? Hmm, I don't know, maybe DXR and DirectML? It's obvious AMD is partnering with Microsoft to offer a competitor to RTX via DirectX 12.
2) Also mentioned earlier, the VII is a stopgap card until Big Navi.
3) VII caters not only to gamers, but professional content makers, which makes the card a really affordable pro card. At $699, I'm sure some pro workstations can drop 2 or 3 of them in, and make a compute beast setup. These customers will buy this card, and thus provide revenue to AMD, whether or not gamers adopt it.
4) I'd like to see what this card could do on water.
5) Buildzoid discussed his musings on the VII and concluded that HBM2 was required to achieve the 1 TB/s bandwidth, overcoming the original Vega design's tendency to starve the GPU of memory bandwidth. (He reckons ~800 GB/s would be the sweet spot for the Vega design.) Doing the same with GDDR6 would require a 512-bit bus, which would take up too much real estate on the card for memory traces and require a thicker PCB.
6) NVIDIA hate is at a high, so VII + FreeSync 2 is very attractive. If mGPU became more prevalent in game support, then VII in Crossfire would be brutal, especially if the card's cost drops later as production ramps up. (Though I'm sure retailers will milk this card too and jack up the price.)
I wouldn't kick it out of bed.
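Point 5's bandwidth claim is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch, assuming the commonly quoted figures (four 1024-bit HBM2 stacks at ~2.0 Gbps per pin on the Radeon VII, and 14 Gbps GDDR6 as the hypothetical alternative):

```python
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bytes transferred per cycle times per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Radeon VII: four HBM2 stacks, 1024 bits each, ~2.0 Gbps per pin
hbm2 = bandwidth_gbs(4 * 1024, 2.0)   # 1024.0 GB/s, i.e. the ~1 TB/s figure

# Hypothetical GDDR6 alternative at 14 Gbps: even a 512-bit bus falls a bit short
gddr6 = bandwidth_gbs(512, 14.0)      # 896.0 GB/s

print(hbm2, gddr6)
```

This is consistent with Buildzoid's point: matching ~1 TB/s with GDDR6 would need a 512-bit bus at 16 Gbps, hence the routing and PCB concerns.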
I don't believe anyone is missing that; it's your opinion of the GPU's value. You're basically reciting the same 'value' arguments that AMD users have relied on... well... since AMD stopped being competitive. They gained some ground in the CPU space; other than using TSMC's 7nm, they've lost ground to Nvidia. I personally see you and other AMD fans as missing that. But that is also just my opinion.
Gaining/losing ground by what measure? Actual sales numbers, or Steam survey stats? Or some benchmark that only a minority of users runs? Prove it.
I was pointing out AMD's response to Huang's comments, or rather that they had no response. It concerned me that they wouldn't defend their product and shout from the tallest peak that they indeed have a great product. Their response was "we don't want to get caught up in comments." Really? That's their hard sell? That was my point, not that Huang said something.
AMD has always been known as the underdog that cares about its customers and tries to give them a good product at a better price than Nvidia. But this time around, AMD priced their card the same as a 2080. Have they not learned a lesson from Nvidia's stock dropping 50 percent after the RTX release debacle? You would think AMD would say to themselves: right, we always come in at a better price, and this is THE best time to use that policy and come in at, say, $650. Unfortunately, they came in at the same price as the 2080, and people are lumping them in with the Nvidia hate.
So I feel AMD also missed their opportunity to be the hero.
These cards offer a whole lot more future-proofed performance than team green's for the same amount of money. Not to mention 16 GB of memory as opposed to 11... and that's HBM2 instead of Nvidia's GDDR6.
I loved his 'cya' tweet - "Notably, this is still twice the native FP64 rate of all other Vegas." Yeah, yeah - shut up...
Don't forget I also stated that NVIDIA will be an irrelevant company in 2025. Their business practices have given them immediate profits but have also alienated their OEM partners, for both cards and monitors. Apple will have nothing to do with them, and neither will Intel. Microsoft is leveraging DirectX (and to an extent Vulkan) to undermine proprietary solutions, just like they did to Creative Labs. A couple of monitor vendors have stopped making G-Sync module monitors, and I'm sure this will grow with NV's adoption of VESA Adaptive-Sync.
And with the money AMD has gotten from Sony and Intel for their R&D, they can finally ramp that up, design some silicon, and benefit from input from their partners. After all, thanks to NVIDIA's business practices, AMD has the console business locked in.
What sort of nonsense claim is this?
Nvidia has ~80% market share of consumer desktop GPUs. They have close to total dominance in the laptop market and basically total dominance in the professional market (workstation, servers, datacenters, supercomputers).
They have been pushing for better deep learning hardware that leverages GPU parallelism. Their CUDA programming model is way more popular than OpenCL and is often more performant.
Neither DirectX nor Vulkan would have an effect on Nvidia similar to what Vista did to Creative soundcards. It's not like Nvidia are leveraging completely alien technologies through fringe, frail APIs ...
1) Raytracing is being done through DirectX
2) GameWorks is game-dependent (and perhaps, at most, DirectX-dependent)
3) DLSS is API agnostic (given how it works)
4) PhysX is open-source and predominantly CPU-based these days (as well as de facto standard for the popular game engines)
5) G-Sync is transparent to games and other 3D applications
You have the choice to use CUDA (preferable) or OpenCL to leverage the capabilities of their GPUs, although CUDA will often perform better for them.
You now have the choice to use G-Sync or Adaptive Sync ("FreeSync") displays with their GPUs. They already got enough money out of selling G-Sync monitors. Even now, the vast majority of FreeSync monitors do not hold a candle to the proper G-Sync implementations that do not black out, do not flicker, have wide refresh rate ranges, and have variable overdrive. Let people cheap out on some of those shady FreeSync monitors and see for themselves what they're getting - hackish, half-working solutions that should have been relegated to the dustbin long ago. I've yet to see a proper FreeSync monitor with a wide refresh rate range, proper HDR FALD implementation, variable overdrive, and no other issues. Seems like we'll have to wait for upcoming TVs to get (some of) that. Isn't that pitiful?
Apple hasn't used Nvidia hardware since forever... not sure why that would be worthy of mention.
AMD has had the console market since 2013. Your point about Nvidia's business practices and the console market also overlooks the Nintendo Switch, a very high-selling portable console that uses an Nvidia Tegra GPU. AMD keeps almost falling into bankruptcy, and if it weren't for their recent successful win with Ryzen, they would still be in the red now.
If Nvidia is going to be irrelevant in 2025, I shudder to think what would happen to AMD's graphics division.
The right answer is that no one is going anywhere because these companies, unlike what fanboys of either company like to project onto them, still, to an extent, mostly know what they are doing - and the markets they are targeting. Have you seen where AMD's graphics division went? Gunning for the midrange, a reasonable target given their inability to match Nvidia's latest and greatest. Have you seen where Nvidia went? Trying out new, early adopter technologies which will prove to be the next step in graphics, since they're practically competing with themselves at the high end at this point. They do not care about providing good value for money with RTX as people at that point simply went for Pascal GPUs.
When you can reduce the industry leader to embarrassing himself with an immature, pathetic response like the one he gave, you actually don't need to say anything. You've won that round. Besides, out of the other side of his mouth, he's making moves to counter the VII: the 1160, and I'm sure they have an 1180 up their sleeve just in case. These will cannibalize their 20-series equivalents, given that RTX won't be of much real value for a couple of years and will most likely face less-proprietary alternatives, as will DLSS.
The VII is a great chess move whether or not you personally find it appealing.
Good questions and points. With the PS5 and the next Xboxes based on Navi, this is sure to lead to some optimization for Vega and Navi on the PC.
I kind of agree with this from a particular perspective. By 2025 I think most games will be streamed and AMD is getting a leg up with all the major players, Alphabet, Amazon, and Microsoft with Radeons as the streaming GPUs in the datacenter. EPYC growth in the datacenter space can't hurt either.
However, NVIDIA will still be big with AI and content creators though AMD is growing here as well.
As if they're not getting any optimizations now? All current AMD GPU architectures are based on GCN. We've heard this tune since 2013. AMD GPUs still compete with Nvidia GPUs the same way they did before those consoles released. PC releases aren't just optimizing for console GPUs and forgetting about Nvidia's architectures. Nvidia has 80% of the GPU market share; why would game developers miss out on that, exactly?
Games that are primarily PC titles don't necessarily receive the best optimization for AMD architectures. I believe those that are multiplatform and big on consoles do, such as the Forza and Tomb Raider reboots. Also, FPSes are a PC thing and not really optimized that much for consoles or AMD architectures. That could possibly change with keyboard/mouse support on Xbox, but I doubt it and hope not. I think FPSes are killing creativity in game development.