AMD Vega To get 20K Units Released at launch and new Zen+ chatter

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 9, 2017.

  1. holler

    holler Master Guru

    Messages:
    228
    Likes Received:
    43
    GPU:
    Asrock 7900XTX Aqua
    AMD is betting on the long term. HBM is the future because of packaging and power savings. The ability to have shareable memory on the package is huge. Imagine 4+ GPUs on one card. With Vulkan and DirectX 12 handling mGPU much better, AMD could have an advantage down the road. That is how they're innovating. The current Nvidia iteration (1080) isn't much different than Fermi (GTX 480) if you think about it.
     
    Last edited: May 9, 2017
  2. MorganX

    MorganX Member Guru

    Messages:
    142
    Likes Received:
    15
    GPU:
    Nvidia 3080 Ti
    I definitely could have worded my post better, but ... This.

    Specifically, gimping GPUs to an extreme extent, IMO, to maximize profits under a monopoly keeps the lowest common denominator very low, which in turn prevents developers from targeting a more capable baseline.

    The same goes for Intel and CPU cores. To say that their monopolies stifle innovation does not necessarily mean they themselves are not innovative.
     
    Last edited: May 9, 2017
  3. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    Nvidia is not guilty of having no competition from AMD in the GPU market, just like Intel was a de facto monopoly in desktop CPUs for the last five years due to sub-par AMD CPU performance.

    A competitor should sell products with similar performance at a lower price to force innovation and better specs for customers. If the best AMD can do is provide a similarly performing product at a similar or even higher price (Fury X, I'm looking at you), this is not real competition.

    IF the Ryzen "refresh" really provides a 15% improvement in IPC (both meanings.. :D) over the initial versions, that could force Intel to "innovate" and sell better-performing CPUs at lower prices.

    I bought an Nvidia Shield TV (2015) in late 2016. Nvidia relaunched the EXACT same product, minus a couple of ports, a few months ago (in 2017): same specs, same SoC, same price. It's still the best media box on the market after two years.

    Who is forcing Nvidia to improve on the best when there is no competition?

    When you sell your product without competition, you are selling the best product customers can... buy.

    If Vega's performance is similar to the 1070's and its price is similar, it will be another "innovative" AMD HBM fail.
     
  4. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    Listen here at 42:20.

    HBM is another class of performance from GDDR5x, both for latency and for effective bandwidth. It's not even close.

    I'm sure there will be a Vega card in the 1070 performance bracket. What will decide everything is the price. Judging by the initial rumored availability (which could be wrong), the mainstream part of Vega will most likely come later.

    AMD has had, and still has, a huge perception issue. YouTubers and tech writers are doing a better job of explaining how AMD's tech works and progresses over time than AMD does itself. Look at chips like Hawaii, as an example. AMD dug a hole for its own product by initially not investing in drivers for it (which was the stupidest thing they could ever have done) and by packaging it with subpar cooling solutions. The same product that was "destroyed" by Maxwell is still faster than it, despite being a year older.

    Same story with the Fury. They promised OC headroom for the Fury X, which was a mistake. They didn't promote the vanilla Fury at all, despite it being in the same price range as the 980 and beating it in performance at the same time.

    They have a real, big marketing problem.
     
    Last edited: May 9, 2017

  5. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    The overpriced GPU argument is boring.

    The 8800 GTX cost $600 at launch in 2007. The architecture had an R&D budget of $475M. Discrete GPU sales in 2006 were 85M units. Discrete GPU sales in 2015 were only 44M.

    Nvidia claims Pascal cost them "several billion". Their R&D budget is ~$350M per quarter, so about $1.4B per year, and Pascal took two years. I doubt all of it went to Pascal itself, so let's just assume Pascal cost $2B.

    https://venturebeat.com/2016/05/06/...pps-360-degree-photo-art-and-3d-audio-for-vr/

    Cost per transistor stalled for basically everything after 28nm; it isn't making GPUs cheaper, yet GPUs have more and more transistors each year.

    So basically Nvidia is charging roughly the same price as it did for its GPUs in 2007, the overall market is only half the size, the R&D budget has increased ~3x, and they don't even have the benefit of transistor cost scaling.
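
    As a rough sanity check, here's a back-of-the-envelope sketch in Python using only the figures quoted above (these are the post's estimates, not official numbers; the Pascal total in particular is an assumption):

    ```python
    # Back-of-the-envelope R&D burden per discrete GPU sold, using the rough
    # figures quoted above (the post's estimates, not official numbers).
    rnd_2006 = 475e6       # claimed G80-era architecture R&D budget ($)
    units_2006 = 85e6      # claimed discrete GPU sales, 2006 (units)

    rnd_pascal = 2e9       # assumed total Pascal R&D ($), per the estimate above
    units_2015 = 44e6      # claimed discrete GPU sales, 2015 (units)

    print(f"2006: ${rnd_2006 / units_2006:.2f} of R&D per GPU sold")    # ~$5.59
    print(f"2015: ${rnd_pascal / units_2015:.2f} of R&D per GPU sold")  # ~$45.45
    ```

    On those numbers, the R&D cost carried by each discrete GPU sold rises roughly eightfold between 2006 and 2015.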

    AMD has it even worse, because HBM and watercooling on its consumer cards eat into its margins. The analyst takeaway after the giant stock drop the other day (I lost $5000 on it myself) was basically that AMD's margins are bad and they burned through 30% of their remaining liquid cash. And yet people want AMD to release a 1080 Ti-class, watercooled, HBM, monster-sized card for like $500..

    It's also why I laugh when people say Titan X/Tesla/Quadro cards are overpriced. Those margins are essentially subsidizing the cost of consumer GPUs. Every company that buys a DGX-1 with eight $10,000 GPUs in it is basically allowing you to buy your GeForce card as cheaply as you do. It's the whole reason Nvidia branched into all these different high-margin markets in the first place.

    Edit: That also brings me to the resolution argument. For the past several years you needed the absolute best single-chip card in order to game in the latest titles at 1080p. 1080p covers 90% of the gaming market, so basically 90% of gamers had to pay $500+ to enjoy the newest games at reasonable framerates. Now that is covered by ~$250 1060/480/580s, which also have lower margins. And with Volta/Navi or whatever the Polaris replacement is, it's going to get even more ridiculous.. it's going to be like $150 for a 1080p card capable of handling every game at the highest settings - especially now that the consoles are pushing 4K and effectively pausing graphics, as the new horsepower is going to be used entirely on resolution.

    Eventually people will switch to 4K as the monitors come down in price. But 90% of the market isn't switching overnight.
     
    Last edited: May 9, 2017
  6. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    I don't know if you have any experience with corporate finance, or finance at all. Take all those "R&D" amounts with a huge grain of salt. Companies that design chips have standard running costs that they can present however they like for marketing purposes. AMD needs roughly $600M to run for a quarter; NVIDIA probably needs somewhat more than that. Whenever they present you with "R&D" costs, they simply add up their running costs for the months it took to produce a chip. The purpose of the whole company is R&D anyway. Don't buy too much into it. They just have to cover their running costs from sales of the new architecture and that's it. There is no initial investment left to recuperate; that has been true ever since the first product each of them launched.
     
  7. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    Well yeah, it's more complicated than I'm making it out to be. There are all kinds of costs that have gone up and down, and other factors I'm not even considering - but R&D costs for these companies have definitely gone up even if it's not in the ~$2B range, the discrete GPU market is definitely smaller than it was, and transistor price scaling has essentially stalled.

    Also, I edited my post after you quoted it, but in the edit I said the vast majority of people (90%) don't even need Vegas/Tis/1080s. It's probably why AMD is only shipping 20K Vega chips. It's also probably why they did what they did with Polaris. They knew the RX 480 would hit 90% of the market and it would be fine.

    Idk, I just don't see the issue with the pricing, but then again my financial status, where I live, etc. probably all play a role in my perception of it.
     
  8. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Hell, yeah!
    But even this time it will be a miracle to get a decent 12" notebook.
     
  9. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    Ah yeah, I agree completely. But the pricing is a bit ridiculous, I think, especially if you're a bit of an old-timer. I might be factually wrong, but my gut tells me that the 2560-shader/256-bit-bus GTX 1080 shouldn't cost close to $700 at launch, no matter the performance. It's subjective, but I still feel like I'm getting robbed.
    You have to factor relative performance into it too. I know that a computer is a computer with a myriad of uses, and that you'll get much better graphics etc. out of it, but consider something like the Scorpio. For the same amount of money as an upper-mid-range GPU, you get a whole system that is basically plug and play and will give you true 4K games and a ton of apps on top. The total cost of ownership over time will most likely be higher if you are an avid gamer, but if you're even a little more casual than that, you will have to invest at least double the money to get comparable performance from a PC.

    All the articles and videos comparing PC to console performance never mention the horrendous frame pacing issues of all the gimped systems they suggest, nor do they mention that there aren't any real, reliable tools to achieve that on a PC. It's no surprise that the PC market is actually declining year over year, percentage-wise.

    And I'm not sh*tting on NVIDIA only. The "good" Polaris cards are on the 260+ euro side, which is the same price as a Switch, for a small chip with a 256-bit bus on a small PCB. Things have changed, and the middle class that used to be the main consumer of all this stuff has had its income squeezed over the last decade. We can't keep pretending that things are as they used to be.
     
  10. rm082e

    rm082e Master Guru

    Messages:
    717
    Likes Received:
    259
    GPU:
    3080 - QHD@165hz
    It cost that much because there was no competition to force the price down. Don't complain about a publicly traded corporation trying to maximize profit - that's their Prime Directive. It's the first rule of business, the one that influences and dictates all other decisions. It's foolish to expect a company to make less money just to keep the consumer happy.

    Complain that we are lacking two companies, on equal footing, who are fighting for the same customers in the same market. That's the real problem here.

    If Sony were putting out the PS4 without the competition from Xbox, or vice-versa, the last console launch would have been $600, with an upfront 10 year sales cycle. Why? Because they could have gotten away with it. Microsoft was able to drag out the previous generation a lot longer than anyone expected because Sony wasn't in a position to force their hand sooner.

    And the competition they did have was amazing. Microsoft wanted to force people into their DRM box with mandatory Kinect connection for $500. Sony slapped them in the face with a parody video making fun of their anti-consumer vision, and came in a hundred bucks cheaper. Result? Microsoft was forced to walk back just about every major plan they had for the Xbox One due to competition.

    This is what we need in the enthusiast GPU space. Thankfully, we've finally got some competition in the CPU market.
     

  11. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    How about you look at it this way: 690M transistors for an $800 MSRP (8800 Ultra) vs. 7.2B transistors for a $700 MSRP (1080).
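
    To put rough numbers on that comparison, here's a quick cost-per-transistor calculation using only the MSRPs and transistor counts quoted above (launch prices, not adjusted for inflation or for board, memory, and cooler costs):

    ```python
    # Crude cost-per-transistor comparison from the launch MSRPs quoted above
    # (not adjusted for inflation, board, memory, or cooler costs).
    cards = {
        "8800 Ultra": (690e6, 800),  # (transistor count, MSRP in $)
        "GTX 1080":   (7.2e9, 700),
    }
    for name, (transistors, msrp) in cards.items():
        dollars_per_billion = msrp / (transistors / 1e9)
        print(f"{name}: ~${dollars_per_billion:.0f} per billion transistors")
    # 8800 Ultra: ~$1159 per billion transistors
    # GTX 1080:   ~$97 per billion transistors
    ```

    By that crude measure, the 1080 delivers transistors roughly an order of magnitude cheaper than the 8800 Ultra did.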
     
  12. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    That's why I said that my opinion about it is completely subjective :)
    I still feel like I'm getting robbed. It's a feeling, not an objective truth. Your argument is obviously correct.

    I remember the days when there were 4-5 companies: ATi, NVIDIA, 3dfx, VideoLogic/PowerVR, Rendition, Intel (anyone remember the i740?), S3...

    What we have now is a travesty.

    This kind of calculation is so superficial, man. You are forgetting a lot of things.

    First: Complex PCBs were harder and more expensive to make back then. A PCB for a 256-bit memory interface was much more expensive to make in 2006 than it is in 2017.

    Second: The actual production cost for a chip is a function of the chip's size, the wafer's size, and the maturity of the process. Take a look at the size of the G80 chip and the size of GP104.

    G80 on top. As you can see from the PCIe connectors, the cards are basically to scale.

    [image]

    Here's an image of a TSMC 130nm wafer, used for the 8800 GTX, from the TechPowerUp review of the card.

    [image]

    That wafer contains 118 dies. Sure, it's cheaper to produce today, but it wasn't back then. In contrast, a modern TSMC 16nm wafer is 300mm in diameter, contains 180 GP104 dies, and has similar build costs.

    Let me quote an analysis by the investment website The Motley Fool.

    That chip costs less than $50 to make. This obviously isn't the whole deal, since you have the running costs of the company, the PCB itself and the extra components, profit margins for everyone in the chain, etc. But in no way does it end up being "priced like the old cards were" at $700. That's a $250-$300 GPU there.
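
    As a rough illustration of the per-die math implied above (the wafer price is an assumed round number for the sketch; yields, binning, packaging and memory are all ignored):

    ```python
    # Rough per-die cost sketch for GP104 using the dies-per-wafer figure above.
    # The wafer price is an assumed round number for illustration only; yield
    # losses, binning, packaging and memory costs are all ignored.
    wafer_cost_usd = 7000      # assumed price of one 300mm 16nm wafer
    dies_per_wafer = 180       # GP104 dies per wafer, per the figure above

    cost_per_die = wafer_cost_usd / dies_per_wafer
    print(f"~${cost_per_die:.0f} per GP104 die")   # ~$39, in the "under $50" ballpark
    ```

    Even with generous assumptions about the wafer price, the bare GP104 die lands well under $50, which is where the "that's a $250-$300 GPU" framing comes from.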

    A small historical sample of NVIDIA's profit margin should put that theory to rest. Their profit margin in 2006 was 18%, and the current one sits at 31%. There are a lot of ways to streamline and increase it, but the major one is giving you (comparatively) less for more.

    So yeah, I get it market wise. Just don't tell me it's like it used to be, because it isn't.

    TL;DR: My issue isn't that NVIDIA is upselling. Any company that could get away with it would do the same. My issue is people saying that NVIDIA is not upselling and that prices are the same as they used to be for similar manufacturing costs.

    /babyrage
     
    Last edited: May 9, 2017
  13. rm082e

    rm082e Master Guru

    Messages:
    717
    Likes Received:
    259
    GPU:
    3080 - QHD@165hz
    *sigh*

    [image]
     
  14. grndzro7

    grndzro7 Member

    Messages:
    23
    Likes Received:
    0
    GPU:
    7850 @ 1200/1400
    It isn't. The rumors started when Hynix pulled its 2GHz HBM2 from its website because AMD has an agreement for all of Hynix's top-end HBM. There is no basis for the HBM2 shortage rumor.

    I wrote a blurb based on https://www.youtube.com/watch?v=m5EFbIhslKU&t=3s here https://www.reddit.com/r/Amd/comments/69voz3/opinion_if_vega_performance_regardless_of_the/dh9wh5m/

    It should fill you in on why Vega is such a big deal.

    Nvidia has abused its TWIMTBP program by locking AMD out of game optimizations through withholding critical code, adding in stupid stuff like tessellated walls and tessellated terrain below water level, and cheating on graphics quality to dominate benchmarks.
     
    Last edited: May 9, 2017
  15. fry178

    fry178 Ancient Guru

    Messages:
    2,067
    Likes Received:
    377
    GPU:
    Aorus 2080S WB
    Not saying NV isn't trying to make the most from selling their products, but looking beyond the top-end model (which is neither the largest market share nor an indicator of price increases) and comparing prices for the 2nd and 3rd fastest cards over the past 15 years, I don't see it.

    Besides that, check how much prices for bread/groceries went up over the past 10-15 years.
    How many of you complaining about NV milking are happy we're paying more to get lower-quality food products??
    Cutting costs there will save you more than spending 100-200 less on a GPU that I keep for years...
     

  16. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    I'm not asking why Vega is such a big deal. I understand all the technical improvements made to the architecture. Solfaur said that 20K is nothing. I don't know if I agree with that, but w/e. MorganX responded "You gotta start somewhere when you innovate", implying that some innovation of AMD's is limiting its supply to 20K.

    You mean like when AMD added TressFX code into Tomb Raider (2013) at the last minute and screwed up Nvidia's performance?

    http://www.eurogamer.net/articles/2...b-raider-pc-players-plagued-by-geforce-issues

    Or maybe you're referring to this?

    Nvidia got it less than a week before shipping, AMD got it two months before shipping.. or did they?

    https://www.youtube.com/watch?v=-i8K5M98eME

    Weird. Here is a demo of Hairworks in The Witcher 3 a year before it was released. It's possible the code got added a few months before launch, but we wouldn't know, because when Project Cars came out, this thread was made:

    http://www.reddit.com/r/pcmasterrac...my_word_if_we_dont_stop_the_nvidia_gameworks/

    And AMD's Richard Huddy responded on twitter:

    "Thank for supporting/ wanting an open and fair PC gaming industry." to the thread.. which of course blew up everywhere about how Nvidia was sabotaging performance again.. well it turns out AMD doesn't check in very often with developers:

    https://arstechnica.com/gaming/2015...s-completely-sabotaged-witcher-3-performance/

    So maybe the code was added two months before, or maybe AMD only checked it two months before release. All I know is the source for Hairworks is now available, and last time I checked there was no major gain in performance from any driver AMD released in response to it. Turns out the performance issue was just the poor geometry performance of their architecture and not some intentional ploy, as Richard Huddy suggested with this statement:

    "We were running well before that... it's wrecked our performance, almost as if it was put in to achieve that goal."

    Also, the Crysis 2 water bull**** has been debunked a hundred times. When you turn on wireframe mode it disables the culling. Both the CryEngine developers and people who mod the engine have stated this.

    As for the tessellated walls:

    Here is the default Crysis setting: http://abload.de/img/fulltessellation2jrki.jpg

    Here is AMD's "Optimized" setting:
    http://abload.de/img/amdreducedtessellatioclrrn.jpg

    So much for it not being necessary. Similar issues happen when you optimize tessellation with Nvidia's Godrays. Areas around dense objects like fences look like crap.

    Don't get me wrong, Nvidia has done its fair share of crap - but recently there has been a ton of misinformation and nonsense spread about stuff like this and I'm tired of reading it.
     
    Last edited: May 10, 2017
  17. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    Yeah, I thought the general consensus these days was that the Crysis 2 tessellation scandal was just lies made up by AMD fanboys and people with no understanding of culling.

    The misinformation is what puts me off switching to AMD; it just feels like a distraction used to push AMD GPU sales. Even things like async compute have been hijacked and are now just a marketing gimmick.
     
  18. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    Believing that AMD are the "good guys" is just as blind as refusing to believe that NVIDIA is fleecing everyone with the current prices for the hardware on offer.

    Just a couple of years ago, AMD locked everything below a 7970 out of VSR for no reason at all. There were people with HD 4000/5000/6000 cards using it just fine. They even attempted to cut it for the 7000 series with some cr*p about "missing scalers", which was proven wrong by people like me running 7970s modded into 280Xs, and by the simple fact that the feature was literally working on everything. Even the current limits on it are artificial. Soon after, no official AMD rep remained on this forum, and everyone moved into the walled garden of ignorance that is the official AMD forums - a place where, if your technical knowledge is enough to call them on their bullsh*t, you get shadowbanned.

    They never acknowledged how bad their DX11/OpenGL driver is/was.

    They did the same in the past with supersampling AA support on older cards and gave equally cold explanations, along with truths about the "market".

    They are as sh*tty as NVIDIA or worse; they just lack the market size to step on their clientele like that. My only argument for them is that after all this time having to promote and support open software (because there was no other way), they most likely have a more open and collaborative company culture than NVIDIA, because it was an adapt-or-die issue for them.
     
  19. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    It is amazing that this still gets put out as fact. The developer, Crytek, pointed out that culling was disabled in wireframe mode, which is the only time you'd see it.
     
  20. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    I agree with everything you said except the above quote. I don't think AMD are more open or collaborative. Many devs have already shared their experiences with AMD with us, and there are a lot of negative comments.

    For many devs, it seems like AMD doesn't really care. This situation might have changed in "very" recent times, but the fact of the matter is that Nvidia is, and has been, much more supportive of devs. If a company comes forward, answers questions, and helps to resolve issues, is it any wonder that those games run well on Nvidia? I do think it comes down to money again; Nvidia obviously can afford to offer support and even an engineer or two when needed. However, I don't think this hands-on approach by Nvidia deserves much criticism.

    I would love to know who those devs are who turn down the chance to get their games up and running on Nvidia... anyone?
     
