Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 14, 2018.
Where did you get this information from?
Do you remember what happened the last time nVidia took the risk of being the first adopter of a rushed manufacturing process? In case you don't recall, it was back in 2010, and codenamed Fermi. nVidia isn't going to let that happen again, which is why they had the foresight to have designs for both a 7nm and a 12nm GPU ready for the fab as soon as they knew TSMC's timetable.
If anyone was lacking foresight and proper planning, it was AMD. They put all their eggs in the 7nm basket, and now they're standing around with their d**ks in their hands with no new GPUs, which means nVidia is going to completely own the holiday season this year.
Actually, Qualcomm and MediaTek have postponed 7nm chip production at TSMC until next year because the costs are prohibitively high. This partly explains why the new iPhones are so expensive. So if it's too expensive for mobile phone companies to have chips produced, imagine how expensive it would be to make these comparatively massive CPUs and GPUs for AMD. Considering the fact that we won't see any AMD GPUs until 2H 2019... it's quite clear that they were the ones who didn't plan ahead.
Riiiight. I'm sure they're just terrified of them releasing a few Vega 20 cards in 2018, with no gaming cards in sight until at least 2H 2019. nVidia is playing it smart by letting AMD go all-in on a super expensive process; once the price begins to come down and yields are high, they'll have already taped out their next chips and will be in production by the time Navi hits.
You have absolutely no idea how many cards they pre-sold. Until you can provide those numbers instead of pure speculation, you have no argument. I, on the other hand, was down at Microcenter the other day and asked one of the guys who has been there for many years how many pre-ordered cards they sold for them to be completely out, and he said "hundreds".
And why would you assume the initial stock is so low? It's not like they're banking on a super expensive 7nm process like AMD is doing. If you are correct about nVidia's cards being in very short supply, then these upcoming AMD chips are going to be practically non-existent.
Just the other day Ryan Shrout from PC Perspective said:
His rear end. Lol
I am sure they did not create a false sense of demand. Fact is, people don't know what to expect and assume a 50~100% performance jump. They expect fluent, high-quality ray tracing at 1440p.
Many will be very disappointed.
(Since you seem to talk for it.) ...Since when is my rear end your mouthpiece? Or are they just very well acquainted?
It's typically the case with most new GPU releases that they begin with low initial stock and build up inventory to meet demand. It happened with Pascal too (it took me months to buy a 1070). I'd imagine even more caution in the case of a significantly higher-priced product (one that has already proven controversial as a consequence). Couple that with the already huge existing inventories of Pascal, and it would be stupid of them to rush in with massive stocks of a new, higher-priced product that may not meet with the same enthusiasm as the last.
The point still holds that being OOS after selling 1,000,000 units is not the same as being OOS after only a few units. It was not an assessment of how many cards were sold; it was a POINT that you just didn't seem to grasp, only offering hearsay that Microcenter had "hundreds" pre-sold. Finally, however mature the process yields may be, Nvidia building up massive inventories of a new product with controversial pricing (as a consequence of high yields) is not the same thing.
As many believe, Moore's law is dead. On the GPU side, we still can't run all the AAA games on a 4K display at comfortable frame rates, even with a 1080 Ti. That is why many have waited a long time for a new card that improves the frame rate by at least 50%. That is pretty doable with the 7nm node. But instead, Nvidia is releasing new cards that introduce a new variable into the frame-rate equation: ray tracing. Their best card, the 2080 Ti, can run games at only 30-60 fps on a 1080p display with ray tracing on. Since they haven't used 7nm yet, the next generation will improve by maybe 50%. But that only means the best card of the next generation can run games at 30-60 fps on a 1440p display.

Is it possible to shrink the node further than 7nm in the future? CPU development seems to have abandoned the idea of improving per-core performance; instead, both Intel and AMD are fiercely working on multi-core CPUs. If Nvidia or AMD fail to introduce GPUs on nodes smaller than 7nm, can they really achieve comfortable frame rates such as 60-120 fps on a 4K display with ray tracing on at some point? Can they really release such a product for the consumer market? If not, the introduction of ray tracing is merely an experimental side business rather than a serious venture to benefit the demands of consumers, except for some professional users.
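The 1080p-to-1440p claim above is just pixel arithmetic: fps scales roughly inversely with pixel count, so an assumed ~50% per-generation uplift barely offsets the 1.78x jump in pixels. A toy back-of-envelope sketch (the 50% uplift and inverse-pixel scaling are this thread's assumptions, not measured numbers):

```python
# Rough fps projection: scale by an assumed generational uplift,
# then by the ratio of pixel counts between resolutions.
PIXELS = {
    "1080p": 1920 * 1080,   # 2,073,600 px
    "1440p": 2560 * 1440,   # 3,686,400 px
    "4K":    3840 * 2160,   # 8,294,400 px
}

def projected_fps(fps_now, res_now, res_next, uplift=1.5):
    """Naive model: fps is inversely proportional to pixel count."""
    return fps_now * uplift * PIXELS[res_now] / PIXELS[res_next]

# 2080 Ti at 30-60 fps, 1080p, RT on, plus a hypothetical 50% uplift:
low = projected_fps(30, "1080p", "1440p")    # ~25 fps
high = projected_fps(60, "1080p", "1440p")   # ~51 fps
print(round(low), round(high))
```

So under these assumptions a 50%-faster successor lands at roughly the same 30-60 fps band one resolution step up, which is exactly the post's point.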
I think another thing people need to remember in all these discussions of fabs and chip sizes is the importance of branding. AMD will likely need to push ahead to try to re-establish some kind of reputation for being cutting-edge and better in tech, and to do so is probably taking the risk of getting onto the new node before its competitors. Intel and Nvidia are great corporations for themselves (strong focus on proprietary tech, monetising everything they can, partnering directly with companies for their own benefit whilst discouraging use of rivals, etc.), with strong market control/monopoly, and they spent many years building up their reputations whilst simultaneously trashing AMD. This carries loads of weight in the market, and as long as they do not underperform compared to rivals, they will likely maintain reasonable market demand whatever manufacturing process they use.
I think it is important to see that whilst we can debate tech at a certain level and speculate about performance and what the future holds, the other massive variable in all of this is what the average end user (not the tech enthusiasts and beyond who congregate here) perceives to be the best option. Advertising and branding carry a huge amount of weight in this area, which directly influences the spending of the average consumer.
The most important feature of this gen will most likely be DLSS.
They do have a few games lined up, so it's off to a good start, but the AI neural network and training sound like they're going to impede adoption, what with PC ports and release schedules and needing time to run through the game over and over for the AI to learn and build up support to the point where it works at its utmost. Maybe profile flags could be a thing, but I seriously doubt it would look good overriding that with some other supported game.
(Or maybe automating it somehow, less manual work aside from fine-tuning, but that might impact overall quality, though perhaps only slightly.)
It's definitely a nice software and hardware achievement, though, integrating deep learning to reconstruct the scene and apply anti-aliasing in this way.
(32x by default and up to 64x via DLSS x2, per the whitepaper, as an option.)
Though I suppose that goes for many other features too, with the advantage for NVIDIA here being more resources and bigger market share, so it can get these implemented with more hands-on engineering and support for titles using these features.
EDIT: And details and props, I suppose. Later in development, removing or changing some scenery might mean having to re-run this so that there are no differences in the scene.
Or patches altering or changing stuff for performance reasons, although that's usually less extreme than mid-development changes before the game is finalized.
(Essentially not having any mismatched information due to any such tweaks.)
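The per-game training loop described above can be sketched in miniature. This is a toy pure-Python illustration of the idea only: a nearest-neighbour upscale stands in for the actual deep network, and every name here is hypothetical, not NVIDIA's implementation. The shape of the process, though, matches the posts: render low-res input, reconstruct, and score against a supersampled ground-truth frame (which is why scenery changes force a re-run).

```python
# Toy sketch of DLSS-style training: a low-res frame goes in, a
# reconstruction comes out, and the loss against a supersampled
# ground-truth frame is the training signal.

def downsample(frame, factor=2):
    """Simulate the lower-resolution render the network receives."""
    return [row[::factor] for row in frame[::factor]]

def upsample_model(frame, factor=2):
    """Stand-in for the trained network: naive nearest-neighbour upscale."""
    return [[px for px in row for _ in range(factor)]
            for row in frame for _ in range(factor)]

def l2_loss(pred, target):
    """How far the reconstruction is from the ground-truth frame."""
    return sum((p - t) ** 2
               for pr, tr in zip(pred, target)
               for p, t in zip(pr, tr))

# One "training step" on a fake 8x8 frame of pixel intensities.
ground_truth = [[(x + y) % 7 for x in range(8)] for y in range(8)]
low_res = downsample(ground_truth)
reconstruction = upsample_model(low_res)
print(l2_loss(reconstruction, ground_truth))  # nonzero: the toy model is crude
```

If a patch alters the scene, the ground-truth frames no longer match what the game renders, so the loss is computed against stale targets; hence the re-training concern raised above.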
I wonder what these so-called "RT cores" actually are; seems kinda weird, given that the GV100 also supports RTX raytracing but has no mention of RT cores on its block diagram.
These cards will end up going the same way as washing powder, new ultra ultimate washing powder, better than their last ultimate washing powder. lol
But they are priced out of my comfort zone for bang for buck. My present £470 GTX 1080 was above my max as a hobby PC gamer, and only the fact that I got £120 for my 970, taking that £470 down to £350, made me bite on the 1080. But Nvidia have priced me out of their card market, as I could simply never happily spend between £700 and £1,000 on one PC component. Those are just ludicrous prices for me to pay for one part of a PC that simply offers a few more FPS and some extra pixels on screen over the last cards. Bang for buck, it seems, has gone out the window. lol
Thanks for the preview Hilbert. Good read.
Still not sold on the cards... I don't really need that increased performance as I don't think I'll make much use of RTX in the next 12 months or so at all, and I'm running 1440p, not 4K, and I don't plan to buy myself a new monitor anytime soon. Might just wait until things settle and buy a card plus a waterblock later on.
I don't know what to do...
These look great, but I do not like the prices and it feels like all of this will have a lot more room to breathe with 7nm at the end of next year.
I'm with you on this.
We all know the GTX 970 was the all-time best-selling GFX card ever to hit the market; all I'm saying is that the 2070 won't sell as well as the 970 did.
I never said the 20XX would be a total flop; I'm saying I doubt they'll be anywhere near as popular. The 2070 is an RTX card that will suck at producing decent fps when RTX is on.
We've already seen leaked results from the 2080 Ti with RTX on, and it's not looking good for the 2070 when RTX is enabled. I think I have a valid point and won't be eating any crow anytime soon.
RTX 2080 Ti delayed a week
(for people that haven't pre-ordered)
yeah, like CES in January.
And as for the wafer... and the end of Moore's law: Nvidia is being dragged toward the realization AMD had four years ago, that scalability and multi-module (low-latency) SoCs are the only way forward until we have the technology for nano-3D printing or something analogous.
The reason (the real and only one) for this is the simple fact that AMD was in the fab business for years; they are fully up to speed with all foundry technologies.
And with that technological base, they realized they were fast approaching a wall and came up with "Infinity Fabric"... which forced Intel, and to a different degree Nvidia, to reappraise IC design.
Nvidia and Intel were intent on "brute forcing" everything and, to their credit, are really quite good at it.
But both would be better (and better off) using their design skills on GPU cores (a la "Zeppelin"), as then they could actually kill off AMD.
Luckily for AMD, they patented their tech.
Like me... I've bought Pascal, Maxwell, and Fermi from Nvidia... but I'm a-waitin'-and-a-seein' on RX Vega 2, and, for my laptop, Navi.
Chances are good I still stay green, but I may see red.