Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 14, 2018.
Ah, if that's the case, then here in the UK the NDA will lift at 2pm, because Finland is two hours ahead of the UK when UK is on British Summer Time (BST). Will be some interesting afternoon reading!
To those who think the price is too high:
basically, the deal is that GPU designs require exponentially more transistors,
and the cost per transistor is no longer falling with every node shrink the way it used to.
The cost per transistor has remained flat or even increased since 28nm.
An example of the effect this has:
the 980 Ti has 8 billion transistors, while the 2080 Ti has 18.6 billion transistors, so it should be no surprise that it costs a lot more (even without yield considerations).
This is going to be the future of GPUs across the board, I'm afraid (AMD and Nvidia alike), so long as transistor counts keep increasing and no major breakthrough on the physics side occurs.
Price will increase with performance from now on for the most part, at least it seems that way.
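The arithmetic behind the post above can be sketched quickly. The transistor counts are the ones quoted in the post; the constant cost per transistor is the post's assumption, not a measured figure:

```python
# Rough sketch of the argument above: if cost per transistor has stopped
# falling, die cost scales roughly with transistor count.
# Counts are the ones quoted in the post (8B vs 18.6B); holding the
# per-transistor cost constant is the post's assumption.

GTX_980TI_TRANSISTORS = 8.0e9
RTX_2080TI_TRANSISTORS = 18.6e9

def relative_die_cost(new_transistors: float, old_transistors: float) -> float:
    """Ratio of die costs when cost per transistor is constant."""
    return new_transistors / old_transistors

ratio = relative_die_cost(RTX_2080TI_TRANSISTORS, GTX_980TI_TRANSISTORS)
print(f"2080 Ti die costs roughly {ratio:.2f}x the 980 Ti die")  # ~2.33x
```

So even before yield losses on the larger die, this simple model already implies more than double the silicon cost.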
That was the Tesla architecture. The first dedicated compute card from Nvidia was also named "Tesla". The Tesla architecture was used for the GeForce 8 series, GeForce 9 series, 200 series, and the (OEM-only) 300 series cards.
Interesting, although we have to remember that the GPU silicon is only part of the cost of the card. You also have VRAM, power-supply circuitry, the board, the cooling solution, assembly, and research costs in the final price (as well as other things I've not listed). So a doubling of the production cost of the GPU core from 980 Ti to 2080 Ti won't make the overall production cost twice as high - it depends on what percentage of the whole card's production cost the GPU core represents, and I don't know those numbers. (And I'm assuming the graph has been adjusted for inflation.)
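The point above can be made concrete with a toy model. The 40% die share used here is made up purely for illustration; the real bill-of-materials split isn't public:

```python
# Hypothetical illustration of the point above: doubling the GPU die cost
# does not double the whole card's cost; the increase depends on the die's
# share of the total bill of materials. The 40% share is an assumption.

def card_cost_increase(die_share: float, die_cost_multiplier: float) -> float:
    """Overall card cost multiplier when only the die cost changes.

    die_share: fraction of total card cost attributable to the GPU die.
    die_cost_multiplier: e.g. 2.0 for a doubled die cost.
    """
    return die_share * die_cost_multiplier + (1.0 - die_share)

# If the die is 40% of the card's cost and its cost doubles:
print(card_cost_increase(0.40, 2.0))  # 1.4 -> card costs 40% more, not 100%
```

The smaller the die's share of the total cost, the more the other components (VRAM, board, cooler) dilute the increase.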
If price increases every generation are going to become "the new norm"... the market will likely start shrinking as a result.
The cost of the actual components has also increased, since the boards are more complicated than in previous generations (for power-efficiency reasons). Take the increased cost of raw silicon, increased lithography cost (lower yields due to larger chips), increased DRAM cost, increased board cost, and increased R&D cost, and it paints a pretty clear picture as to why graphics cards are getting more expensive.
The number of customers is also increasing; the market can still grow, even with increased cost, albeit more slowly.
It's being fixed, but it might affect some of the reviews measuring idle voltage, depending on how much higher it is.
The 'gaming market' may increase, but probably more in the direction of consoles than PCs (if the latter's higher cost continues).
Hi @Hilbert Hagedoorn
Can you add Blender and V-Ray GPU rendering to your testing suite? I would really appreciate it.
More and more people use their GPUs for rendering, like I do.
Thanks in advance!
Lots of folk are still eager to upgrade every year for the latest and greatest. Though I think that will slow down on the high-end cards.