Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 4, 2020.
Oh dear, I hope they haven't overhyped it again.
And the 4080 Ti for $3000? And the 5080 Ti for $4000? What's the limit?
...people have to look at how products are priced versus what the market looks like and what the end consumer is willing to pay. All of this against the margin the company wants to make.
I just can't believe that Radeon has gotten it together enough to be truly competitive with Nvidia in the upper range. The last time they were even close in the upper range was the 290x, but even that was a stretch given how much of a hair dryer those things were. Obviously I would welcome the competition though. I just can't get my hopes up.
The limit is what people are willing to pay. If folks weren't willing to pay $1500 for a 2080 Ti, they wouldn't, and Nvidia would somehow need to compensate.
It's not just competition that controls prices, it's what the market is willing to pay as well.
The thing is, Kodak did invest in digital; a lot of the first sensor tech came from them (being a pro photog, I know it well). It just wasn't enough in the long term.
For Nvidia, gaming is still their main source of income, and if, for example, tightly integrated 3D-stacked MCM APUs become the future for gaming, they could be screwed. We honestly have no idea what the future holds for us, and while Nvidia is clearly safe in the short to mid term, seeing them as a juggernaut that can't fail even in the long term is wrong.
Personally, I am more interested in HPC than games, and I couldn't care less which CPU performs best in games at HD resolution. Once you have maxed out your monitor's refresh rate, it does not make one iota of difference. What I am really hoping for is to see AMD support CUDA. If Google can win their suit against Oracle regarding copyright of APIs, then it will only be a matter of time. If AMD can support CUDA, then the door will open to serious competition in the HPC arena. Until then, CUDA has a clear head-and-shoulders advantage over everything else. This is important because most new development is at the high end and innovations trickle down from there, and HPC is the high end of GPU development.
Indeed they did - I used one of their digital cameras. But they invested in digital too late - they couldn't keep up with the other digital giants.
I agree, but I don't think APUs will be a threat to Nvidia for a long while. And unlike Kodak, Nvidia is very much aware of what might happen. That's probably why they made the Tegra series.
I remember in 2017, when AMD launched Ryzen, all the doomsayers online were predicting a scam, a failure, etc., but the opposite happened: folks were left in awe at what Ryzen could do for the price compared to Intel. Lisa Su is a clever cookie, and a scientist at that. I think, and I hope, we're in for a surprise. If you look at the UE demo, there is clearly an element of RT in there and some serious number crunching going on, and that's on a console. I bet the PC versions really shine.
yeah I miss the old days, best days
While I'm seriously considering going AMD for my upcoming CPU change, I rely a lot on Nvidia features for streaming and recording videos.
Without Nvidia NVENC I wouldn't be able to stream at 4K or record my videos like I have been doing for years, without having to buy a capture card.
Having no alternative to NVENC is a deal-breaker for me.
I'm not hyping it much this time around, nor NV for that matter. It's all down to price vs. perf.
I have a 1080 Ti that, at $700, I thought was a bit ridiculous, but it was the only thing available that I believed would have the longevity for my niche need: VR. Beyond that, I game on a 75 Hz 1440p ultra-wide. The Titan series cards were a proof of concept to see how much they could ask gamers for, while telling them it wasn't for them. Then they just arbitrarily shifted the pricing up. After 4+ years, whatever is offered that has a marked improvement (50%+) over my 1080 Ti, for at most the same money, gets my money.
The concept of buying a company's lower-tier GPU on the strength of its halo GPU product is the most asinine thought process imaginable. That's just how you as a consumer build and prop up a monopoly, and increase prices. It's logical for a CPU, because you're locked to the platform, and there is a chance the top-tier price will come down to something reasonable/attainable. This continues to fail to be the case with Intel, though. G-Sync was Nvidia's platform lock-in strategy, and the only cause for its resistance to Adaptive Sync support (on the desktop) for as long as it lasted. There was never G-Sync hardware in laptop LCDs that supported it. Everything about the PC platform, short of the motherboard, allows for fluid purchasing decisions. A closed solution to an otherwise open feature is anti-consumer.
Sure, but total sales is a concern for any business as well. If Nvidia can sell 100k cards @ $1000, their volume will almost certainly fall if it's $1500, and they may get the same or less net profit than when it was $1000. They definitely would prefer more cards to be sold, as that solidifies their market presence.
I mean, you're not wrong and I would be interested in seeing that, but such slice would be waaaaay too specific, IMHO. There's too many variables to take into consideration there, like:
1) GPU drivers
2) build/version of rendering engine
3) set of rendering technologies used in said engine
In a time when each major game release comes with a new GPU driver for optimal performance, can we really single out each aspect of the rendering pipeline and GPU architecture as being (or not being) responsible for better or worse performance? I feel like that would be a dive too deep.
That's a very good point. I don't stream or do any kind of video stuff, but there is one setting in the NV control panel that I can't live without, and that is Digital Vibrance. It's always the first thing I enable (75%) when I install new drivers. It just makes everything pop.
The Nvidia driver will sell more cards again. AMD still hasn't even figured out how to enforce vsync, ffs.
Their OpenGL is a trash fire, and the DX11 driver STILL HAS OVERHEAD ISSUES, if you can believe it. There are games where throwing in the DXVK DLL from the Linux project improves performance on Windows, because the DX driver is so terrible.
They have an awesome Vulkan driver, though.
If you have a decently calibrated monitor, you may find no need to use DV.
I have a new monitor arriving at literally any moment so I'm gonna have a lot of tweaking to do today. If I can get the color settings that digital vibrance provides I may switch to an AMD GPU later this year. Of course price/performance is the key factor.
I do wonder when they will decide to ship the Mesa3D RadeonSI driver on Windows. It outperforms the legacy OpenGL stack by a long shot, and it's less buggy too, IMO.
Never gets old!