Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 10, 2019.
How about we all just wait instead of nitpicking what is true or not. Feb. 7th is not that far away.
I was comparing it to my overclocked 1080 Ti. 60fps is the minimum I hit on my 2080 Ti, averaging in the 70s with some games hitting the 80s to 90s. Whether or not you personally deem it worth it is a completely different matter and really only up to you. I personally would rather play my games at 60fps now than wait, so I bought one. Same goes, I would imagine, for everyone else who bought one. No sense arguing this further, I guess.
Navi is going to be a 1440p and 1080p part in its 1st gen. Work is being done on the controller IC, as Navi is a "chiplet"/SoC design.
2nd gen Navi will be sold as high-performance multi-core parts.
Navi's 2nd-half launch this year is for 1080p and 1440p. Early next year we should have higher-performance models. After that we should get Arcturus. At least, that's what all the rumors have pointed to.
Do you have any links for your claim about 2nd gen Navi being multi-core? Everything I have read says this is still years off. It is a lot more difficult than multi-core CPUs. I know both AMD and Nvidia have been working on it for a while, but I haven't seen anything recent about it.
I get that Ray Tracing is only in one game and people can't fully grasp the potential. The thing is, Ray Tracing is not going away. AMD will be supporting it soon as well, so rest assured this isn't some PhysX-type crap. I strongly disagree with AMD that it's too early to get the tech out to the public. I think that's just an excuse because AMD has simply been lacking at the higher end for a bit. So yeah, to them it's not time because the tech isn't well served at $200-$300 yet.

Nvidia and game developers are giving us a taste of what Ray Tracing can do. Sure, it's expensive, but you don't have to buy it. They are pushing game development more quickly in the direction of Ray Tracing, and that is a positive thing. Yes, it's a tech that doesn't represent good value at this time, but what new tech does at its first consumer release? Nvidia and DICE have already made great strides on Ray Tracing performance since BFV's first DXR launch.

Early adopters who can afford these new cards pave the road for more ray tracing and encourage game developers to include the tech. Most people don't understand that; they just see money going into the pockets of Nvidia and that's it. More RTX sales and more DXR use will actually provoke AMD to compete harder as well, and let's not forget Intel. There is actually a lot of value in buying these cards if they fit the budget. Boycotting RTX cards will actually slow Ray Tracing development, IMO.

Ray Tracing performance will always be worse, but it will definitely be nice when I can enjoy ray tracing at 144Hz QHD for $800, let alone cheaper, lol. I am a high-refresh gamer and yes, I want those frames, but I appreciate the change Ray Tracing is bringing, even though we've only seen fairly simple examples to date.
Nvidia has these 1100 cards ready just in case the public confirms they don't want RTX lol.
I'm getting a bit tired of seeing that particular claim bandied about, so I'm going to have to address it to get it out of my system. To be clear, I'm not talking about your post specifically here, so don't take it personally.
This is an enthusiast forum. I'm pretty sure that if any one group of people understands the potential of realtime raytracing, it's going to be the one that frequents this place.
Potential means only that, however: potential. That potential isn't being realized right now, nowhere near it in fact, and it never will be on Turing. Everyone knows this; that's not a dig against the technology as such.
I actually think both AMD and Nvidia got it right this time around, for different reasons. AMD is correct in that the technology is going nowhere until there's wide adoption of hardware support for it, while Nvidia's implementation with dedicated hardware made the most sense at the enthusiast end where it's not going to be a dealbreaker one way or another.
My own feeling on the matter is that realtime raytracing will remain an afterthought or gimmick until it's supported on the consoles, which hopefully will be as soon as the next generation, but we'll have to see about that. The reason being, that's where the money is: no developer is going to spend a lot of time/effort/money on raytracing-exclusive functionality when it's only available to the PC market - let alone the enthusiast end of the PC market.
That, again, isn't a dig at the underlying technology itself, and it certainly doesn't mean that I (or anyone with similar ideas) don't "grasp the potential" of realtime raytracing. It's the future for sure, but that means it's not the here and now.
It's basically a value-add at this point, you might get some additional graphical options in a few high-profile titles thanks to Nvidia's efforts but that isn't shifting any notable number of Turing cards on its own. The general performance is.
I think that's a really fair analysis of where we are at when it comes to Ray Tracing. And a much better post than some of the "oh, it just sucks" crowd's.
Thank you, that's very kind!
Honestly, I was just so incredibly tired of seeing every criticism of the current realtime raytracing implementations being dismissed as ignorance, as if we're all a bunch of luddites here. I feel it's perfectly realistic to be enthusiastic about the long-term implications without being sold on what's available today.
It's not about that - we all saw the potential of ray-tracing in the Star Wars demo, which was very impressive. If Turing was able to deliver that level of IQ in current AAA games at playable framerates then it would have been universally acclaimed. As it stands, it only provides minimal IQ improvements at a severe performance hit. As I said, it's not the technology that people hate, it's the current lackluster implementation.
Except that it probably won't. Ask any RTX owner if they enable ray-tracing in their games (currently only BF V) and the answer will almost always be no. People bought the 2080 Ti because it was the best 4K gaming card available, not because of RTX (they are not some virtuous group who are using their money to pave the way for the rest of us, they just want the fastest card that their money can buy). It's also clear that developers are in no rush to implement the technology - it still boggles my mind that they didn't have a single RTX title at Turing's launch.
I seriously doubt the existence of the 11-series cards. No real reason for them to segment their product line like that, especially at the high-end.
Yes, it is extremely hard. They had prototypes of Polaris and Vega that were distinctly not ready for prime time due to latency issues.
The 2nd-gen "Infinity Fabric" has proven its value as an interconnect, but no controller IC has been fabbed smaller than 14nm yet. As you know, silicon "real estate" is at a premium and IC design is hard.
"Arcturus" will be the optimized 7nm+ SoC, but we will see a "super-Navi" (multi-core) first, as more than a proof of concept. There are some specific things I'm not able to say under restriction.