Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 5, 2019.
Ahh, the smell of shill is strong. Colours are being loudly displayed. Popcorn at the ready...
If you want "future proofing", buy a console. Nvidia cards are even worse atm, because they are not even on 7nm when it comes to "future proofing", or given their RT performance. So stop bashing AMD for no reason.
7nm has nothing to do with future proofing.
Money aside, the best "future proofed" card would be the RTX Titan, and that's not on 7nm.
I really have zero clue as to why you even put 7nm in there.
A video card could be on 100nm and the size of a football field (yes, I know, not realistic) and have more future-proofing than a GPU that's on 7nm, since... the nm has nothing to do with something being future proof.
By that logic, a 10nm Core i3 8121U 2-core, 4-thread, 2.2GHz processor is more "future proof" than any other currently released Intel processor, even though the 14nm Core i3 8130U, also 2-core, 4-thread, 2.2GHz, both beats its performance and by far beats its power consumption...
Now, having ray tracing, regardless of whether you personally are unhappy with its current performance, does add a factor of future proofing to the RTX series GPUs that the RX 5000 series of AMD GPUs will never have. It doesn't matter what your opinion is about the performance, because (unless something changes, which could happen) the RX 5000 series will never have ray tracing capability. Which means that in the future, if a game comes out, with an RTX 20 series GPU you can choose to turn ray tracing on or off, whereas you can't with an RX 5000 series. This is by definition "future proofing".
AMD could enable ray tracing, as Nvidia has done on its GTX 10 series, but the performance, just like with the 10 series, would still be abysmal in comparison. This would be another definition of "future proofing".
Now, AMD could come out with software-based ray tracing, or whatever it is they recently patented, which could change this. But considering this is unknown technology that will work on unknown cards at unknown performance, it really can't be considered a "future proof" feature of the RX 5000 series, since we literally have zero clue as to what it is... at all, realistically.
Consoles are hardly future proofed, as the same applies to them. Cards don't have to be on 7nm in order to perform better than the competition, much less to play RT games. The current trend is to replace cards every 2-3 years, and by that time all vendors will have cards that perform much better.
If you consider this bashing take a look at the "SUPER" thread.
Edit: Try to keep your own bashing in check.
With consoles you know it won't change for the next 5 years. Someone who buys the 2020 consoles doesn't want to buy new hardware until 2025 at least. Meanwhile on PC you need to replace the GPU every 2 years at least, and with current prices GPUs cost as much as a console, if not more...
As for my post you quoted, I at least based my argument on facts which Amazon put up (I have replied to that post).
You just make wild guesses and retort without a single fact beyond your own opinion.
There is another option: stay one tier lower in resolution. 4K is a matter of choice and sacrifice. 1440p is OK if one is willing to pay. 1080p allows for great fps now, and in time still OK fps with an older GPU.
There is even a good chance that strong GPUs will become technologically obsolete before their performance becomes too low for 1080p.
That's false, because the 2020 console hardware may not support the 2021 - 2025 console entertainment requirements, though it might. Unless you have a crystal ball there is no way to know.
Is this a fact you established in the post I linked, or just a wild guess?
While you can make statements like this, they are not meaningful before the Navi NDA has lifted and reviews are available.
The right question is: who's going to make those "better" 2021 consoles, and for what company?
If one is talking about the 2020 Xbox and PS built on AMD, what are those "better" consoles that would support those "2021 - 2025 console entertainment requirements" going to be? Some new company will make chips for them? And what company will order their creation? A "Next-Gen Console Consortium"?
Chances are that once MS's and Sony's consoles are out, there are not going to be other targets for consoles for a few years. Those consoles may miss some feature. But who else would bring such a thing to market?
I'm sorry, but what does that have to do with the topic at hand? Did you really just run out of arguments and throw a jab at me personally?
I'll give you the benefit of the doubt and address your concerns regardless.
I wasn't ripped off buying a 1050ti, since, like I said before, nvidia has a better OpenGL implementation in Windows. If you couldn't tell, that's a big deal to me since I use emulators heavily, so right off the bat AMD wasn't a good option.
Also, prices aren't linear across the world. I live in Brazil and here prices can get pretty crazy and different from other regions. So, in my case I ended up buying the 1050ti for about 60% of the price of a 1060/rx580, which were too expensive for me.
Regarding the high end, I'm not sure I follow your argument. We're talking about similarly priced cards here, and I stated that for (about) the same money nvidia is offering more than AMD. What does that have to do with having or not products in the high end tier?
That's exactly it. You went on and on about how one is better off with nVidia because of OpenGL. But at that price point, even OpenGL would run better on AMD's side. For the price of a GTX 1050 Ti 4G one can have an RX 570 8G or an RX 580 4G.
(Do yourself a little comparison. Or not. But that's ~2x the performance with the 570 and ~2.5x with the 580 - see the rough perf-per-dollar sketch after the prices below.)
You are apparently happy with 4GB VRAM - Currently (excluding tax):
RX 570 4G - Mini Size: $117
RX 570 4G - Standard Size: $128
GTX 1050 Ti 4GB - Standard Size: $146
GTX 1050 Ti 4GB - Mini Size: $161
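To make that concrete, here is a minimal back-of-envelope perf-per-dollar sketch in Python. The prices are the ones listed above; the relative performance multipliers are just the rough ~2x figure claimed earlier in this post (1050 Ti = 1.0x baseline), so treat them as assumptions, not benchmark data:

# Rough perf-per-dollar comparison from the prices above (USD, excl. tax).
# relative_perf values are the ballpark multipliers claimed in this post
# (GTX 1050 Ti = 1.0x baseline, RX 570 ~2.0x), not measured results.
cards = {
    "RX 570 4G (mini)":           {"price": 117, "relative_perf": 2.0},
    "RX 570 4G (standard)":       {"price": 128, "relative_perf": 2.0},
    "GTX 1050 Ti 4GB (standard)": {"price": 146, "relative_perf": 1.0},
    "GTX 1050 Ti 4GB (mini)":     {"price": 161, "relative_perf": 1.0},
}

for name, c in cards.items():
    perf_per_dollar = c["relative_perf"] / c["price"]
    print(f"{name}: {perf_per_dollar * 100:.2f} perf units per $100")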
But there are no facts there.
You know how many times retailers accidentally put up pages with completely wrong details and prices on them? All. The. Time. And the fact that you're unwilling to reply to posts that call you out on this, because you have nothing to back it up other than an "oops!" from a retailer as your grand facts, is very telling.
Plus the fact that you, in this thread, claim 7nm is a "future proofing" technology, as if it had anything to do with anything in the future being able to better utilize the "7nm" in a product.
It would definitely be AMD providing the consoles. The question is whether we will see other advanced GPU features (like RT, VRS) adopted on video cards in the 2021 - 2025 time frame, and whether consoles would be able to wait until 2025 or later to implement them. Games will likely become more visually taxing, and whether or not a 2021 console has adequate performance would be a major question.
Console refreshes are not dictated so much by HW improvements as by marketing strategy. With a high release rate, people might as well just buy a PC.
People buy consoles because they guarantee that there will be new content for years. The moment that's no longer the case, consoles will lose their appeal.
As for performance and features... We can expect nVidia to push 20B+ transistors on 7nm EUV. Maybe 28B?
No console can compete with that anyway due to price constraints.
Like I said, my GPU has nothing to do with the topic at hand, so why do you keep bringing it up?
These prices mean nothing; when I bought the cards, the prices in my country were as follows (a quick ratio check follows the list):
RX 570 4G: 749 BRL
RX 580: 1200+ BRL
1050ti: 799 BRL
1060 6G: 1200+ BRL
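As a sanity check on the earlier "about 60% of the price" point, here is a quick Python calculation from those local prices. The "1200+" listings are taken at their 1200 BRL floor, which is an assumption, so the real ratios were at most these values:

# Price ratio of the GTX 1050 Ti against the cards I passed on, in BRL.
# The "1200+" prices are taken at their 1200 floor, so the true ratios
# were equal to or lower than what this prints.
prices_brl = {
    "RX 570 4G": 749,
    "RX 580": 1200,       # listed as "1200+"
    "GTX 1050 Ti": 799,
    "GTX 1060 6G": 1200,  # listed as "1200+"
}

for rival in ("RX 580", "GTX 1060 6G"):
    ratio = prices_brl["GTX 1050 Ti"] / prices_brl[rival]
    print(f"1050 Ti cost at most {ratio:.0%} of the {rival} price")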
While the RX570 is a much better card for regular PC games, I don't really care about that, since I hardly play AAA games nowadays and don't mind dialing settings down if I ever do. So I tend to buy what gives me better emulation performance, while not being terrible at regular PC gaming.
AMD cards are known to give much worse performance on emulators that use OpenGL - that's the case for CEMU, Citra, PCSX2, among others, which are either OpenGL-only or run much better with OpenGL, to the point where a 570 can perform below 100% game speed - and that's a big deal in emulation. So, again, in my particular use-case, the nvidia card was a far more interesting choice, and it still somewhat holds true today.
If you're out of the emulation loop, here are some quick examples of the bad performance:
CEMU - I got much better performance (25~30 FPS) on a GTX 750 (non-Ti)
@Ricardo : I keep bringing it up because you made a generic AMD vs. nVidia post, and it was not truthful.
On top of that, you accused someone of ignoring 90% of GPU users, and right in that same post you brought in the OpenGL minority and the emulator minority. And you have kept doing that since that hypocritical post #80.
I didn't make a generic statement: I clearly said that the SUPER line from nvidia offers more bang for the buck than AMD, because the nvidia cards have ray tracing and a better OpenGL implementation in Windows than AMD cards. These are hard facts and are relevant for people buying cards today.
You're the one seeing a "AMD vs. nVidia post" where there isn't one. You're also the one bringing the last gen to the topic as a cheap way of turning this into "AMD vs nVidia". And lastly you're the one throwing words such as "ripped off" and "hypocritical".
You might say these differences don't matter to you, which is perfectly fine - again, you do you. But the fact is that these differences exist, and for a lot of people they are relevant.
So, how relevant to 90% of GPU users are those 4 DX-R games when there are hundreds of non-DX-R games?
How relevant to 90% of GPU users is performance in emulators?
Sorry, those fit into something like a 5% category, or smaller.
Ray tracing will be in the next generation of consoles, so it's much more likely to be useful than you suggest. And, again, nvidia cards have no compromise compared to the current AMD offerings, so nvidia has equal traditional performance plus ray tracing. How is that not better value?
@Ricardo : Don't you get it? I am only quoting you so you can understand that you accused someone of forming an argument that ignores 90% of the product's audience.
And then you added 2 arguments, each of which ignores 95% of that same audience.
Going back and forth is apparently pointless, because you do not get it even when I call what you did by its true name. Instead you feel insulted.
I wanted you to understand; you try your best not to. (Maybe effortlessly.)
- - - -
As to your disinformation: Turing with the same transistor count as Pascal is much weaker in traditional games. Turing with RT is 16% weaker than Turing without RT in traditional games when transistor count is normalized. There is nothing like "no compromise".
People hated the RTX cards at launch because they delivered a mediocre performance improvement per $ over the previous generation. But most of them ignored that nVidia was not charging more per transistor than before. They actually charged less per transistor. They had to make bigger GPUs to accommodate the new functions while delivering the same performance as before, or much bigger ones to actually deliver higher performance than before.
Right now, look at the performance of the 5700 XT vs the Radeon VII. Look at their transistor budgets. Look at their power draw... on the same 7nm manufacturing process.
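As a rough illustration of what that comparison looks like, a minimal Python sketch: the transistor counts and board power numbers are the commonly cited figures for Navi 10 and Vega 20, while the relative performance value is an assumed ballpark of roughly equal performance, not benchmark data:

# Back-of-envelope "performance per transistor" and "per watt" comparison.
# Transistor counts (billions) and board power (W) are the commonly cited
# figures for these chips; relative_perf (Radeon VII = 1.0) is an assumed
# ballpark, not a measurement.
gpus = {
    "Radeon VII (Vega 20, 7nm)": {"transistors_b": 13.2, "power_w": 300, "relative_perf": 1.0},
    "RX 5700 XT (Navi 10, 7nm)": {"transistors_b": 10.3, "power_w": 225, "relative_perf": 1.0},
}

for name, g in gpus.items():
    per_transistor = g["relative_perf"] / g["transistors_b"]
    per_watt = g["relative_perf"] / g["power_w"]
    print(f"{name}: {per_transistor:.3f} perf per billion transistors, {per_watt:.4f} perf per watt")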
You know nothing at this point except some marketing slogans. I hope you can get better by actually doing your own research.
But this was my last effort towards you.
You are literally too Stupid to .... .... why do people like you even make it to the egg...