Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 20, 2020.
And RT will get faster than it is now. It still won't beat Nvidia, but it's a relatively good first try.
My RX 580 is like ~10% average MAX
sounds about right
Some show 10%, some 5%; I'd say 7.5% is a fair average.
It's good enough to challenge the 3080, but RT performance and DLSS will keep Nvidia on top this time.
For a card that's over 30% faster than a 2080 Ti, right... let's buy one for 1080p.
some people will always pick results depending on what brand they cheer for
Fortunately for us consumers (because competition is good), Nvidia is not Intel: they don't slack off or fall behind nearly as much, even when they get caught up. So after seeing how good RDNA 2 is, and with Nvidia literally being punched by the Radeon 6000 series, it's almost a given that the RTX 4000 series will use a much better node (TSMC 7nm/7nm+ at least, if not 5nm) and make the best of it. So I expect fierce competition, with even better performance jumps than this gen, from the next generation coming in late 2021 or 2022.
You can buy a high-end GPU even for 1080p if you want to play on Ultra at a high refresh rate and not care for 4-5 years. This is what a lot of people can't seem to comprehend: you can buy an overkill GPU for a lower resolution to last you more years, instead of buying a new one every 1-2 years because it's only just good enough right now...
Yes, but how does that make 4K irrelevant for the 3080?
I'm not sure how a 3090 can last much longer than a 3080. They perform so similarly that when the 3080 can't hit 60 fps, the 3090 will be barely above it.
luckily both will last 4 years easily
The issue at 4K with max settings will be VRAM on the 3080, much sooner than hitting 60 fps will be an issue... in the average game, anyway. But in very demanding games such as Valhalla, the 3090 gets 60 fps where the 3080 gets 50 fps - that is a fairly substantial difference.
I'm personally happy there are 4 fast cards to choose from.
Now, if my 1080 Ti can't hit 144 fps at 1440p in the new WoW game, then I have to upgrade.
So my next monitor is going to be 2560x1440. This generation's cards again take a massive performance hit with RT, even at the highest end, which makes the choice obvious (for me): turn off RT and unleash 30-40% more FPS.
So why would I go 3080? It uses 100 watts more, is slower, has less RAM, and costs more too.
Wait for Cyberpunk 2077 performance review. RTX + DLSS2.0 on Nvidia will run circles around anything AMD can offer.
amd need a dlss equivalent
Until they have it, even a 2070 Super with DLSS will run circles around the 6800 XT in RT.
47 vs 35 fps at 1440p
Seriously, releasing RT-capable cards without a DLSS equivalent was pointless.
Even a 3080 needs it for 1440p.
The 2080 Ti needed it for 1080p.
Memory requirements can easily exceed 10GB if devs are allowed to leverage it, i.e. if memory sizes increase next gen, both AMD's and Nvidia's high-end cards will have surpassed 10GB.
RT can suck ram up like a hoover, as can very high res textures. It will be used for max quality if enough cards have it.
I expect the 3080 to have no major issues over the next 2 years with max quality settings but I'm not so sure beyond that.
This is the major appeal of the 3080ti or a 20GB version.
Not only stock games but also game mods can chew through VRAM.
By the time 1440p falls to 60 fps on average, 4K is already in 25-30 fps hell.
I do not play CC games, because they are all locked to 30 fps. They are not fast-paced games, but waiting over 33 ms for the game to react at engine level increases latency and drastically reduces the ability to issue a high number of commands per second.
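The 33 ms figure is just the frame time at a 30 fps lock; a quick sketch of the conversion (the fps values are only examples):

```python
# Frame time in milliseconds for a given fps cap.
# At a 30 fps lock the engine only advances every ~33 ms,
# which is the floor on how fast it can react to input.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

print(frame_time_ms(30))   # ~33.3 ms per frame at a 30 fps lock
print(frame_time_ms(144))  # ~6.9 ms per frame at 144 fps
```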
Sure, you can play Civilization on 4K.
And I'll repeat it for you and anyone else: buying a GPU for 1440p 144 Hz does not mean every game always has to run at 144 fps. In time the card will be too weak and fps will fall to a still-comfortable 90; years later it will be 60.
You made a statement based on a false premise: that someone with a 144 Hz screen will not have a good experience when a game does 60 fps.
And that it is the same as when a 4K game does 25 fps. It is not the same in any way, shape or form. Having less than 50 fps on average is already torture, and no self-respecting PC gamer would willingly accept it by running too high a resolution.
Reality of your statement is:
One can have a good GPU which starts its journey either at 1440p/144 fps or at 4K/60 fps.
Where is that journey in half a year, when heavier games come?
1440p at 120 fps, or 4K at 50 fps.
What about when 1440p can do only 100 fps? 4K is at around 42 fps.
At that point the 1440p owner is still happy, while the 4K guy is reducing game details.
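Those projections roughly follow from pixel-count arithmetic: a sketch assuming fps scales inversely with pixels rendered, which real games only approximate:

```python
# Rough projection: 4K pushes 2.25x the pixels of 1440p, so a GPU
# that holds a given fps at 1440p lands near fps / 2.25 at 4K.
PIXELS_1440P = 2560 * 1440
PIXELS_4K = 3840 * 2160
RATIO = PIXELS_4K / PIXELS_1440P  # = 2.25

def projected_4k_fps(fps_1440p: float) -> float:
    return fps_1440p / RATIO

for fps in (144, 120, 100):
    print(f"1440p {fps} fps -> roughly {projected_4k_fps(fps):.0f} fps at 4K")
```

The 144/120/100 fps steps map to about 64/53/44 fps this way; the 60/50/42 figures in the post are just slightly better-than-linear real-world scaling.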
And at that point, if you took two 32-inch screens, one 1440p and one 4K, and made an image quality comparison from the same distance, you would say that the 1440p looks better.
In the end, it is the same case as the GTX 1080 which started this discussion. It can get you 4K at 60 fps in AC: Valhalla, but only if you are willing to play at LOW details.
I am pretty sure 1440p on high details wins in image quality on screens of the same size. And I would personally have no problem enjoying the extra-fluid motion at 1080p thanks to even higher fps.
You know, 4K can deliver higher image quality than 1440p, but that requires proper use of high-detail textures and high-precision shaders - the direct opposite of what you get when you turn details from ultra down to medium/low.
10GB is just allocation; only a fraction of that data is actually touched by the GPU each frame.
The RX 6800 (XT) can actively use around 7GB of data from VRAM before its fps falls under 60.
Sure, 9GB of extra cache is nice and will prevent stutter. But the higher the real VRAM usage, the worse the fps on current RDNA2 cards.
AMD-sponsored titles will not push real VRAM use above 7GB, no matter how much spare VRAM the cards have or how much allocation the consoles get.
If anyone pushes extra VRAM usage, it will be Nvidia. And I doubt they are KO'ing all 8GB cards anytime soon.
6GB, sure, that may happen within 2 years. But which card with 6GB of VRAM is really powerful? And if it is powerful, will reducing the one detail type that eats the most VRAM be sufficient?
Dynamic shadows are quite costly in bandwidth and occupy some extra VRAM. Maybe the ultra texture pack is not exactly the way to go on an older GPU either.
But when we talk about cards with 10GB of VRAM, I would talk bandwidth first. That's a more pressing parameter than capacity, especially since Turing has Sampler Feedback, as does RDNA2. New games that use more VRAM will implement the available methods, so they will not choke cards with 8GB.
Recent history says that 4GB of VRAM is not enough in some cases, which is fixed on affected cards by simply reducing a detail or two. But those cards do not have the power to deliver good fps at those details anyway.
We can start worrying about cards with 8GB of VRAM when this applies to cards with 6GB.
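The allocation-versus-use point can be put in rough numbers: at a given fps, memory bandwidth caps how much VRAM data the GPU can even touch per frame. A back-of-the-envelope sketch (760 GB/s is the 3080's rated bandwidth; real workloads reread the same data many times, so the actually-resident working set is well below this ceiling):

```python
# Ceiling on VRAM data touched per frame, from rated memory bandwidth.
# Real access patterns reread data, so actual working sets are smaller.
def max_gb_touched_per_frame(bandwidth_gb_per_s: float, fps: float) -> float:
    return bandwidth_gb_per_s / fps

print(max_gb_touched_per_frame(760, 60))   # ~12.7 GB ceiling at 60 fps
print(max_gb_touched_per_frame(760, 144))  # ~5.3 GB ceiling at 144 fps
```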
DLSS 2.0 pretty much doubles my fps in Control. I can go from the game completely maxed, including RT, at 1440p running at 30-40 fps, to 60-70 fps with DLSS 2.0 and the same settings. Dropping RT a notch and a setting here and there, the game now runs at 100 fps no problem and looks amazing.
I agree AMD need a DLSS alternative. I'll give them this, though: AMD's RT performance is slightly better than I expected it to be. Not bad for a first try. The problem is, Nvidia now have a lead here, having had a previous generation to iron things out, and like I've said many times in the past, DLSS 2.0 is a game changer.
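The roughly-doubled fps follows from DLSS rendering internally at a lower resolution before upscaling. A sketch using the commonly cited DLSS 2.0 scale factors (Quality renders at 2/3 per axis, Performance at 1/2; treat these factors as assumptions, and note fps gains track shaded-pixel savings only loosely):

```python
# Shaded-pixel savings for DLSS 2.0 modes at a 1440p output resolution.
def internal_pixels(out_w: int, out_h: int, axis_scale: float) -> int:
    return round(out_w * axis_scale) * round(out_h * axis_scale)

native = 2560 * 1440
quality = internal_pixels(2560, 1440, 2 / 3)      # ~1707x960 internal
performance = internal_pixels(2560, 1440, 1 / 2)  # 1280x720 internal

print(f"Quality mode shades {native / quality:.2f}x fewer pixels")          # ~2.25x
print(f"Performance mode shades {native / performance:.0f}x fewer pixels")  # 4x
```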
Question. Are you talking 4k resolution in a years time or so? If so...
Do you think a 3080 would be capable of pushing over 60fps in 4k at max settings even if it had 32GB VRAM?
My guess is no chance.
I did not say that, but now that you mention it, the only way the 3080 will become irrelevant is when those 10GB of VRAM won't be enough for 4K... and I think that will be soon, too.
Also, running games at 4K Ultra now does not mean you will be two years from now, when the Ultra preset will be even more demanding. Unless you downgrade the eye candy or lower the resolution, 4K will not be sustainable for many years without a short upgrade cycle (2 years max).
Depends on the game.
G-Sync/VRR is commonplace on DP and some HDMI displays, so sub-60 fps has its place too.