Discussion in 'Videocards - NVIDIA GeForce' started by SerotoNiN, Sep 4, 2020.
Yeah seriously totally not worth upgrading from a 2080Ti, I'm keeping mine until next gen hits.
Next week, didn't you get the memo?
I guess it depends on the situation? 1080p or 1440p/100Hz players probably don't need the upgrade but 4K players would benefit from it.
I can personally finally play RDR2 locked at 4K/60fps, and playing Control yesterday, maxed out with full RT, was a blast; it was hitting my refresh rate at 3440x1440 most of the time. I'm also looking forward to installing the Horizon Zero Dawn DLC2 and playing it at 4K/60fps with HDR enabled, which was impossible with the 2080Ti.
I downloaded Quake 2 RTX, which will be the ultimate test.
I suspect it is you who is clueless. They can hardly fill the demand for the current cards, let alone for a refresh in the next six months.
Do they have a 20GB model already on the table? I don't have a crystal ball, but chances are they indeed do. From a business perspective, though, it would be totally senseless to bring out a new version so soon after the current release.
Btw, those games run fantastically at 4K, and benchmarks already show that the 3090, with 24GB(!), is only roughly 10-20% faster.
Funny how overhyped some people get: when a template for a videocard variant leaks, they go out of their minds with speculation.
You are the clown who was acting like it would be a year before we see these 20GB cards. Get it through your head that these cards were already on the roadmaps and leaked slides that have been around for weeks, sometimes months, so they are going to be made sooner rather than later. And it's clear you can't even comprehend something as simple as the Watch Dogs Legion promotion documentation itself mentioning the 20GB 3080. But please, keep making a fool of yourself with your ignorant comments.
I think for me it's enough for the next two years, so why should I pay more for a 20GB RTX 3080 Ti or something? The gaming industry will keep targeting the most common amount of VRAM so it can sell as many games as possible; they aren't going to raise the VRAM baseline games use right now. It's a matter of time, and we'll learn, I guess.
Seems to have plenty of VRAM.
For most people and most games, yes.
But I personally don't think it has enough either.
That's why I'm dropping it as soon as the 3080Ti is released.
A perfect example of a VRAM hog is Cold War. If I alt-tab with ultra textures (which max out VRAM usage) and then alt-tab back in, the game runs at <10fps for up to 5-6 seconds.
VRAM usage drops when going to the desktop but pools up again upon re-entering the game.
If I go to high or medium textures, the FPS drop doesn't happen.
Good luck getting your hands on a 3080Ti.
Warzone is the only game so far where 10GB of VRAM isn't enough for 4K. I have to reduce texture quality to low, otherwise my performance degrades and hitches after long gameplay.
I'm not sure how NVIDIA is going to implement their SAM support, but on Navi 21 cards VRAM usage rises by 1GB+, which could be a problem for the 10GB version of the RTX 3080.
At this point I doubt we'll ever see a 3080Ti, and if we do, it will be so hard to get, and likely so "late in the game", that the next generation will be around the corner. As if the shortages from the known causes weren't enough, the new crypto boom drove the last nail into the coffin.
10GB would be plenty for Warzone if it weren't broken, either by bad code or by tools injected into it (the jury is still out on which).
It is not supposed to use 98% of the VRAM and then begin filling system RAM.
Regarding what I said to memorian just now: I have seen this title consume all the VRAM on the 5700 XT, and no doubt it'll trigger the same bug on 16GB 6800 XTs (under the conditions where it reproduces). Has anyone in the vanguard Discord brought it up?
The map is huge, and it needs 12+GB for high textures at 4K. 3090s or 6800/6900 XTs run the game super smooth. My 2080Ti was fine with Normal textures; the 3080 needs Low to be perfect.
MW needs 8.6GB (at 4K, WITH RT).
The lack of RT in Warzone significantly reduces VRAM requirements; the usage you've seen on cards with more VRAM is just VRAM getting filled with texture data that is failing to be invalidated.
The 3090 ends up stuttering in the same way, it just takes a few more rounds to get there.
The value on this screen is how much the game should use, up to that maximum, when the default memory scale of 0.85 is used.
The VRAM leak results in the game filling VRAM up to and beyond the maximum until system RAM is fully consumed too; multiple forum and Reddit posts have shown users setting this well below the maximum while utilities still report the entire VRAM consumed, with stuttering.
Were the game working properly, textures would be unloaded/reloaded by a texture invalidation and purging system built around the 0.85 scale; the game is only supposed to use a fraction (less than 100%) of total VRAM, but that is not what is seen.
Both wind up with the same issue as the 6, 8 and 12GB cards.
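For anyone unclear on what a memory-scale budget with texture purging is supposed to do, here's a rough sketch in Python. This is NOT the game's actual code; the class, names and sizes are made up purely to illustrate the idea: residency is capped at scale * total VRAM, and least-recently-used textures get evicted before a new load can push usage past the budget.

```python
# Hypothetical illustration of a memory-scale VRAM budget with LRU purging.
# Not real game code; names and numbers are invented for the example.
from collections import OrderedDict

class TextureCache:
    def __init__(self, total_vram_mb, memory_scale=0.85):
        # Budget is a fraction of total VRAM, like the 0.85 scale discussed above.
        self.budget_mb = total_vram_mb * memory_scale
        self.used_mb = 0.0
        self.textures = OrderedDict()  # name -> size in MB, kept in LRU order

    def load(self, name, size_mb):
        if name in self.textures:
            self.textures.move_to_end(name)  # already resident: mark recently used
            return
        # Purge least-recently-used textures until the new one fits in the budget.
        while self.used_mb + size_mb > self.budget_mb and self.textures:
            _, evicted_mb = self.textures.popitem(last=False)
            self.used_mb -= evicted_mb
        self.textures[name] = size_mb
        self.used_mb += size_mb

cache = TextureCache(total_vram_mb=10240)  # e.g. a 10GB 3080
for i in range(40):
    cache.load(f"tex_{i}", 512)  # stream in more textures than the budget holds
# Usage stays at or below the budget instead of spilling past 100% of VRAM:
print(cache.used_mb <= cache.budget_mb)  # True
```

With working invalidation, usage plateaus at the budget and old textures get purged; the leak being described is exactly this purge step failing, so usage climbs past the cap and into system RAM.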
Warzone supports RT. If you reduce the memory scale to 0.55 it is fine, but some textures look blurry because it needs more VRAM. I don't think 3090s suffer from this issue; if there were a memory leak, it would start to stutter even with low textures after some rounds, but it doesn't.
No, it doesn't.
3090s are affected.
It literally does, though. I've got it enabled in Warzone.
No, it doesn't;
the setting is specifically for MW multiplayer,
and Warzone (Battle Royale) doesn't have any DXR effects.
The framerate speaks for itself loudly enough, but there's no visible difference whatsoever, apart from the lighting engine momentarily caching itself in the distance.
I agree; it's pretty much the glaring flaw of Nvidia's cards right now. For 4K it's just not enough (maybe even for 1440p nowadays, depending on the game; Cyberpunk seems to gobble VRAM).