Discussion in 'Videocards - NVIDIA GeForce' started by SerotoNiN, Sep 4, 2020.
You're offending people in a discussion about toys.
Go back to kindergarten, maybe.
No, I did not lie.
I said I used the highest preset and never had a warning or crash/freeze.
There is no lie in this.
You then replied with information about texture streaming, which I'd ignored because it's clear you don't know how to talk to other human beings.
You need to be disciplined, either by the mods or your dad.
The best way to deal with toyota is to state your factual piece and then ignore his next several posts. It's just some angry kid who doesn't know that Windows reports video memory in several different ways.
RTSS measures the total GPU memory presently committed, not the current process's dedicated usage, meaning it's the game + web browser + DWM + anything else using VRAM on the machine.
Notice Warframe isn't even using 1 GB of VRAM in this screenshot.
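To illustrate that distinction (the numbers below are made up purely for illustration): a device-wide counter sums every VRAM consumer on the machine, so it can sit gigabytes above what the game itself has committed. A minimal Python sketch:

```python
# Hypothetical per-process VRAM commitments, in MB. A device-wide readout
# (what RTSS shows by default) is the sum of all of these, not just the game.
vram_committed_mb = {
    "Warframe.exe": 900,   # the game itself
    "chrome.exe": 1400,    # web browser with many tabs open
    "dwm.exe": 350,        # Windows desktop compositor
    "misc_drivers": 250,   # other system consumers
}

# Device-wide readout: total committed VRAM across every process.
device_wide_mb = sum(vram_committed_mb.values())

# Per-process readout: only the game's own dedicated usage.
game_only_mb = vram_committed_mb["Warframe.exe"]

print(f"Device-wide (RTSS-style): {device_wide_mb} MB")
print(f"Game's own usage:         {game_only_mb} MB")
```

With these example numbers the device-wide figure is more than three times the game's own footprint, which is exactly why the two readouts disagree.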
Oh, nice try. I clearly said all settings fully maxed, and even went out of my way to mention the texture streaming. You then replied with "not true from my own experience", so yes, that makes you a liar. You probably went and looked at your settings, since I mentioned that setting has to be manually adjusted to uber, otherwise it's only on ultra, which is not a problem. Instead of being a man and admitting that you did not have it on uber, you come back and pretend that you said "highest preset".
Why are you so angry? I am merely relaying their information. And no, I will not buy a game to test their claim; that is a reviewer's job, not mine.
I think the 3080 is very good for what it is at $700, with 10 GB of G6X and a massive performance increase over Turing.
Sure, we'd all like 20 GB,
but I wouldn't pay $800-900 for just that. Simply not worth the returns.
Should be quite obvious if you have a functioning brain... "Mem" right under GPU is the GPU memory, and it is at 9175 MB used.
Sorry, I meant that more in a general way, aimed at anybody passing that NVIDIA quote off as fact that those games only use 4 to 6 GB of VRAM. I guess I am just sick of seeing that nonsense pop up in pretty much every discussion on the web about VRAM for Ampere.
Could be, but in the case of Shadow of the Tomb Raider, it isn't.
I've tried playing it on a 2080 at the same settings as my 2080 Ti, and it stuttered like crazy... then I turned down the texture setting to get below 8 GB of VRAM usage (as that is what the 2080 has), and the stuttering was gone.
So it's not always "just caching", and in fact, with the newest, most visually impressive games, it almost always isn't.
thank you for the clarification.
Well, guess what, ding dong: when you run a game, even with nothing else open, the VRAM usage is pretty much the same. VRAM will get released from other things if need be and go to the game. I can play a game on a clean restart with NOTHING else open, or with lots of tabs and whatever else, and the VRAM usage shown is practically the same.
Yep, Digital Foundry and AdoredTV both talked about the 2080 stuttering in Rise of the Tomb Raider due to having only 8 GB, whereas the 1080 Ti was perfectly fine in the same testing.
There is no evidence in any DXR testing, 4K or otherwise, that supports your claims.
Not even in the reviews on this site.
Maybe try not playing games with web browsers in the background leeching video memory.
I remember that. It was Rise of the Tomb Raider, which actually uses even more VRAM than Shadow. The 2080 was a stuttering mess while the 1080 Ti and Radeon VII were doing fine.
Nice, now find an example on 10 GB Ampere, not on 8 GB Turing.
ROTTR was actually a bug where VRAM data wasn't being invalidated and flushed; the current patch version should not have any issues whatsoever.
What a load of crap. I just fired up Star Wars Jedi: Fallen Order with nothing else open, and VRAM usage was practically identical to when I fired up the game with two Chrome windows with over 50 tabs open.
From Hilbert's review: "If you want to play Ultra quality with Ultra HD as preferred monitor resolution, 8GB+ is advised."
You don't see it in benchmarks, as barely anyone does deep dives on the frame-time numbers, and as the fps still looks good on the 2080 with maxed settings at 4K... no reported problems. The reality, though, is that it stutters with ultra textures on a 2080 at 4K, and not with high textures.
You really are reaching hard, aren't you? Where the hell have I said that 10 GB is not enough for current games? I only mentioned 8 GB, and mentioned an example where that is an issue even at 1440p. That just has me worried about some future games and 10 GB, since this is being marketed as a 4K card.
Yeah, testing on various games, and on the methods by which they utilize memory, would be needed to get a more complete answer, or a definitive overview. Plus you have some PC ports with issues around leakage or buildup over time, and over-allocation, plus letting developers set a target and hoping D3D is smart enough to use what's available.
(From memory, Deus Ex: Mankind Divided, before it was patched, asked for 512 GB, and DirectX then utilized as much as the GPU actually reported as available VRAM.)
Caching around 80-85% is common too, and then there's how this gets reported: total allocated memory, and then what's RAM and what's VRAM compared to what's actually used.
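As a rough sketch of what that 80-85% caching behaviour might look like in practice: an engine could cap its texture cache at a fraction of total VRAM and leave the rest as headroom. The function, the 80% fraction, and the OS-reserve figure below are all illustrative assumptions, not taken from any particular engine:

```python
# Hypothetical streaming-budget rule: cap the texture cache at ~80% of the
# VRAM left after an assumed OS/compositor reserve. Purely illustrative.
def streaming_budget_mb(total_vram_mb: int, cache_fraction: float = 0.80,
                        os_reserved_mb: int = 512) -> int:
    """Return the VRAM (MB) an engine might hand to its texture cache."""
    usable = total_vram_mb - os_reserved_mb  # leave headroom for DWM/OS
    return int(usable * cache_fraction)

# 2080-, 3080-, and 1080 Ti-class memory sizes, in MB.
for card_mb in (8192, 10240, 11264):
    print(card_mb, "->", streaming_budget_mb(card_mb), "MB cache")
```

So under these assumptions an 8 GB card gives the cache roughly 6 GB, a 10 GB card roughly 7.6 GB, which is one reason the "used" figure so rarely matches the card's full capacity.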
Unwinder has discussed this a number of times, as far as I remember, when it gets asked about in the Afterburner and RivaTuner forums here. Due to the program's popularity it would be nice to have both measurement modes available, but as Astyanax mentioned, few programs give that option unless you use a more in-depth monitoring suite, even one that's more for analytical or development purposes.
(Radeon GPU Profiler and stuff of that nature.)
You notice when the memory runs out, at least: stuttering and hitching become very pronounced and the game just bogs down. Leakage in particular can then start spilling into RAM or even the page file before the game errors out or crashes, or the OS itself warns about memory getting low if it gets really bad.
Framerate just about doubled, instead of the expected 30% gain, going from an AMD Fury to a Vega some years ago, because that 4 GB of memory was starting to become a real bottleneck in some of the newer games without dropping some settings.
Still pretty sure 8 GB will last a while yet in most situations, though some of the ultra settings or the occasional optional texture pack might come close, even before measuring what the VRAM utilization hits at 3840x2160 or higher resolutions.
Also expecting NVIDIA to provide, or at least to allow third-party, 3080 models with up to 20 GB later on, in addition to a Ti-type model some months after the 3080 itself, which could also have a higher memory total from the get-go.
Should do some testing in a few games myself too and see what I can get up to. Should be around 4-5 GB for Borderlands 3 with the settings I use, and then maybe 6-7 GB by forcing streaming off and requiring constant highest texture quality.
(Various tweaks, so a bit of an extreme case, along with Monster Hunter World, which uses D3D12 to allocate as much as possible but hits issues if you run out and the game loads in a lot at once.)
EDIT: The allocated memory is interesting as well, seeing games like Flight Simulator have almost the entire GPU memory budget reserved while actual usage is significantly less due to its streaming system.
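That reserved-versus-resident gap can be modelled with a toy allocator: the engine claims nearly the whole budget up front (which is what "allocated" counters report), while only the data it actually needs is resident. The class and numbers below are a hypothetical sketch, not Flight Simulator's actual allocator:

```python
# Toy model of a streaming allocator that reserves a large VRAM budget up
# front but only makes a fraction of it resident. Illustrative only.
class TextureStreamer:
    def __init__(self, budget_mb: int):
        self.reserved_mb = budget_mb  # what tools report as "allocated"
        self.resident_mb = 0          # what is actually in use

    def stream_in(self, size_mb: int) -> bool:
        """Stream data in; refuse if it would exceed the reserved budget."""
        if self.resident_mb + size_mb > self.reserved_mb:
            return False  # over budget: the engine must evict or refuse
        self.resident_mb += size_mb
        return True

s = TextureStreamer(budget_mb=9000)  # reserve almost the whole card
s.stream_in(2500)                    # but only stream in what's needed now
print(s.reserved_mb, s.resident_mb)  # reserved far exceeds resident
```

A monitoring tool that reads the reservation sees 9000 MB "used" here, even though only 2500 MB is actually doing work, which matches the over-reporting described above.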
Wouldn't put it past some PC games, or even game engines, to get into problems here either, over-allocating or hitting other issues even without it being a buildup or leakage problem, such as with Horizon Zero Dawn, even after its latest patches, when it performs shader caching; on exit and restart of the game it should work fine.
EDIT: Oh, and the opposite issue as well: many games or game engines limit memory utilization even when there's more available, so you get pop-in and streaming going on, with the game preferring 512-2048 MB even when there's much more free memory that goes unused instead.
(Improving, though; I think Divinity: Original Sin 2 was still sitting around the 512 MB texture-cache amount, but more games are moving up to the 3-4 GB range if needed.)
Use it when it's available but don't waste it either.
Wonder if that's going to change as developers come to terms with the higher amount of memory they can now target as a lowest baseline with the console hardware change, too.
There's still shared memory, and it doesn't translate directly to PC, but the higher available amount could see some things change, plus other hardware differences as well, even if that won't immediately translate over to changes on PC systems.