Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 16, 2020.
This kind of GPU is a pipe dream in my country these days. It costs, literally, more than a car.
Yeah, I've made peace with the thought that I most likely won't get one before Cyberpunk comes out. So maybe around December I'll get lucky, but only if prices aren't still inflated...
Didn't NVIDIA literally do a presentation about how the 10GB version is fine, by showing that games from the current AAA generation need 6GB+ of VRAM?
There is nothing out for the next gen yet, and the demo that "proves" these GPUs are OK for the next gen actually shows that you already needed more VRAM than the previous generation of cards had, just to play previous-gen AAA titles on PC.
The mind games are amazing here.
Well, they specifically said they asked developers about current and upcoming games. They said the upper limit for current games, maxed out at 4K with RTX on, is 6GB. So presumably next-generation titles would use more, but you have a 4GB buffer, plus the option of not playing at 4K and turning off RTX, I guess.
I just remember when the 980 came out with only 4GB and the Xbox One had 9GB available for games - people were saying the same thing, and here we are, six years later, and games still don't actually seem to use 9GB even at 4K, let alone 1080p.
So idk, I'm going to go ahead and say it's not a problem. If people want to wait for a 20GB card by all means I think they should - but I don't think we're going to be sitting here 4 years from now going "man the 3080 would have been a great card if only it had more VRAM". I don't think it's going to matter.
That really sounds wrong. Have they quoted anyone specific, or was it "GameWorks Joe" with a GPU gun to his head? Games with 4K texture packs don't even fit in 8GB cards at 4K. Imagine adding things like high-resolution shadow maps on top.
With the raw power of the 3080, it should survive the current console generation at higher-than-console settings, so that's a no-no from me.
What I actually remember is people with GPUs with 3, 4 or even 6GB of VRAM constantly complaining about "bad ports" and stuttering and how (correctly) smug everyone who went for the Radeon 300 series was, despite initially people saying that the frame buffer was excessive.
I hope you're right, but seeing how these GPUs will also be used as giant I/O hubs for NVMe, I think that extra VRAM will be even more important than the last gen.
@PrMinisterGR : VRAM as a cache is too expensive. I would rather get extra system RAM as the cache and move the needed data into VRAM as the viewport turns around, while unloading old (no longer needed) data back to system memory.
You can do this at a faster rate than the console's storage.
And as far as 48GB of VRAM goes: it will not help with a game built around the console's fast NVMe if the console-to-PC port doesn't implement PC-style caching, unless you have a PCIe 4.0 x4 NVMe drive running at full speed.
If you look at it this way, developers will be forced to use the Windows PC caching system anyway, because they want to sell those games. Sure, the minimum requirement will be an SSD, and the recommended one NVMe.
But no console port will have problems based on available VRAM as long as cards have 8GB of VRAM and the system has 16GB of RAM.
10GB of VRAM + 32GB of RAM is absolutely safe even when a lazy console port comes along, because even a lazy developer gets this particular job right; otherwise his game tanks in PC reviews and he makes no sales.
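The RAM-as-staging-cache idea above can be sketched as a simple least-recently-used residency manager: keep the working set in VRAM and "unload" the stalest assets back to system memory when the budget is exceeded. This is only a toy illustration of the concept; the class name, asset names, and sizes are all made up.

```python
# Toy sketch of streaming assets between system RAM and a fixed VRAM
# budget, evicting least-recently-used data as the viewport moves.
from collections import OrderedDict

class VramCache:
    def __init__(self, capacity_mib):
        self.capacity = capacity_mib
        self.used = 0
        self.resident = OrderedDict()  # asset name -> size in MiB

    def request(self, name, size_mib):
        """Ensure an asset is resident in VRAM, evicting LRU assets to make room."""
        if name in self.resident:
            self.resident.move_to_end(name)    # mark as recently used
            return
        while self.used + size_mib > self.capacity and self.resident:
            _, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size          # "copy back" to system RAM
        self.resident[name] = size_mib
        self.used += size_mib

cache = VramCache(capacity_mib=10_240)         # a 10 GB card
cache.request("city_block_a", 4_000)
cache.request("city_block_b", 4_000)
cache.request("city_block_c", 4_000)           # evicts city_block_a
print(list(cache.resident))                    # ['city_block_b', 'city_block_c']
```

A real engine streams at sub-texture granularity and overlaps the copies with rendering, but the budget-plus-eviction loop is the same shape.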
- - - -
I have no idea why people torture themselves over this for so long. It has been discussed 50 times over by now.
This bridge will not be crossed before we get to it.
Maybe when MS releases an Xbox Series XT with a higher-clocked GPU and a full 20GB of higher-clocked GDDR6, it will bring the ability to make games with a real advantage on console.
And as I wrote before, had MS given this console 20GB, I would have seriously considered it.
They didn't quote anyone, they just listed off a bunch of current gen 4K titles with RTX and said that the maximum amount used in those games was 6GB. Presumably they asked the developers of those titles about their future ones.
I don't really care for the 4K texture argument. I had a 980 4GB until I got my 1080Ti, and I played Skyrim with modded 4K texture packs; 90% of them were uncompressed - simply running one of the texture optimizer tools cut the VRAM usage by 3/4. I would even argue that you could reduce 80% of the textures in those packs to 2K and in a blind A/B test no one would know the difference. Landscape textures and such? Sure, I love 4K there, but the top of a basket doesn't need a 400+MB texture.
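The "cut by 3/4" figure above lines up with simple texture math: uncompressed RGBA8 stores 4 bytes per texel, while a block-compressed format like BC7 stores 1 byte per texel, a 4:1 saving before you even touch the resolution. A quick back-of-the-envelope sketch (the numbers are illustrative, not measurements from Skyrim):

```python
# Back-of-the-envelope texture memory math.

def texture_bytes(side, bytes_per_texel, mips=True):
    """VRAM footprint of a square texture; a full mip chain adds ~1/3 on top."""
    base = side * side * bytes_per_texel
    return base * 4 // 3 if mips else base

MIB = 1024 * 1024

# Uncompressed RGBA8: 4 bytes per texel.
print(texture_bytes(4096, 4) // MIB)   # ~85 MiB with mips
# Block-compressed BC7: 1 byte per texel, i.e. the "cut by 3/4" effect.
print(texture_bytes(4096, 1) // MIB)   # ~21 MiB
# Dropping to 2K halves each dimension: a further 4x saving.
print(texture_bytes(2048, 1) // MIB)   # ~5 MiB
```

Compress and downsize the 80% of textures nobody looks at closely and a multi-gigabyte texture pack shrinks dramatically.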
I had a 980 with 4GB and I don't remember stuttering due to VRAM. Perhaps it happened after I got rid of it for the 1080Ti but I never had any issues playing games at 1080p.
Going through this, in more modern titles it seems like the 980 has better averages and lows than the 390x. I'm sure there are now some specific titles where it would negatively impact the 980 but I'm not sure I'd want to play any of those titles on either card.
FWIW, modern games will also probably make use of DX12's newly added Sampler Feedback system, which allows for a ton of VRAM optimization techniques - especially in conjunction with all the GPUDirect stuff.
I guess we'll see in 2-3 years.
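For a sense of why Sampler Feedback matters for VRAM: it lets the GPU report which mip levels were actually sampled, so the engine can keep only those resident instead of the whole chain. A toy model of the potential saving (illustrative numbers, not from any real title or the D3D12 API itself):

```python
# Toy model: VRAM cost of a full mip chain vs. only the mips actually
# sampled for a distant object, for a BC-compressed 4K texture.

def mip_chain_mib(side, bytes_per_texel=1):
    """Per-mip sizes in MiB, from mip 0 down to 1x1 (1 byte/texel, e.g. BC7)."""
    sizes = []
    while side >= 1:
        sizes.append(side * side * bytes_per_texel / (1024 * 1024))
        side //= 2
    return sizes

chain = mip_chain_mib(4096)
full = sum(chain)            # everything resident
distant = sum(chain[3:])     # far-away object only needs mip 3 and coarser
print(round(full, 1), round(distant, 2))  # 21.3 vs 0.33 MiB
```

Almost the entire cost sits in the top two or three mips, which is exactly what feedback-driven streaming avoids loading until it is needed.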
Here is a better video showing how a card with less VRAM ages. It is not nothing. At release the cards were exactly the same; just check now:
Some games it's nothing, some it's huge. It's the same for the 1060 3GB vs the 1060 6GB (3GB vs 4GB is also quite a difference).
6GB will be the minimum for 1080p very, very soon.
I posted the video to address his specific point about the 980. Probably about two years ago I'd have said I wouldn't recommend under 6GB. My problem, though, is you say 6GB will be the minimum soon - implying it's not already the minimum. With that in mind, I can't find a single game that uses more than 25% more VRAM at 4K than it does at 1080p (allocation, not usage), and that includes VRAM-heavy games like FSX/Doom. So even if 6GB becomes the minimum soon for 1080p, the minimum for 4K isn't going to be that much higher - maybe 8. Even then you'd have to separate allocation from usage. AMD has a slide from the Radeon VII launch that shows multiple titles eating up 11GB of VRAM, but we know those titles perform fine on cards with far less.
I also want to clarify: if AMD releases a 16GB card that performs near-identically to a 3080 for the same price, same features, etc., obviously I'd recommend going with the 16GB card for longevity's sake. If Nvidia released a 3080 with 20GB of VRAM for only $50 more, I'd say go with the 20GB - again, especially if you plan on keeping the card for 5+ years. I think if you already upgrade your GPU at least every two generations, the 10GB will be fine. I think if you're paying $700 more for a 3090 because of VRAM and you're not some kind of content creator, it's a massive waste of money. You're never going to see a benefit from that VRAM, and you'd be better off putting that money aside, buying a 3080 for $700, then buying a 4080 for $700 in two years.
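The 25% figure from the post above makes the 4K math easy to check. Treating it as an observed rule of thumb (not a guarantee), a 6GB-at-1080p game lands well under a 10GB card at 4K:

```python
# Rough check of the "at most ~25% more VRAM at 4K than at 1080p" claim.
# The 25% scaling factor and the 6 GB baseline come from the post above;
# they are rules of thumb, not guarantees.

def vram_at_4k(vram_1080p_gb, scaling=0.25):
    """Framebuffers scale with pixel count, but most assets (textures,
    geometry) are resolution-independent, so total VRAM grows far less
    than the 4x jump in pixels would suggest."""
    return vram_1080p_gb * (1 + scaling)

print(vram_at_4k(6.0))   # 7.5 GB -- still comfortably under 10 GB
```

Even an 8GB-at-1080p title would project to 10GB at 4K under the same assumption, which is the boundary case the whole thread is arguing about.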