NVIDIA Could Release RTX 3080 20GB and RTX 3070 16GB in December

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 8, 2020.

  1. TheSissyOfFremont

    TheSissyOfFremont Master Guru

    Messages:
    252
    Likes Received:
    112
    GPU:
    3090 FE
    It seems like a mental situation to me.

    Either 10GB is enough and doubling that to 20GB is absurd overkill (and it will significantly raise the price)

    Or

    10GB isn't enough and Nvidia launched a flagship card without enough VRAM - either cynically or incompetently.

    I mean, is there really any evidence that we're going to need a doubling of VRAM in the next 2-3 years? Game/GFX professionals weigh in here...

    The things that could cause that are (to my layman's understanding): textures, resolution, and general asset quality/density?

    I guess we will continue to see moderate increases in texture and asset quality to match the standardisation of 4K as the high-quality resolution for PC and consoles, as well as the continued increase in the density of assets in environments.

    How much does VRAM volume affect RT? Not all that much, I would have thought?

    Is this something that will allow DirectStorage to do something it otherwise wouldn't be able to?

    I'm really curious, because I feel like the next 5-10 years are going to see some big leaps on the technical development side of games. But I had assumed that was going to necessitate architectural innovation and significant raw power increases rather than memory expansion (as a priority, anyway).
     
    Last edited: Oct 11, 2020
  2. UZ7

    UZ7 Ancient Guru

    Messages:
    5,526
    Likes Received:
    67
    GPU:
    GB Vision 3080 10GB
    Well, you have to consider the added cost for the extra 10GB, and whether it's just a VRAM upgrade or whether more SMs will be unlocked with it. Then you have to consider the resolution you play at and the games you play. Of course, once you go past 1440p you will see an increase in VRAM usage.

    Some games use the extra VRAM as a cache or for "might need it later" data, artificially inflating apparent use. Some people mod their games, which requires more VRAM. Then there's the notion that, because the consoles have 16GB, future games/ports may need it - another one of those "I may need it" situations. But if we look at the previous gen, with 8GB as the norm and 11GB on the 2080 Ti, and at games by the resolution used, how many actually ate up all their VRAM in a "need it to run" scenario?

    In the end it will be interesting to see how new games develop with more resources available.
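    To make the "reported vs. actually needed" distinction concrete: on Windows you can ask DXGI what the OS has budgeted for your process versus what the process has committed. A minimal sketch, assuming adapter 0 is the discrete GPU; error handling omitted:

    Code:
    // Minimal sketch: process VRAM usage vs. the OS-granted budget via DXGI.
    // CurrentUsage counts everything the game has allocated, caches included,
    // which is why reported numbers can sit well above what's strictly needed.
    #include <dxgi1_4.h>
    #include <wrl/client.h>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    int main()
    {
        ComPtr<IDXGIFactory4> factory;
        CreateDXGIFactory1(IID_PPV_ARGS(&factory));

        ComPtr<IDXGIAdapter> adapter;
        factory->EnumAdapters(0, &adapter);   // assumption: GPU 0 is the dGPU

        ComPtr<IDXGIAdapter3> adapter3;       // QueryVideoMemoryInfo lives here
        adapter.As(&adapter3);

        DXGI_QUERY_VIDEO_MEMORY_INFO info{};
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

        std::printf("VRAM in use:  %llu MB\n", info.CurrentUsage / (1024ull * 1024ull));
        std::printf("VRAM budget:  %llu MB\n", info.Budget / (1024ull * 1024ull));
        return 0;
    }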
     
  3. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,079
    Likes Received:
    914
    GPU:
    Inno3D RTX 3090
    If it were a caching mechanism only, it would fill a larger part of the VRAM. The mental gymnastics around this are only proof of NVIDIA's marketing, seriously.
     
  4. Dribble

    Dribble Master Guru

    Messages:
    294
    Likes Received:
    117
    GPU:
    Geforce 1070
    For games today I think 10GB will probably be fine - by the time you need 20GB, the card will be too slow to use those settings anyway. In fact, it wouldn't surprise me if they need to run the 20GB card at looser memory timings, so it could actually be ever so slightly slower sometimes.

    That said, something new could really require 20GB. For example, I could see how the new tech giving the GPU direct storage access means we end up storing more on the card: we'd probably want to keep all the textures/models in VRAM instead of staging them in main memory and only sending what's needed at a particular moment to the GPU. I suppose that extra memory is a bet on the future - is it worth the cost trade-off? If it were cheap/free, sure, but if they charge several hundred dollars I doubt it's worth it.
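    For what it's worth, the PC DirectStorage API Microsoft later shipped (dstorage.h) makes this workflow look roughly like the sketch below - the device, destination buffer, fence, and the "assets.bin" file name are all assumed for illustration:

    Code:
    // Rough sketch of one DirectStorage read straight into a GPU buffer.
    // Device, buffer and fence creation are assumed to happen elsewhere.
    #include <dstorage.h>
    #include <d3d12.h>
    #include <wrl/client.h>

    using Microsoft::WRL::ComPtr;

    void LoadAssetDirect(ID3D12Device* device,
                         ID3D12Resource* destBuffer,  // pre-created GPU buffer
                         ID3D12Fence* fence,
                         UINT64 fenceValue,
                         UINT32 byteCount)
    {
        ComPtr<IDStorageFactory> factory;
        DStorageGetFactory(IID_PPV_ARGS(&factory));

        ComPtr<IDStorageFile> file;
        factory->OpenFile(L"assets.bin", IID_PPV_ARGS(&file)); // hypothetical pack

        DSTORAGE_QUEUE_DESC queueDesc{};
        queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
        queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
        queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
        queueDesc.Device     = device;

        ComPtr<IDStorageQueue> queue;
        factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

        // One request: read byteCount bytes from offset 0 into the GPU buffer.
        DSTORAGE_REQUEST request{};
        request.Options.SourceType      = DSTORAGE_REQUEST_SOURCE_FILE;
        request.Options.DestinationType = DSTORAGE_REQUEST_DESTINATION_BUFFER;
        request.Source.File.Source      = file.Get();
        request.Source.File.Offset      = 0;
        request.Source.File.Size        = byteCount;
        request.Destination.Buffer.Resource = destBuffer;
        request.Destination.Buffer.Offset   = 0;
        request.Destination.Buffer.Size     = byteCount;
        request.UncompressedSize            = byteCount;

        queue->EnqueueRequest(&request);
        queue->EnqueueSignal(fence, fenceValue); // signals when the data has landed
        queue->Submit();
    }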
     

  5. Denial

    Denial Ancient Guru

    Messages:
    13,972
    Likes Received:
    3,729
    GPU:
    EVGA RTX 3080
    I don't know how you can say it's part of Nvidia's marketing. I don't recall Nvidia ever marketing VRAM as a big thing or talking about cache amounts, etc. - other than when people asked about this specific card, the 3080, they never really mention it.

    That being said, AMD has:

    [Image: AMD slide listing per-game VRAM usage at 4K, including Far Cry 5 at 12.9GB]

    Here is AMD saying Far Cry 5 uses 12.9GB of VRAM. The 3080 has no problem at all playing Far Cry 5 at 4K maxed out. If you look at the 3090, performance falls exactly where you'd expect it, about 10% over a 3080 at 4K. How do you explain that if the game is really using 2.9GB over the 3080's limit? Wouldn't minimum frames massively suffer?

    To take it even further, the same thing applies to all these games and a 2080 at 8GB. The difference between a 2080 and a 2080 Ti in Far Cry 5 at 4K is 27% - it matches every other game between those two cards at 4K, and even matches the TPU average. Yet FC5 supposedly uses 4.9GB over a 2080's VRAM amount; in fact it even goes over the 2080 Ti's 11GB. How is that possible unless it's not actually using that much VRAM? Either these games aren't actually using what AMD and/or VRAM utilities are reporting - they're simply caching stuff in case they need it - or VRAM doesn't actually affect performance in any meaningful way.

    Going back to the "games will use console amounts of VRAM" - let's look at how the Xbox Series X memory is broken down: 16GB of GDDR6 in total, 10GB of it on the fast 560GB/s bus (GPU-optimal) and 6GB at 336GB/s, with 2.5GB of the slower pool reserved for the OS.

    So realistically these consoles, targeting 4K@60fps, are going to have 10GB of VRAM for games and 3.5GB of RAM for CPU-related stuff. The Series S only gets 10GB total, presumably with some portion of that dedicated to system software as well.

    I think you're overstating the need for more than 10GB of VRAM, and I'm sure some people are understating it; realistically the answer lies in between - it's probably fine for most people. I personally think the card has enough VRAM for two generations of 4K gaming. 80% of Steam gamers aren't even at 4K. DX12 is getting/has a bunch of features related to VRAM management.
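    On the DX12 VRAM-management point, the explicit residency controls are one example: the app can hand cold resources back to the OS and page them in again before use. A minimal sketch, assuming the textures already exist as committed resources; error handling omitted:

    Code:
    // Sketch of D3D12 residency management with Evict/MakeResident.
    #include <d3d12.h>
    #include <vector>

    // Hypothetical helper: demote textures we won't touch for a while.
    void TrimColdTextures(ID3D12Device* device,
                          std::vector<ID3D12Pageable*>& coldTextures)
    {
        if (!coldTextures.empty())
        {
            // Evict marks the memory as reclaimable by the OS video memory
            // manager; the resources stay valid and can be restored later.
            device->Evict(static_cast<UINT>(coldTextures.size()),
                          coldTextures.data());
        }
    }

    // Hypothetical helper: page resources back into VRAM before rendering.
    void RestoreForRendering(ID3D12Device* device,
                             std::vector<ID3D12Pageable*>& neededTextures)
    {
        if (!neededTextures.empty())
        {
            device->MakeResident(static_cast<UINT>(neededTextures.size()),
                                 neededTextures.data());
        }
    }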

    The Fury X launched two years after the Xbox One with less than half the VRAM of the console. It still lasted like two generations before it actually mattered.

    I don't think it's an issue and if people do - just wait for a 16 or 20GB card. For me personally, I don't have a 4K monitor and I plan on upgrading next generation anyway, so I'd honestly rather save the money now and just use it towards a 4080 or whatever.
     
    Last edited: Oct 12, 2020
  6. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,079
    Likes Received:
    914
    GPU:
    Inno3D RTX 3090
    The 16GB of space on the Series X is developer-agnostic; it's not really split. So it still is 16GB.

    All the above, including the Nvidia things, just reinforces what we basically agreed on before.

    8GB Console Generation = 8GB GPU
    16GB Console Generation = 16GB GPU

    If you plan on keeping that 3080 for more than a couple of years, that is. Otherwise it will be fine for the first 2-3 years of its life.

    The Fury X's memory was an issue early enough for AMD to say they would "solve" it in the driver when the card launched.
     
  7. Supertribble

    Supertribble Master Guru

    Messages:
    942
    Likes Received:
    159
    GPU:
    Noctua 3070/3080 FE
    It's split insofar as the memory comes in modules of different speeds. My understanding is developers have 3.5GB of slower memory for game data and the remaining 10GB as video memory. I'm not sure how they could exceed the 3.5GB hard limit without the faster memory running at the slower speed.
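    The split falls out of the chip layout - a back-of-envelope sketch using Microsoft's published Series X figures (ten GDDR6 chips at 14Gbps on a 320-bit bus, six of them 2GB and four of them 1GB):

    Code:
    // Back-of-envelope for the Series X memory pools.
    #include <cstdio>

    int main()
    {
        const double perChipBW = 56.0; // GB/s per 32-bit GDDR6 chip @ 14Gbps

        // The first 1GB of every chip interleaves across all ten chips.
        std::printf("fast pool: %2.0f GB at %3.0f GB/s\n",
                    10 * 1.0, 10 * perChipBW);     // 10GB @ 560GB/s

        // The second 1GB exists only on the six 2GB chips.
        std::printf("slow pool: %2.0f GB at %3.0f GB/s\n",
                    6 * 1.0, 6 * perChipBW);       // 6GB @ 336GB/s

        // Of the slow pool, 2.5GB is reserved for the OS, leaving 3.5GB for
        // game CPU/audio data - the "hard limit" discussed above.
        std::printf("game-visible: 10GB fast + %.1fGB slow\n", 6.0 - 2.5);
        return 0;
    }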
     
  8. Chert

    Chert Member Guru

    Messages:
    122
    Likes Received:
    29
    GPU:
    Gigabyte RTX 3060ti
    No post.
     
    Last edited: Mar 14, 2021
