I don't think the 3080 has enough VRAM

Discussion in 'Videocards - NVIDIA GeForce' started by Serotonin, Sep 4, 2020.

  1. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,017
    Likes Received:
    7,353
    GPU:
    GTX 1080ti
MSFS uses free VRAM to cache texture data, but otherwise streams it as you go.

    10GB is sufficient.

    But please, keep saying that you know better than the people who have made graphics cards for 21 years.

     
    Last edited: Sep 4, 2020
    itpro likes this.
  2. A M D BugBear

    A M D BugBear Ancient Guru

    Messages:
    4,394
    Likes Received:
    631
    GPU:
    4 GTX 970/4-Way Sli
The way I look at it is this:

    If they say the 3070 is indeed faster than the 2080 Ti, they should have upped the VRAM to at least 11-12 GB, and the 3080 to something more like the 16 GB range.

    But I did see, in another front-page news thread, that some Lenovo system had a 3070 with 16 GB of VRAM; now we're talking.
     
    scajjr2 likes this.
  3. narukun

    narukun Master Guru

    Messages:
    228
    Likes Received:
    24
    GPU:
    EVGA GTX 970 1561/7700
Just use normal-resolution textures for new games after spending USD $700 on your new card and you're gonna be fine!
     
    Serotonin likes this.
  4. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,413
    Likes Received:
    3,079
    GPU:
    PNY RTX4090
Most games nowadays cache extra data in remaining VRAM for faster loading, less hitching when streaming in data, and less HDD access. This doesn't matter much if you are using an SSD or even an NVMe drive, as these are generally fast enough to stream in new data.

    DirectStorage (RTX IO) will help massively, but games need to support it for it to work.

    10 GB should be enough for a while yet; remember it's GDDR6X as well, so it has much higher bandwidth and speed.

    Though I do see either a 16-20 GB 3080 Ti/Super or a 16-20 GB 3090 variant coming later down the line. I'm expecting AMD RDNA2 to land between the 3070 and 3080 in terms of performance but come with 16 GB GDDR6, and maybe a higher-end 16 GB HBM2 model as well.
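    The caching behaviour described above can be sketched as a toy model (all sizes, texture counts, and access patterns here are made up for illustration): the engine keeps what it is rendering resident and uses leftover VRAM as an LRU cache, so more VRAM mostly means fewer trips back to storage, not a hard pass/fail.

    ```python
    from collections import OrderedDict

    # Toy streaming model (all numbers hypothetical): free VRAM acts as
    # an LRU cache over textures, so revisited areas avoid disk reads.
    class TextureCache:
        def __init__(self, vram_mb):
            self.capacity = vram_mb
            self.used = 0
            self.cache = OrderedDict()   # texture_id -> size in MB
            self.disk_reads = 0          # how often we had to hit storage

        def request(self, tex_id, size_mb):
            if tex_id in self.cache:     # cache hit: no disk access
                self.cache.move_to_end(tex_id)
                return
            self.disk_reads += 1         # miss: stream from disk
            while self.used + size_mb > self.capacity and self.cache:
                _, evicted = self.cache.popitem(last=False)  # evict LRU
                self.used -= evicted
            self.cache[tex_id] = size_mb
            self.used += size_mb

    # Same access pattern on two hypothetical cards: the larger one just
    # produces fewer disk reads after warm-up.
    pattern = [0, 1, 2, 3, 0, 1, 4, 5, 0, 1, 2, 3] * 50
    small, large = TextureCache(vram_mb=400), TextureCache(vram_mb=800)
    for t in pattern:
        small.request(t, size_mb=100)
        large.request(t, size_mb=100)
    print(small.disk_reads, large.disk_reads)  # small card re-reads far more
    ```

    The point of the sketch is only that extra VRAM used this way is an optimization, which is why fast SSD/NVMe streaming narrows the practical gap.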
     

  5. DannyD

    DannyD Ancient Guru

    Messages:
    6,770
    Likes Received:
    3,783
    GPU:
    evga 2060
Would I love a 10 GB 3080 right now - yes.
    Does the OP have me a little worried, having driven around for a few hours with this in my thoughts - yes.
     
    Last edited: Sep 5, 2020
  6. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
    When it comes to VRAM for games, want doesn't equal need. Most games will cache assets in an effort to speed up loading times or streaming, sometimes excessively. A game may "want" 12 GB, but realistically only "need" 6 GB and still be smooth.

And the "consoles will have 16 GB" argument is tired. That memory is shared, so not all of it will be available to the game. A chunk will be dedicated to the OS and background tasks. On PC, games will also use system RAM where it makes sense; consoles don't have that luxury. After all that, the real amount of RAM being used for graphics out of that 16 GB will be much lower.
    And to top it off, when the PS4 came out with its 8 GB in 2013, it didn't suddenly make 4 GB and 6 GB cards irrelevant. Even 7 years later, 6 GB works fine for most new games. Sorry for the rant; not really related, but I've seen this a lot.
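    The shared-memory point above is just arithmetic; here is a back-of-envelope version of it (the OS reservation and CPU-side figures are illustrative assumptions, not published specs):

    ```python
    # Rough sketch of how a console's 16 GB shared pool shrinks before
    # any of it is spent on graphics. All figures are assumptions.
    total_shared_gb  = 16.0
    os_reserved_gb   = 2.5   # assumed OS / background-task reservation
    cpu_side_data_gb = 4.5   # assumed game logic, audio, AI, world state

    gpu_usable_gb = total_shared_gb - os_reserved_gb - cpu_side_data_gb
    print(f"Approx. memory left for graphics: {gpu_usable_gb:.1f} GB")
    ```

    With numbers in that ballpark, the graphics slice of a 16 GB console lands in the same region as a 10 GB discrete card, which is the argument being made.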
     
    MMXMMX and DannyD like this.
  7. Serotonin

    Serotonin Ancient Guru

    Messages:
    4,578
    Likes Received:
    2,036
    GPU:
    Asus RTX 4080 16GB
Never said I knew better. That's just you being a prick. I raised a concern as a consumer. I'd keep typing, but you used anime...
     
    jbscotchman and Dragam1337 like this.
  8. bobblunderton

    bobblunderton Master Guru

    Messages:
    420
    Likes Received:
    199
    GPU:
    EVGA 2070 Super 8gb
8~10 GB is not enough, not even close, not even at 1080p!
    If you want immersive, high-detail environments with nice layered textures, then you need 16~32 GB for a start - even more if you want to work with ray tracing. Current games won't need all of that, but next-gen stuff will.
    While the 24 GB is wonderful on the 3090, the price isn't... that should be half its current price, but with not much competition we can't do much aside from saying NO TY.
    Building a city map from scratch in BeamNG.drive, 8 GB isn't even close to enough on this 2070 Super; I really feel it should have come with 12~16 GB. My old RX 480 already had 8 GB, but the drivers were too unstable. I would have loved a 16 GB Radeon VII or a Vega 64 16 GB Creator Edition, but I couldn't be bothered messing about with the Radeon drivers again; as I said, they cost me too much in lost work.

    There are two things that currently limit game environment design: draw calls, if you're not using the DX12/Vulkan multi-threaded draw-call features, and VRAM. Triangle fill comes in a distant third, but even a half-decent card can fill millions of triangles per frame at 60 fps on a proper game engine. Without enough VRAM, the more common components of the game's 3D environment have to be culled from the main models and inserted by the game engine during play, costing more CPU resources. Anyone who's done 3D artist work knows it's a constant juggle between CPU use, GPU use, VRAM use, and so forth.

    So unless you want repeating scenery like an old cartoon, opt for the larger VRAM. Once next-gen stuff hits, you'll need it.
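    To put rough numbers on the "layered textures eat VRAM" claim, here is a hypothetical estimate (the material counts and layer counts are invented, and real engines use block compression such as BC7, which cuts these figures by roughly 4-8x):

    ```python
    # Uncompressed size of one texture; a full mip chain adds roughly
    # one third on top of the base level.
    def texture_mb(width, height, bytes_per_pixel=4, mipmaps=True):
        base = width * height * bytes_per_pixel / (1024 * 1024)
        return base * 4 / 3 if mipmaps else base

    # A hypothetical scene with layered materials: albedo + normal +
    # roughness map per unique material.
    materials = 300
    layers_per_material = 3
    for res in (1024, 2048, 4096):
        total = materials * layers_per_material * texture_mb(res, res)
        print(f"{res}x{res} textures: ~{total / 1024:.1f} GB uncompressed")
    ```

    Even with compression bringing those totals down several-fold, the jump from 2K to 4K texture sets quadruples the footprint, which is why texture-heavy world building hits VRAM limits first.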
     
    jbscotchman, Serotonin and Dragam1337 like this.
  9. narukun

    narukun Master Guru

    Messages:
    228
    Likes Received:
    24
    GPU:
    EVGA GTX 970 1561/7700
When a game uses a lot of VRAM, we don't know whether it is cache or actually in use. In the case of CoD Warzone, you get stutters when you start moving around the world, so we can say it is asking for more space than that "cache" VRAM provides, and that leads to stutter. That could happen to any game, especially 2021+ titles.
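    The stutter described here can be modelled crudely as a cliff: once the per-frame working set no longer fits in VRAM, the overflow has to cross the PCIe bus every frame. A toy sketch (the throughput and frame-time figures are assumptions, not measurements):

    ```python
    # Toy frame-time model: overflow beyond VRAM capacity is re-streamed
    # over PCIe each frame. All numbers are hypothetical.
    PCIE_GBPS = 13.0       # assumed effective PCIe 3.0 x16 throughput
    BASE_FRAME_MS = 12.0   # assumed render time when everything is resident

    def frame_ms(working_set_gb, vram_gb):
        overflow_gb = max(0.0, working_set_gb - vram_gb)
        transfer_ms = overflow_gb / PCIE_GBPS * 1000.0
        return BASE_FRAME_MS + transfer_ms

    for ws in (8.0, 10.0, 10.5, 11.0):
        print(f"working set {ws:4.1f} GB -> {frame_ms(ws, vram_gb=10.0):5.1f} ms")
    ```

    The sketch shows why the failure mode is abrupt: frame times stay flat right up to the capacity limit, then spike as soon as the working set spills over, which matches the "fine, fine, fine, stutter" pattern people report.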
     
  10. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
@bobblunderton correlation does not mean causation; your examples are irrelevant.
    VRAM usage in modeling programs, renderers, and engine editors is not the same as in actual games.
    The 16 GB Radeon VII gets beaten all the time by GPUs with half the VRAM, even in games capable of using 16 GB. Refer to my previous post for how reality works.
     
    Last edited: Sep 5, 2020

  11. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
Nvidia has made several GPUs where VRAM was a limiting factor from the get-go at high res. On the 680 and 780 I had to turn down texture settings in some games because of insufficient VRAM, and got stutter from VRAM swapping if I didn't.

    10 GB might be sufficient at 1080p, but at 4K? If it's sufficient at all now, it won't be for long...
     
    yasamoka likes this.
  12. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,017
    Likes Received:
    7,353
    GPU:
    GTX 1080ti
That is your (incorrect) belief that ultimate/ultra settings are meant for actual gaming.
     
  13. Undying

    Undying Ancient Guru

    Messages:
    25,354
    Likes Received:
    12,756
    GPU:
    XFX RX6800XT 16GB
They are, otherwise those settings wouldn't be there.
     
  14. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,017
    Likes Received:
    7,353
    GPU:
    GTX 1080ti
No, they are for screenshots and offer almost zero visible improvement, as demonstrated constantly in quality and performance reviews that say exactly what I'm saying now.
     
  15. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
There is a visual difference, and there is a reason I play at 4K in the first place... I care about the visual difference. Thus I need a GPU that is sufficient for it - including a sufficient amount of VRAM.

    But you can play at 1080p low settings for all I care.
     
    jbscotchman and Undying like this.

  16. Undying

    Undying Ancient Guru

    Messages:
    25,354
    Likes Received:
    12,756
    GPU:
    XFX RX6800XT 16GB
I'm sorry, I just can't comprehend that statement. None of us would be upgrading if that were the case.
     
    toyota, jbscotchman and Dragam1337 like this.
  17. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,017
    Likes Received:
    7,353
    GPU:
    GTX 1080ti
No, there isn't.
     
  18. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
For some settings, I'm inclined to agree that the visual difference is minimal - but for textures in particular it is quite noticeable, and it's textures that are the most VRAM-demanding, and thus usually the first setting you have to lower if you're short on VRAM. Been there, don't want to go there again.
     
    jbscotchman likes this.
  19. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
Shadows can take a good chunk too, next to textures and the display resolution itself, but it's also one of those settings where both the implementation and how big a difference it makes visually differ from game to game.
    Volumetric effects tend to be what really hits framerate, while making a less distinguishable difference to visual quality - especially when some very fine, very costly ultra effect can eat 20-30% of the GPU's total performance.

    I like the extra visual fidelity, though it does come at a performance cost. Realistically, scaling down volumetric lighting, fog, and (a recent addition in games) clouds trades a small loss of visual quality for a good chunk of performance - a pretty good trade-off to retain a good framerate. :D

    Screen-space reflections are getting more costly too, as some games move to native-resolution reflections and much more of them; Borderlands 3 and Horizon are good examples, though Borderlands employs a variety of UE4 techniques as well. (There's even an ultra quality mode, but that one can halve the framerate; the reflection is then almost noise-free, so it's a noticeable step up visually, although much too costly outside of the highest-end GPU hardware.)

    MSAA is sort of part of this too, though it's phasing out in favor of faster temporal anti-aliasing solutions. Those have more blur, but they also cover more than geometry anti-aliasing, and some games are starting to be almost built around having them active, so disabling them makes things look pretty bad, or almost broken.


    EDIT: You could also cut the display resolution, but that looks terrible, especially on modern screens. It gives a good VRAM and performance boost, but short of using windowed display mode it's not going to look good.
    (Back-buffer resolution-slider style scaling can work by at least preserving UI element clarity, I suppose.)

    But yeah, outside of dropping the texture resolution (or the cache memory amount, where that method is used), it's usually shadows; VRAM usage is often more lightweight among the other settings. (A bit extra for MSAA the few times it's available, or for some older games, including TXAA-supporting ones.)


    EDIT: Also agreeing that the ultra preset is often a small step up from very high, though some settings are nice to have: LOD and pop-in/fade-in reduction, for instance, though overdone this really hinders both GPU and CPU performance.

    Shadows have so many variants in implementation and method - cascaded shadow mapping, various filter effects, plus vendor solutions, particularly NVIDIA's PCSS and the less often used HFTS modes.
    (So it can be anything from a noticeable loss of shadow quality and detail to a noticeable improvement in performance with limited perceived image-quality loss, other than perhaps some very fine detailing or more distant shadows.)
     
    Last edited: Sep 5, 2020
20. Right, although... how much was intended to be future-proof, and how similar could this turn out to the 7xx series, when we had Titan-like performance that ran into VRAM bottlenecks about a year or so after release? I completely submit I may be totally overreacting, and NVIDIA may have done their best to learn from that experience and gauge the needs better this time.
     
    Dragam1337 likes this.
