Nvidia has landed in a very dangerous position - if I were an exec, I'd be pretty nervous right now.

Discussion in 'Videocards - NVIDIA GeForce' started by Rob761, Nov 5, 2020.

  1. Rob761

    Rob761 New Member

    Messages:
    4
    Likes Received:
    4
    GPU:
    RTX 2080
    The culprit? The cause of this awful misstep? GDDR6X.

    I'm surprised Nvidia made the mistake of choosing a small amount of expensive memory over a decent amount of cheaper memory. Everyone, including Nvidia, is well aware that the video games industry is driven forward by home consoles. And everyone knew that Microsoft and Sony were launching new generations this holiday season. And with every new console generation comes the next big leap in hardware specs, including (most importantly right now, from Nvidia's point of view) video memory quantity. Roughly speaking, it is doubling from the perspective of game development: console total system RAM is going from 8GB to 16GB, and you can see from Microsoft's design that they intend 10 of the 16 to be dedicated to graphics, with 6 for general processing.

    So why would anyone buy a 3070 with 8GB of VRAM when they can get an Xbox with 10GB?? Or a 3080 with 10GB for $700?? On the other hand, when they see a basic AMD 6800 with 16GB of VRAM, they may well say "hey, forget Xbox, look at this".

    I've been in the green corner ever since the GTX 580; the last AMD product I had was an HD 4870. I had no plans of going back to Radeon, but in the current situation I literally don't have any alternative. 10GB of VRAM simply will not be enough for the coming PC generation - fine for the consoles, but not for the PC.

    What could Nvidia do now to compete, or ideally get back in the driving seat? Well, I'm presuming the problem is GDDR6X: it's too new and too expensive to double its quantity on every product. But hey, the 3090's got 24GB, so no problem there - if you don't mind 350 watts and 1500 bucks for a video card. The 3070's got GDDR6, so no problem there either; that could be doubled like AMD's. The problem is the 3080. Nvidia would have to put out a new product called RTX 3080 20GB using GDDR6 instead of 6X. Memory speed or bandwidth has never been an issue or bottleneck in my experience. The potential gains from GDDR6X over GDDR6 are just too small for the terrible price of being uncompetitive.
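[Editor's note] The GDDR6-vs-GDDR6X trade-off weighed above comes down to per-pin data rate. A minimal sketch of the peak-bandwidth formula; the data rates used are the commonly quoted launch-era figures (19 Gbps GDDR6X on the 3080, 14 Gbps for typical GDDR6), not exact figures for any hypothetical 20GB card:

```python
# Peak memory bandwidth = (bus width in bits / 8 bits-per-byte) * per-pin data rate.
# Data rates in Gbps are illustrative launch-era figures.

def peak_bandwidth_gb_per_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak bandwidth in GB/s for a given bus and per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gb_per_s(320, 19))  # 760.0 GB/s - 3080-style bus with GDDR6X
print(peak_bandwidth_gb_per_s(320, 14))  # 560.0 GB/s - same bus with plain GDDR6
```

So on the same 320-bit bus, swapping GDDR6X for 14 Gbps GDDR6 gives up roughly a quarter of the peak bandwidth - the quantity the post judges "too small" to matter versus doubling capacity.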

    Would I buy an RTX 3080 with 10GB of GDDR6X? Not in a gazillion years. Would I buy a new Radeon 16GB card? If I have no alternative, like now, then yes. Would I buy an RTX 3080 with 20GB of GDDR6? Certainly (although tbh I'm not a fan of 320 watts - I might be more interested in the 220 watts of the 3070). And Nvidia would be back (firmly) in the driving seat ;)
     
    SatsuiNoHado likes this.
  2. The Goose

    The Goose Ancient Guru

    Messages:
    2,631
    Likes Received:
    150
    GPU:
    MSIrtx2080 superXS
    I don't know where you got your pricing from... wasn't from here https://www.ebay.co.uk/itm/EVGA-GeF...813688?hash=item2f4d910978:g:qMcAAOSwwVZfoGiT . As for the 3070... not everybody needs an overpowered system. As I have no interest in next-gen games, the MSI 3070 X Trio I have on pre-order is a nice boost over my current 2080S for the games I play:
    The Division 1/2, Euro Truck Simulator 2/ATS, F1 2020 and Assetto Corsa Competizione.
     
  3. Kevin Mauro

    Kevin Mauro Member Guru

    Messages:
    189
    Likes Received:
    51
    GPU:
    GeForce GT 710 2GB
    What an odd first post/thread/double whammy. Like reading from the mind of a deranged person. Or Google Translate... *shrugs*
     
    Tyrchlis likes this.
  4. metagamer

    metagamer Ancient Guru

    Messages:
    1,865
    Likes Received:
    731
    GPU:
    Palit GameRock 2080
    Great first post.
     
    DannyD likes this.

  5. lmimmfn

    lmimmfn Ancient Guru

    Messages:
    10,445
    Likes Received:
    123
    GPU:
    AorusXtreme 1080Ti
    Surely a troll account?
    "Why choose 3070 with 8 GB when xbox has 10GB"
    Are you age 10?
     
  6. wavetrex

    wavetrex Maha Guru

    Messages:
    1,357
    Likes Received:
    967
    GPU:
    Zotac GTX1080 AMP!
    He's not wrong...

    [image]
     
  7. Kool64

    Kool64 Maha Guru

    Messages:
    1,003
    Likes Received:
    378
    GPU:
    Gigabyte RTX2070S
    DLSS is why.
     
  8. Maddness

    Maddness Maha Guru

    Messages:
    1,459
    Likes Received:
    674
    GPU:
    3080 Aorus Xtreme
    That remains to be seen. I wouldn't have bought my 3080 if I believed it for an instant. Time will tell whether it is or isn't. At this point in time it is more than enough. I've basically thrown my whole gaming collection at the 3080 @ 4K ultra and it has been more than able to handle it.
     
  9. pharma

    pharma Ancient Guru

    Messages:
    1,655
    Likes Received:
    473
    GPU:
    Asus Strix GTX 1080
    Curious that the OP opens the thread with his first post at Guru3D, without any additional dialogue.
     
  10. DocStr4ngelove

    DocStr4ngelove Master Guru

    Messages:
    853
    Likes Received:
    650
    GPU:
    MSI RTX2080 Super G
    Who says Nvidia isn't producing new 3000-series cards with more RAM as we speak? Maybe the 3080 with 16GB is just around the corner.

    Honestly, I couldn't care less atm. With a 2080S I can just sit here and wait and see what happens till 2022.
     

  11. Kevin Mauro

    Kevin Mauro Member Guru

    Messages:
    189
    Likes Received:
    51
    GPU:
    GeForce GT 710 2GB
    Nothing "wrong" with Ampere as it stands, beyond supply really. AMD is offering 16GB of VRAM for various reasons, among them the Radeon VII "technically" having been the last HEDT card prior to the three cards they just launched. I always thought they were foreshadowing: by selecting 16GB of HBM2, they had decided to continue that amount of VRAM in future HEDT-tier cards. Here we are, with games and graphics design requiring new norms. It makes sense to me in the way many users here say "well, I consider 16GB of system RAM a standard as opposed to 8" - it's a similar aspect here. 8GB can be a bottleneck for PCs; consoles may be able to slide by, but even they are now allocating more in that regard.
     
    Last edited: Nov 6, 2020
  12. metagamer

    metagamer Ancient Guru

    Messages:
    1,865
    Likes Received:
    731
    GPU:
    Palit GameRock 2080
    I know the feeling, I've been waiting for something to upgrade to for a couple of years now. But, everything I play runs great so I'm holding off.

    And the longer I wait, the better that upgrade will feel. I always used to upgrade when shiny new tech came out but the upgrades always felt marginal. I think I'll be happier if I wait and get a proper performance boost next time around.
     
    Maddness and DocStr4ngelove like this.
  13. N0sferatU

    N0sferatU Ancient Guru

    Messages:
    1,731
    Likes Received:
    68
    GPU:
    EVGA RTX 3080 Ultra
    Frame rates and real-world performance sell, not RAM. Both the 3000-series and the 6700/6800/6900 etc. cards will perform well and sell well, that's for certain.

    Most people upgrade so frequently in this hobby that by the time the RAM is ever a restriction, most will be well beyond the 3080 and even the 3090. It's just the envy of those who don't have the cards that keeps them whining about the card. OMG, it can't overclock well. OMG, it's 10GB of RAM. Keep whining, I'll enjoy my 3080.
     
    Maddness and Tyrchlis like this.
  14. Rob761

    Rob761 New Member

    Messages:
    4
    Likes Received:
    4
    GPU:
    RTX 2080
    My goodness I did not expect that many responses! :D

    Are you really referring to my bank balance?? But.. but what do you know about that? And.. and how?? :D Hehe no my friend there's no problem there thank God :)

    Yes, fellas, I wasn't actually talking about the present at all; I am also very happy with my current setup in this generation. I'm talking about the future, the next 3 years or so, as all games at console level start using 10GB of VRAM as the NORM. And PC VRAM usage has always greatly exceeded that of consoles. You think maybe AMD's intimate involvement in the design of the new consoles had anything to do with them slapping 16GB of VRAM on ALL (even the most basic) of their new 6800 cards?

    Please explain what exactly was unreasonable or stupid about my post, and I will do my best to clear it up for you.

    Thank you sir, you're a gentleman :)
     
  15. metagamer

    metagamer Ancient Guru

    Messages:
    1,865
    Likes Received:
    731
    GPU:
    Palit GameRock 2080
    Ok, so you are real, huh. Welcome brother.
     

  16. dampflokfreund

    dampflokfreund Member Guru

    Messages:
    175
    Likes Received:
    13
    GPU:
    8600/8700M Series
    Nvidia still has a trump card on their side: their tensor cores. And machine learning will be huge in upcoming games.

    A machine-learning-based texture upscaler or VRAM compression running on tensor cores would make VRAM a non-issue for all RTX GPUs.
     
  17. DannyD

    DannyD Ancient Guru

    Messages:
    1,605
    Likes Received:
    1,222
    GPU:
    MSI 2080ti
    [image]
     
    Tyrchlis likes this.
  18. lmimmfn

    lmimmfn Ancient Guru

    Messages:
    10,445
    Likes Received:
    123
    GPU:
    AorusXtreme 1080Ti
    Consoles have a shared memory pool: 10GB on a console is for game code and graphics. An 8GB GPU on a PC is used for graphics only; system RAM is used for game code execution, with some overlap in system RAM where textures/objects are loaded before being transferred to the GPU, then flushed if not needed.

    An 8GB GPU on a PC has more graphics memory than the 10GB on a console, because the console also needs game code and the OS in its shared RAM.
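[Editor's note] The split described above can be put in back-of-the-envelope numbers. The OS and game-code reservations below are assumed round figures for illustration only, not official console specs:

```python
# Rough effective-graphics-memory comparison: console unified pool vs.
# discrete PC GPU. All reservation figures are illustrative assumptions.

def console_graphics_budget(total_pool_gb: float,
                            os_reserved_gb: float,
                            game_code_gb: float) -> float:
    """Memory left for graphics out of a console's unified pool after
    the OS reservation and the game's code/logic allocations."""
    return total_pool_gb - os_reserved_gb - game_code_gb

def pc_graphics_budget(vram_gb: float) -> float:
    """A discrete PC GPU's VRAM is (almost) entirely available for
    graphics; game code and the OS live in separate system RAM."""
    return vram_gb

# Hypothetical split: 16GB pool, ~2.5GB OS reservation, ~6GB game code/data.
console = console_graphics_budget(16, 2.5, 6)   # 7.5 GB left for graphics
pc = pc_graphics_budget(8)                       # 8.0 GB for graphics

print(f"Console graphics budget: {console} GB")
print(f"PC graphics budget:      {pc} GB")
```

Under these assumed reservations, the 8GB discrete card ends up with slightly more memory purely for graphics than the console's shared pool - which is the point the post is making.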
     
    tfam26 likes this.
  19. Gomez Addams

    Gomez Addams Member Guru

    Messages:
    166
    Likes Received:
    94
    GPU:
    Titan RTX, 24GB
    I don't know of anyone who would base their decision to buy a video card or a console on the amount of memory it has. That makes very little sense to me but, of course, opinions vary.
     
  20. Astyanax

    Astyanax Ancient Guru

    Messages:
    9,457
    Likes Received:
    3,303
    GPU:
    GTX 1080ti
    ok 2 posts dude.

    :rolleyes:

    Me.

    There are no 16GB 3080s coming:

    1> The 384/320-bit interfaces don't permit it in a single-density config.
    2> There aren't any 512MB modules in production to fill out a mixed-density config (clamshell 10x1GB + 10x512MB), and that would be 15GB anyway.
    3> Cutting down to 256-bit would be pointless.
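[Editor's note] The arithmetic behind those three points can be sketched directly. The sketch assumes one 32-bit channel per GDDR6/GDDR6X module (how these cards are wired), uniform module density per card, and the 1GB/2GB chip densities in production at the time:

```python
# Which VRAM capacities a given memory bus width permits, assuming one
# 32-bit channel per module and a uniform module density across the card.
# Clamshell mode mounts two modules per channel, doubling the count.

def capacities(bus_width_bits: int,
               densities_gb=(1, 2),
               clamshell: bool = False) -> list:
    """All total-VRAM options (in GB) for this bus, sorted ascending."""
    channels = bus_width_bits // 32
    modules = channels * (2 if clamshell else 1)
    return sorted({modules * d for d in densities_gb})

print(capacities(320))                        # [10, 20] - 3080's bus: 10GB or 20GB, never 16GB
print(capacities(256))                        # [8, 16]  - 16GB would require cutting to 256-bit
print(capacities(384, (1,), clamshell=True))  # [24]     - the 3090's clamshell config
```

This is why a 320-bit 3080 can only be 10GB or 20GB with uniform density: 16GB only falls out of a 256-bit bus, which is point 3 above.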
     
    Last edited: Nov 7, 2020

Share This Page