Nvidia has landed in a very dangerous position - if I were an exec, I'd be pretty nervous right now.

Discussion in 'Videocards - NVIDIA GeForce' started by Rob761, Nov 5, 2020.

  1. Undying

    Undying Ancient Guru

    Messages:
    25,469
    Likes Received:
    12,876
    GPU:
    XFX RX6800XT 16GB
I assume you bought the $2,500 Titan because it's the fastest, not because of the amount of memory it had. A 2080 Ti would make more sense, but opinions vary.
     
  2. Martigen

    Martigen Master Guru

    Messages:
    535
    Likes Received:
    254
    GPU:
    GTX 1080Ti SLI
    Well, there's this if it's true:

    https://wccftech.com/nvidia-geforce-rtx-3080-ti-20-gb-graphics-card-specs-leak/

Though technically it sounds like a 3090 with less memory. However, it also states a 320-bit interface - which, while it would explain the memory configuration, I'm not sure how that works if it's essentially the same chip going by core counts.
     
  3. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,035
    Likes Received:
    7,378
    GPU:
    GTX 1080ti
20×1GB modules in clamshell config; the 3090 is 24×1GB.

If 2GB modules were already in production, the 3090 would be a bit cheaper.
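The module counts fall straight out of the bus width, assuming standard GDDR6 (one 32-bit channel per module, clamshell mode putting two modules on each channel) - a quick sketch, not anything from Nvidia:

```python
# Module counts from bus width, assuming standard 32-bit GDDR6 channels.
# Clamshell mode hangs two modules off each channel (each using 16 bits).
def module_count(bus_width_bits, clamshell=False):
    channels = bus_width_bits // 32
    return channels * (2 if clamshell else 1)

# Rumored 3080 Ti: 320-bit bus, clamshell 1GB modules -> 20 modules = 20GB
print(module_count(320, clamshell=True))   # 20
# 3090: 384-bit bus, clamshell 1GB modules -> 24 modules = 24GB
print(module_count(384, clamshell=True))   # 24
```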
     
    Tyrchlis likes this.
  4. Arctucas

    Arctucas Guest

    Messages:
    2,169
    Likes Received:
    61
    GPU:
    eVGA RTX2080 FTW3
    Priorities.

    E-Peen.

    Hostility and lack of civility.

    Brave new world...
     

  5. DannyD

    DannyD Ancient Guru

    Messages:
    6,836
    Likes Received:
    3,802
    GPU:
    evga 2060
It's these 'Rona times we're going through - toilet-paper bandits, food hoarding and tech scalping. It's the new norm.
     
    Tyrchlis likes this.
  6. Gomez Addams

    Gomez Addams Master Guru

    Messages:
    255
    Likes Received:
    164
    GPU:
    RTX 3090
Your assumption is mistaken. We got two of them because they had the most memory available. I do HPC programming with CUDA and we wanted as much memory as possible. Two Titans with 48GB and 9,216 cores combined for $5K is a pretty good deal, even now.
     
  7. Rob761

    Rob761 Guest

    Messages:
    4
    Likes Received:
    4
    GPU:
    RTX 2080
    Series X "ships with 16GB of GDDR6 SDRAM, with 10GB running at 560GB/s primarily to be used with the graphics system and the other 6GB at 336GB/s to be used for the other computing functions."
The PS5 has 16GB, but it's a uniform pool running at 448GB/s. I suppose since it's uniform, devs could use 11GB, maybe even 12, for VRAM if they want.
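Those figures check out as bus width × per-pin data rate, assuming the widely reported 14 Gbps GDDR6 in both consoles (a sanity check, not from the quoted spec sheets):

```python
# Peak bandwidth in GB/s = (bus width in bits / 8 bits per byte) * Gbps per pin
def peak_bandwidth_gbs(bus_width_bits, gbps_per_pin):
    return bus_width_bits / 8 * gbps_per_pin

print(peak_bandwidth_gbs(320, 14))  # Series X fast pool: 560.0 GB/s
print(peak_bandwidth_gbs(192, 14))  # Series X slow pool: 336.0 GB/s
print(peak_bandwidth_gbs(256, 14))  # PS5 uniform pool:   448.0 GB/s
```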


    @Tyrchlis
Who asked you to apologize? Not me. And for what? To be frank, I don't even understand the 'sour grapes' charge - I haven't lost anything, I'm not Donald Trump or whatever! As a long-time customer, I simply think it's poor product design by Nvidia, that's all.

I'm sorry you seem to have difficulty with me posting an opinion without a history here (I had another account some years back, around 30 or so posts, but the email no longer exists so I had to make a new account!). It's just never been a great priority for me personally to be active in forums - that doesn't necessarily mean I know jack **** though, does it? This is my history with PC cards: 8800 GTX, 9800 GTX (that was a pretty clueless upgrade - but I was a beginner back then, a college freshman! Oh God, I feel so old :confused:), HD 4870, GTX 580, GTX 680, GTX 980, GTX 1080, and currently RTX 2080.

As for posting benchmarks of the 6800, I don't even know if I'm going to buy it yet. Like many others have said, I'm holding off for the moment to see what happens in the market. As Elvis would say, "only fools rush in" (wait, that's wrong - he would say wise men say "only fools rush in" :D). So I'm afraid all I can do is think logically and reasonably about my next purchase before forking out hundreds of dollars.

    "unsustainable by facts"
    I agree with you at least on the importance of focusing on facts. These are facts:
The new consoles go on sale in a couple of days. Console games (sadly, I suppose) determine game development and game specs; PC follows and expands on what the consoles are doing, using more power and more resources. Console VRAM is now (as of Tuesday) at least 10GB, where it had been roughly 5GB up to now. Developers used the 5GB and they will use the 10GB, and as ever the PC, with its extra quality and extra settings, will expand on that. Console and PC games looked good in the generation we're leaving; they are going to look absolutely mind-blowing in the generation we're entering. The doubling of VRAM will play a very large role in that. 10GB for a top PC card is insufficient from THIS POINT ON. And 8GB is now 20% less than the consoles. Which is why I wonder about a connection between AMD's close involvement in the consoles' design and their opting for 16GB on all their new PC products.

Power consumption, however, is another thing I would not turn a blind eye to. I'm at 215W at the moment with the 2080, and to jump from that to 320W (touching on a 50% increase) would be over the top, I think. Particularly when transistors have gotten smaller by a third. I would set a limit at 250W, but I'd ideally prefer to stay around where I am now. Remember, Nvidia's greatest and most lauded successes over the past 10 years were the 980 (4.6 TFLOPS for a stunning 165W) and the 1080 (8.2 TFLOPS - 8.8 with boost - for an equally amazing 180W). Now, just two gens later, the 80-class card is at 320W? This is not movement in the right direction, or efficient design, or laudable design. I remember power-inefficient cards like the HD 2900 and GTX 480 getting slated in reviews for being just that - power inefficient. No mention of it these days, seemingly. The AMD/Nvidia duopoly is powerful enough without its customers being unwilling, or the press being afraid, to hold it to account when there are shortcomings.
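For what it's worth, the efficiency picture can be put side by side. The 980 and 1080 figures are the ones quoted in the post; the 3080's 29.8 TFLOPS is Nvidia's published spec, added here as an outside figure - and note Ampere's FP32 number counts its doubled FP32 datapath, so it flatters the comparison:

```python
# GFLOPS per watt, using (TFLOPS, board power in W) per card.
# 980/1080 numbers are from the post; the 3080 entry is Nvidia's
# published spec (assumption added for comparison, not from the thread).
cards = {
    "GTX 980":  (4.6,  165),
    "GTX 1080": (8.2,  180),
    "RTX 3080": (29.8, 320),
}
for name, (tflops, watts) in cards.items():
    print(f"{name}: {tflops * 1000 / watts:.0f} GFLOPS/W")
```

By this (imperfect) metric perf-per-watt still improved; the poster's complaint is about the absolute ~50% jump in board power, which the numbers also show.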
     
    lmimmfn, Tyrchlis and wavetrex like this.
  8. FatBoyNL

    FatBoyNL Ancient Guru

    Messages:
    1,861
    Likes Received:
    235
    GPU:
    RTX 4080 Suprim X
  9. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
This is like looking at a historically well-managed city that is about to expand and saying: water supply has never been an issue in my experience, so we don't need more water or more pressure capacity.

The reason bandwidth has never seemed like an issue is that some very smart people are behind GPU design. Bandwidth has always been an issue - a critical issue; they just made it invisible to you.
Nvidia would have loved to use GDDR6 exclusively instead of inventing GDDR6X. AMD dealt with it by introducing a massive cache. So apparently bandwidth is an issue... duh
     
    Tyrchlis, endbase and Maddness like this.
  10. Freeman

    Freeman Master Guru

    Messages:
    600
    Likes Received:
    164
    GPU:
    4060 ti 8gb
When you say "games are going for consoles" - and phones too - yeah, it's true, and worrying. But that's worrying for all computer components.
AMD is sure making bank with the argument "it might run better because the PS5 is on AMD." We will see.
     
    Tyrchlis likes this.

  11. Rob761

    Rob761 Guest

    Messages:
    4
    Likes Received:
    4
    GPU:
    RTX 2080
  12. CrazyGenio

    CrazyGenio Master Guru

    Messages:
    455
    Likes Received:
    39
    GPU:
    rtx 3090
@Rob761 Apparently you don't know where you are. This is a forum where all or most people can afford an x80 or x90 or high-end Ti card, and most of all you are on the Nvidia side of the forum. People don't buy Nvidia cards for future-proofing, or to keep a PC for the long run. The only Nvidia cards that really lasted very long were the 1000 series - I don't think even Nvidia expected that. But if you have been an Nvidia user for a long time, you will know that if you buy an Nvidia card, it's for the bragging rights, the brand and the logo, and because it's fast on day one - but it will not last more than a year.

So it's natural that you find this kind of people who just dismiss your whole opinion or argument with personal attacks, like the already-mentioned 'sour grapes' ad hominems, or the ad crumenam that if you don't buy it, it's because you can't afford it. People here will attack you personally with fallacies to discredit your arguments or opinion even if you are right, because most of them can afford to buy a high-end GPU every year, so things like "not enough VRAM", "not enough power" and the "ray tracing bad" memes just pass over them - if a frame drops, they will just throw the GPU in the garbage bin and buy another. That's the reality of this forum.

This is my advice to you: just leave it there. This kind of post doesn't attract the right kind of people.

And I say this because something similar happened to me years ago, so I have experience with this. I was called a troll, a moron, an AMD fanboy, a poor third-world-country scrub and everything you can imagine (hell, I even got PMed and called a pedophile, because someone searched my username on Google and found an ugly Brazilian dude who looked like a pedo but had my same username - people here are salty enough to search for you on Google), only because I mentioned that 2GB/3GB on the 600/700 series was not enough VRAM, and even called the 900 series a scam for being just 1GB better, while also mentioning the 8GB shared memory pool of the PS4 and Xbox One. A similar case to yours.

And I was right: Mirror's Edge Catalyst, recent Call of Duty games, the Titanfall games, Doom 2016 and many others required at least 5GB of VRAM to reach even console quality. Try to run Doom 2016 on high quality on the 980 - a $500 GPU that was more expensive than the PS4 at that time (not today) - and it will run well for a few levels, then start to choke the more you play. The 600/700 lifespan was reduced drastically and the 900 series fell off really quickly, showing clearly that VRAM IS VERY IMPORTANT. And it doesn't matter that the consoles have a shared memory pool; the PC ports will require more VRAM than the console memory pools. This was proved with AMD's R9 300 series, and by how the R9 Fury X, the supposed 980 Ti killer, died and can't even reach RX 580 performance - only because of its 4GB of VRAM, even if it's fast HBM.

So these topics are just déjà vu every gen: https://forums.guru3d.com/threads/nvidia-4-gb-and-2gb-cards.389979/. You will never find people who will support your claims, only personal attacks. I'd suggest you leave the topic there and add people like @Tyrchlis to your ignore list, because those people have the same stubbornness as an anti-vaxxer or a flat-earther, will not listen to anything you have to say, and will only speak to you in ad hominems. So for your sanity and the rest of the forum's, leave all this here and continue with your life.
     
    Last edited: Nov 10, 2020
  13. vf

    vf Ancient Guru

    Messages:
    2,185
    Likes Received:
    310
    GPU:
    ATi Radeon™
    I remember my GeForce 3s lasting more than 1 year with many games. At least 3 years before I changed to ATi.

    6800 Ultra lasted several years at least. As did the 7800GS+.

    The only one I would say I ditched fast was the 5800 Ultra, it was so loud it was crazy.

Even the 1080Ti FTW3 is lasting well to this day from launch, playing any game I have very well. My main at the moment is The Division 2, as its open world is very heavy on graphics.

Bragging rights? Nope. Won't last more than one year? Nope. Even my warranty is up on the 1080Ti - that's 3 years.
     
    Maddness likes this.
  14. Freeman

    Freeman Master Guru

    Messages:
    600
    Likes Received:
    164
    GPU:
    4060 ti 8gb
    Console games on Windows are still Console games on Windows. XD
     
  15. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    hehe :D

    PS
Yeah, I am easy to amuse, but tbh I don't see anything inept from him. Not in this thread at least, and I'm not going to dig through his history.
     
    Last edited: Nov 11, 2020
    Tyrchlis likes this.

  16. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
Yeah, I get it. He was raising hell in that other thread. Here he opines that the 600/700 series should have come with more memory. OK, that's an opinion, and not completely out of this world.

The OP, on the other hand... he wants a 20GB RTX 3080 with GDDR6...
Getting a slower card today so that in the future it perhaps becomes less slow. LOL

And all this because he apparently believes that bandwidth is not an issue. He actually wants 2080 Ti levels of bandwidth for a card that more than DOUBLES (almost triples) its FLOPS??!!!
Furthermore, he thinks that's such a good idea that he plastered it all over the internet :D
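The point can be put in numbers as bandwidth per FLOP. The figures below (616 GB/s and 13.4 TFLOPS for the 2080 Ti, 760 GB/s and 29.8 TFLOPS for the 3080) are the public specs, not anything from this thread, and Ampere's doubled-FP32 TFLOPS figure exaggerates the drop somewhat:

```python
# Bytes of memory bandwidth available per FP32 FLOP (public spec figures).
def bytes_per_flop(bandwidth_gbs, tflops):
    return bandwidth_gbs / (tflops * 1000)

print(f"{bytes_per_flop(616, 13.4):.3f}")  # 2080 Ti: 616 GB/s GDDR6
print(f"{bytes_per_flop(760, 29.8):.3f}")  # 3080:    760 GB/s GDDR6X
```

Even with GDDR6X, the 3080 has roughly half the bytes per FLOP of the 2080 Ti; a 20GB card stuck on plain GDDR6 bandwidth would sit lower still.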
     
    Last edited: Nov 12, 2020
  17. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
Well, this aged well :D

Historically high market share along with insane profits.
THINK TWICE before deciding to bet against the GREEN MACHINE. <- free tip
     
  18. RealNC

    RealNC Ancient Guru

    Messages:
    5,075
    Likes Received:
    3,350
    GPU:
    4070 Ti Super
    Right :p
     
  19. Shagula

    Shagula Master Guru

    Messages:
    600
    Likes Received:
    72
    GPU:
EVGA 3090 Ultra
    • Record quarterly and full-year revenue for company, Gaming and Data Center
    • Company quarterly revenue of $5.00 billion, up 61 percent year on year
    • Company full-year revenue of $16.68 billion, up 53 percent
Yeah, I think they're in huge trouble - bankrupt by next week, no doubt. I'm yet to meet a game my 3080 runs out of VRAM in, including games like Days Gone, where I'm playing at 4K with the resolution scale pushed up a further 20%... Edit: Days Gone at 4K with 120% resolution scale only uses 7GB of VRAM. A 3070 would run out of grunt before VRAM at this point.
     
    Last edited: May 27, 2021
    pharma, Maddness, GoldenTiger and 3 others like this.
  20. Witcher29

    Witcher29 Ancient Guru

    Messages:
    1,708
    Likes Received:
    341
    GPU:
    3080 Gaming X Trio
RE Village, but that game is sponsored by AMD xD
     