NVIDIA GeForce RTX 3090 and 3080 Specifications Leak

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 28, 2020.

  1. Glottiz

    Glottiz Ancient Guru

    Messages:
    1,949
    Likes Received:
    1,171
    GPU:
    TUF 3080 OC
    People who don't understand the VRAM problem are being shortsighted. I guess most of you started playing on PC only this generation? There was no major increase in VRAM requirements for years only because we were stuck on the PS4 and Xbox One generation for 7 years, and all the games we got on PC were targeting the PS4 or Xbox One during their development. When real next-gen games come out that target the PS5 and Xbox Series X, VRAM requirements will jump up significantly.

    It's really odd that you have to spell this out on a PC enthusiast forum. I mean, it's the most common knowledge for everyone who has gamed and built PCs for more than one console generation.
     
  2. pegasus1

    pegasus1 Ancient Guru

    Messages:
    5,184
    Likes Received:
    3,582
    GPU:
    TUF 4090
    Stalker as well; my 8800 GTX was fine, but many with the GTS were running into issues with low VRAM.
     
  3. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    5,240
    Likes Received:
    1,604
    GPU:
    RTX 3060 12GB
    I was really referring to tech on a port standard there with HDMI - there is no logical reason NOT to have 2.1 ports, yet so many companies just stick with 2.0b and say 'screw it'.

    I'm hearing that my new monitor will NOT have HDMI 2.1 when it is finally released, even though its specification is practically begging for HDMI 2.1 (in daisy-chain)

    https://www.asus.com/uk/Monitors/ProArt-Display-PA32UCG/
     
    XenthorX likes this.
  4. sbacchetta

    sbacchetta Member Guru

    Messages:
    141
    Likes Received:
    66
    GPU:
    GTX 1080ti
    In theory you are wrong: VRAM is a frame buffer, and whether you have enough of it doesn't depend on your FPS but on your "image quality" (by contrast, a low memory bandwidth problem will be more apparent at high FPS). When you don't have enough, you get stutter. But it's true that stutter is more annoying the higher your FPS was.

    But in practice you are right, because the weaker your card is, the lower your resolution or image settings will be, so the lower your VRAM usage will be.
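
    As a rough illustration of why VRAM capacity tracks resolution and image quality rather than FPS, here is a back-of-the-envelope sketch (my own numbers, purely illustrative) estimating uncompressed render-target memory at common resolutions; real games pile textures, geometry and caching on top of this, but the scaling with pixel count is the point:

    Code:
    # Rough per-frame render-target memory vs. resolution (illustrative only).
    # Assumes one HDR colour target (8 bytes/px), a depth buffer (4 bytes/px)
    # and a few G-buffer targets (16 bytes/px combined); real engines vary widely.
    BYTES_PER_PIXEL = 8 + 4 + 16

    for name, (w, h) in {"1080p": (1920, 1080),
                         "1440p": (2560, 1440),
                         "4K": (3840, 2160)}.items():
        mb = w * h * BYTES_PER_PIXEL / (1024 ** 2)
        print(f"{name}: ~{mb:.0f} MB of render targets per frame")

    The per-frame cost roughly quadruples going from 1080p to 4K no matter how many frames per second you render, which is why capacity problems show up as you push resolution and settings, not frame rate.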
     

  5. XenthorX

    XenthorX Ancient Guru

    Messages:
    5,059
    Likes Received:
    3,438
    GPU:
    MSI 4090 Suprim X
    Been experimenting with H.265 on YouTube for a bit; really hoping the next-gen cards will make real-time encoding possible. H.265 YouTube streaming would be incredible.

    I need.

    Been on dual Asus 4K PB279Q / MG24UQ for two years, this baby could fit right in the middle.
     
    Last edited: Aug 29, 2020
  6. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    5,240
    Likes Received:
    1,604
    GPU:
    RTX 3060 12GB
    You know it baby!

    1,152 zones - that's contrast the likes of which we haven't even seen before... but there is a greater panel by Innolux that has... more than 2 million zones, and they call it 'megazone' (I take issue with naming an LCD panel after an early-1990s laser tag - but hey, it's good)

    and you can find this puppy here: https://www.tftcentral.co.uk/blog/innolux-latest-panel-development-plans-dec-2019/
     
    Venix and XenthorX like this.
  7. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,035
    Likes Received:
    7,378
    GPU:
    GTX 1080ti
    Uh? what?

    That has nothing to do with why HEVC and VP9 aren't being used for streaming. It's the fact that RTMP has no capability of carrying HEVC or VP9 in the first place.
     
  8. Supertribble

    Supertribble Master Guru

    Messages:
    978
    Likes Received:
    174
    GPU:
    Noctua 3070/3080 FE
    This can get a bit frustrating, I agree, but perhaps many people choose high-refresh 1440p and don't realise how demanding of VRAM 4K can be. Also, things are about to change with the new console generation, as you said.
     
  9. alanm

    alanm Ancient Guru

    Messages:
    12,270
    Likes Received:
    4,472
    GPU:
    RTX 4080
    Been on 4K for 3 years. Never been let down by VRAM yet. Maybe by GPU power, but not VRAM.
     
  10. Undying

    Undying Ancient Guru

    Messages:
    25,477
    Likes Received:
    12,883
    GPU:
    XFX RX6800XT 16GB
    With RT, VRAM usage is going to jump exponentially this gen.
     

  11. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Some people tend to disagree. And it is doable on YT too. But neither I nor you have good enough HW to do it with good IQ and save bandwidth at the same time.

    But maybe you want to visit the VP9 wiki page and rewrite the following:
     
  12. alanm

    alanm Ancient Guru

    Messages:
    12,270
    Likes Received:
    4,472
    GPU:
    RTX 4080
    It will be OK with Nvidia's 10GB 3080. They know what they are doing. Their problem is choosing between practical VRAM and marketable VRAM. 10GB sounds too little, and Big Navi has more of a marketing edge with 16GB. So NV's choice is either 10GB or 20GB, no middle ground due to the 320-bit bus. Same problem with the 3090: they can't allow 12GB for their top card when AMD's is 16GB, so their ONLY other choice is 24GB (due to the 384-bit bus). Now 24GB is a hell of a lot, and I'll bet it's influencing a lot of ppl into thinking that's what next-gen games need, so 10-12GB seems too little in their eyes.
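
    To make the bus-width arithmetic explicit, here is a small sketch (my own illustration, assuming one GDDR6X chip per 32-bit channel and the 8 Gb / 16 Gb per-chip densities on offer) of why a 320-bit card lands on 10 or 20 GB and a 384-bit card on 12 or 24 GB:

    Code:
    # Possible VRAM capacities for a given memory bus width, assuming one
    # GDDR6X chip per 32-bit channel and chip densities of 1 GB (8 Gb) or
    # 2 GB (16 Gb). Mixed-density and clamshell layouts are ignored here.
    CHIP_SIZES_GB = (1, 2)

    for bus_bits in (320, 384):
        channels = bus_bits // 32   # 32-bit channels = number of chips
        options = [channels * size for size in CHIP_SIZES_GB]
        print(f"{bus_bits}-bit bus: {channels} chips -> {options} GB")

    Hence the jump straight from 10GB to 20GB (and 12GB to 24GB) with nothing in between.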
     
  13. I was right about the converter cable and @Astyanax you were wrong? :eek: This cannot be happening...

    PCI-E 4.0 Hmmm no thanks to Intel :D
     
    Last edited by a moderator: Aug 29, 2020
    Venix, Maddness and XenthorX like this.
  14. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    As punishment, find yourself a quiet corner of the universe and wait till he calls you back.
     
  15. :D:D yeah I'm being a snot
    Yeah, I had a similar reaction, although to be fair (and someone already pointed this out elsewhere) many people confuse the VRAM that is actually being used with the VRAM that is merely being cached, even with current-day video cards. @Undying I think it was you! Anyway, my point is to apply similar logic to the way you think about how your OS handles RAM. Here we have 8GB on average for most people, if not 6-8. 8 is plenty; 10 is likely more than plenty. I'll explain: 8GB stretched out the limits of 4K, including RT, with only the rare title that had trouble outside of caching. Prior to that, roughly 2GB more was needed, maybe slightly less, for the jump from 2K to 4K I think? (We saw 4GB, 6GB, 8GB.) I might be thinking of 8K, but I thought on average you saw an extra 2GB get eaten up. So 2GB more is a fair buffer. They cannot shoot themselves entirely in the foot here; the cards have to be profitable and proof of concept.

    My stance on AMD when it comes to the "Why is it 16GB as opposed to not?" etc. - This is speculation on my part entirely, but I predict AMD will release the gaming consumer Big Navi 16GB with access to their enterprise-level drivers and pro tools (their alternative to Quadro drivers), called "Radeon Pro Software for Enterprise". Just a guess, but my hunch is they want the card to be dually faceted to absorb more market share, both the gamer demographic and developers in one, while going after the bigger budgets with their Wxxxx series as per tradition (which have higher CU counts and/or VRAM, longer warranties, etc.)
     
    Last edited by a moderator: Aug 29, 2020

  16. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,451
    Likes Received:
    3,071
    GPU:
    7900xtx/7900xt
    these are different gpu chips, not "cut down" versions of the same chips.
    however, i think you have a point for the subsequent release(s).
    think of it as un-super, but just waiting for a really good AMD price point on Big Navi. they have to be holding on to their chips that don't meet spec (because they paid for them), so i think a reverse-ninjitsu-jedi-mind-trick may be in order.
     
  17. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,451
    Likes Received:
    3,071
    GPU:
    7900xtx/7900xt
    with node reduction it's a given that you get either dramatically higher performance or dramatically lower tdp, so it's obvious which way Nvidia went.
    personally, i do not want to go back to the days of the (tdp of the) R9 390, but Nvidia went whole hog.
    didn't they hear of global warming? or of SFF systems?
    well, everyone with an E-ATX case: Nvidia has a card that still may not fit in it.

    honestly, i'm disappointed despite the video crunching spec. Nvidia is seriously worried about Big Navi, or they would've gone the power efficiency route like they have ever since skewering AMD on power draw for the last ten years.
     
  18. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,464
    Likes Received:
    2,574
    GPU:
    ROG RTX 6090 Ultra
    Very unlikely.
    It's a 12-channel memory (384-bit) chip with 2 channels cut, like the 2080 Ti and 1080 Ti were 12-channel with 1 of them cut.

    Samsung's yields must be terrible at this point... making this "garbage quality" silicon (judging also by the HUGE power requirement), sold at $800 or hell knows how much.
    NV really screwed it up this time...

    ... but as always, they have an unlimited number of fanboys that will buy into it.

    Enjoy GPUs costing upwards of $2000+ in 2023 guys !
     
  19. kapu

    kapu Ancient Guru

    Messages:
    5,418
    Likes Received:
    802
    GPU:
    Radeon 7800XT
    Ok, I have a GTX 1060 with 6GB of VRAM, which is GOOD enough for 1080p. I guess the RTX 3070 is MEANT for 1080p and is a good replacement for my card? Or is my understanding wrong? :D
     
  20. HybOj

    HybOj Master Guru

    Messages:
    398
    Likes Received:
    327
    GPU:
    Gygabite RTX3080
    You are right, it's just that it will cost 2x as much as the 1060 cost you. And that is how many years later? Anyway, the latest leak is that nVidia will pack a tube of lube in each GFX card box :)
     
