Micron confirms GDDR6X for GeForce RTX 3090 with 12GB and over 1 TB/sec memory bandwidth

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 14, 2020.

  1. Hog54

    Hog54 Maha Guru

    Messages:
    1,102
    Likes Received:
    19
    GPU:
    Evga RTX2060 KO
    The only thing I care about is when the 3060s are coming out. :)
     
    DannyD and jbscotchman like this.
  2. alanm

    alanm Ancient Guru

    Messages:
    9,826
    Likes Received:
    2,003
    GPU:
    Asus 2080 Dual OC
    All sorts of crazy leaks coming out, some from the Baidu Chinese forums where some of the Twitter leakers hang out. Translation (with the requisite mountain of salt):

    On FP32 Performance:

    Rumors point to GA104 throughput being the equivalent of Navi 21; in other words, Big Navi performance will equal the RTX 3070 (Ti). The key changes relate to the three unit types (FP32, Tensor, RT), which are internal replacement upgrades within the ALU cluster. The area, transistor count, and emitter width are almost doubled. The engine type is different: the CUDA count is the same, but the FP32 throughput is through the roof. Accordingly, all cards got a memory bump, including GA104 to 16GB and the 3080 to 20GB. The 3090's FP32 throughput is more than 24 TFLOPS.
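    A quick back-of-the-envelope check on that "more than 24 TFLOPS" figure (a minimal sketch; the 2-ops-per-clock FMA convention is standard, the ~2.0 GHz clock is taken from the FE boost estimate further down, and the lane count is back-solved for illustration, not a leaked spec):

        # Theoretical FP32 throughput: TFLOPS = 2 ops (FMA) * lanes * clock (GHz) / 1000
        def fp32_tflops(fp32_lanes: int, clock_ghz: float) -> float:
            return 2 * fp32_lanes * clock_ghz / 1000.0

        # How many FP32 lanes would ">24 TFLOPS" imply at the rumored ~2.0 GHz boost?
        lanes_needed = 24.0 * 1000 / (2 * 2.0)
        print(lanes_needed)            # 6000.0 -> roughly 6k FP32 lanes
        print(fp32_tflops(6000, 2.0))  # 24.0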

    The RTX 3060 (GA106) FP32 performance is very competitive with the PS5, and it is more cost-effective.

    AMD have greatly underestimated Ampere performance. They will now focus on price and look to be competitive there.

    Regarding TSE Scores:

    The Time Spy Extreme score of 10,000 was just from the beginning of testing, at less than 1900 MHz.

    The FE card should deliver around 2000 MHz, and board partner cards around 2100 MHz.

    The drivers have had another revision since.

    The TSE score in reviewer testing should be more like 11,000 to 11,500 (at a ~2100 MHz OC).

    TSE RTX 3090 Founders Edition: just under 11k (10,800-11,000) | Partner cards should reach a score of 11,200-11,500.

    TSE RTX 3080 Founders Edition: just under 9k (8,800-9,000) | Partner cards should reach a score of 9,200-9,700.

    TSE RTX 3070 Ti (full-fat GA104) Founders Edition: 7,000-7,300; partner cards 7,500-7,800 | Note: the 3070 Ti apparently scores higher in TSE than the 2080 Ti (the 3070 Ti is penciled in for an October release).

    Partner cards will be reminiscent of the 9 series: they will show greater performance improvements over the stock NV ones.

    Other General Performance Notes:

    Top-of-the-line Ampere performance: FP32 doubled, Tensor doubled, RT doubled, and memory capacity doubled. DLSS 2.0 performance is 200%+.

    The difference between the 3070 Ti and a 3080 is 20-25% | The 3070 Ti will easily hit 60 FPS at 4K in games.

    The 3090 is about ~25% better than the 3080 in games, and about 1.7 times better than a stock 2080 Ti.
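    Those claims can be cross-checked against the TSE figures above (a quick consistency sketch; it assumes all the ratios refer to comparable workloads):

        # 3080 vs 3070 Ti, using the partner-card TSE ranges quoted earlier
        lo = 9200 / 7500
        hi = 9700 / 7800
        print(round(lo, 2), round(hi, 2))  # 1.23 1.24 -> close to the claimed 20-25%

        # If the 3090 is ~1.25x a 3080 and ~1.7x a 2080 Ti, the implied 3080 uplift is:
        print(round(1.70 / 1.25, 2))       # 1.36 -> ~36% over a stock 2080 Ti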

    Founders Edition cards
    The cooler is exceptionally well made, with great performance; however, the default boost frequency in the BIOS is not high. FE cards run at a lower frequency, but temperature control is very strong. Overclocking apparently isn't too bad, because of that lower stock frequency.

    Note: there are FE cards and partner cards. There is also a strong possibility of a super high-end version of Ampere from Nvidia.

    VRAM changes:

    Titan = 48GB

    3090 = 24GB

    3080 = 20GB

    All previous 12GB and 10GB were rumors or test SKUs that didn't make the cut.

    GA104 3070Ti will have 16GB

    GA106 will have 12GB


    There is currently no 3080 Ti, but that doesn't mean one couldn't come later.

    Power Consumption:

    The RTX 3080 power consumption will be just slightly over 300W, but not more than 340W.

    The RTX 3070 power consumption is coming in around 200W, if not a little lower, and should end up under 200W.

    Competition:

    Apparently Big Navi is 'dead before arrival'. AMD are flustered because they have grossly underestimated Ampere performance across the top three cards in the stack. Source rumors say Nvidia will win by a large margin in traditional performance this time, with DLSS and RTX then scaling proportionally. AMD were full of confidence before, but are now very anxious. They based their targets on Turing's generational improvement (the smallest in Nvidia's history) and grossly miscalculated how far Nvidia could evolve with Ampere.

    RX 6900 ROPs have been improved to 96, up from the 64 that was leaked a while ago. However, FP32 performance will be nowhere near GA102.

    N21: the PCB is in the final stages of completion. The core has been taped out, but drivers and PCB are not done yet; completion is expected by the end of September. Release and distribution are estimated for after mid-October. Performance estimates are 1.5 times that of the 5700 XT, with a TSE score of less than 8,000 (it will trade blows with GA104). Big Navi power consumption is estimated at 330-375W.
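    That "less than 8,000" TSE figure is roughly consistent with the 1.5x claim (a quick check; the ~4,200 stock 5700 XT TSE graphics score used as the baseline is my own ballpark, not part of the leak):

        # 1.5x a stock 5700 XT in Time Spy Extreme
        baseline_5700xt = 4200        # assumed stock TSE graphics score, not from the leak
        print(1.5 * baseline_5700xt)  # 6300.0 -> comfortably under the 8,000 ceiling above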

    On Pricing:

    It is tough to say, as no AIC has any news yet on Ampere pricing. The Ampere cards are expected to be expensive, due to the overwhelming performance they will bring.

    Pricing estimates are around 20% more than Turing at launch, especially on high-end Ampere. The 3070/Ti will be around 2080 Super to 2070 Super prices.
     
  3. Aura89

    Aura89 Ancient Guru

    Messages:
    8,107
    Likes Received:
    1,247
    GPU:
    -
    It really doesn't matter how much a game "can" use; what matters is how much performance impact, if any, a game takes from not being able to use "as much as it can". Benchmark after benchmark shows that 8GB, even at 4K, is definitely enough for most scenarios. Yes, there are situations where more could help, but they are far rarer than people seem to think...
     
  4. alanm

    alanm Ancient Guru

    Messages:
    9,826
    Likes Received:
    2,003
    GPU:
    Asus 2080 Dual OC
    The old co-processor rumor is resurfacing as well. Again, folks, these are just rumors; no one should be going around restating them as 'facts'.

     

  5. icedman

    icedman Maha Guru

    Messages:
    1,053
    Likes Received:
    115
    GPU:
    MSI Duke GTX 1080
    Some people here are crazy; 8-12GB of VRAM is plenty, even at 4K. The only time I ever see my 1080 max out is when it's caching, and if a game is legitimately using more, there's a good chance the GPU doesn't have the horsepower to keep up anyway. Don't get me wrong, more is always better, but I think people are exaggerating the amount needed here.

    Also, I'm surprised to see GDDR6X already, considering how long we had GDDR5 before an X variant came out. Wasn't it around 3.8 Gbps where GDDR5 started, ending at 8 or 9 Gbps, before we even saw GDDR5X?
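    For reference, the "over 1 TB/sec" in the headline falls straight out of the per-pin data rate (a minimal sketch; the 12-module 384-bit layout matches the article's 12GB figure, and 19-21 Gbps is the GDDR6X range from Micron's brief):

        # Peak bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps)
        def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
            return bus_bits / 8 * gbps_per_pin

        # 12 x 32-bit GDDR6X modules = 384-bit bus
        print(bandwidth_gbs(384, 21))  # 1008.0 GB/s -> the "over 1 TB/sec" headline
        print(bandwidth_gbs(384, 19))  # 912.0 GB/s at the low end of Micron's range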
     
    Aura89 likes this.
  6. Astyanax

    Astyanax Ancient Guru

    Messages:
    8,344
    Likes Received:
    2,787
    GPU:
    GTX 1080ti
    The co-processor (if you can even call it that) exists within the SM.
     
  7. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,482
    Likes Received:
    2,024
    GPU:
    HIS R9 290
    Why does it matter? You're probably barely going to get 60 FPS at 4K, and it's not like 8K-ready textures will be around for a while.

    Huh.... You just got me to realize something:
    Perhaps the 3090 (and 3090 Ti, if that comes to exist) is Nvidia's way of sneaking in higher prices. The 3080 (Ti?) will probably cost roughly the same as the 2080 Ti, maybe a tiny bit less, but this is the first time we've seen a 90 in a long time, so perhaps they're going to put it in its own separate price bracket.

    Not that I'm hoping this will be the case, but there must be something pretty special about this GPU if they're going with a 90, so it will likely have a price to reflect that.
     
    itpro likes this.
  8. TalentX

    TalentX Active Member

    Messages:
    91
    Likes Received:
    41
    GPU:
    GTX 1080 / 8GB
    I bet CP2077 will be satisfied with even just 4GB of VRAM at 1440p and max settings.
    Unlike mainstream "AAA" developers and publishers, who waste VRAM in exchange for a faster development process (prioritizing quantity over quality), CDPR is known for being great at optimizing their games' performance.

    So it very much depends on the games you want to play. Some games with much worse content and lesser visual quality will require far more VRAM than games with much more content and greater visual quality.
    It's all a matter of the development process.
    Telling your customer to "buy more VRAM" or even "buy more RAM" is an excuse for not being able to fix the bugs, or for refusing to optimize the app/game.
     
    Last edited: Aug 15, 2020
    itpro likes this.
  9. Astyanax

    Astyanax Ancient Guru

    Messages:
    8,344
    Likes Received:
    2,787
    GPU:
    GTX 1080ti
    doubt it.
     
    Solfaur and itpro like this.
  10. Camaxide

    Camaxide Active Member

    Messages:
    65
    Likes Received:
    15
    GPU:
    MSI 1080 Ti Gaming X SLI
    Stingy? The post clearly says that Micron's roadmap shows potential for 16Gb modules in 2021, and later up to 24Gb. Nvidia can't add memory modules that don't yet exist to their boards.
    Though if you are in real need of 16GB or 24GB of VRAM, then 2021 is just around the corner ^_^
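    Capacity is just module count times module density (a small sketch; the 12-module board is the layout implied by the article's 12GB figure):

        # VRAM capacity (GB) = modules * density per module (Gbit) / 8
        def vram_gb(modules: int, density_gbit: int) -> float:
            return modules * density_gbit / 8

        print(vram_gb(12, 8))   # 12.0 GB with today's 8Gb GDDR6X modules
        print(vram_gb(12, 16))  # 24.0 GB once 16Gb modules arrive in 2021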
     

  11. alanm

    alanm Ancient Guru

    Messages:
    9,826
    Likes Received:
    2,003
    GPU:
    Asus 2080 Dual OC
  12. Aura89

    Aura89 Ancient Guru

    Messages:
    8,107
    Likes Received:
    1,247
    GPU:
    -
    Posts like this remind me of the Pentium 4 days, when people said AMD processors were worse just because they had lower frequency... "MOAR FREQUENCY = BETTER!"

    More VRAM does not equal better, just because. An RTX 2080 Ti with 8GB vs 16GB would perform just the same in 99% of scenarios. So why exactly do you need so much VRAM? For that 1%? Sorry, likely less than 1%?


    Correct me if I'm wrong, but this guy is talking about creating a game, not playing a game, correct?

    Even the comments state this. Creating a game is a different environment than playing one; there are reasons why professional video cards, for instance, have more VRAM and would likely be better suited for that purpose. It doesn't really have anything to do with playing a finished game.

    You can create a game with unrealistic requirements all you want: texture sizes to the moon, unlimited raytracing bounces, etc. That doesn't mean it has anything to do with realistic VRAM usage and needs.

    And sure, this may change, but when it comes to raytracing performance (with the GPUs and games we have now), the difference between an RTX 2080 and a Ti does not appear to be due to VRAM, but to actual raytracing hardware differences...
     
    Last edited: Aug 15, 2020
    theoneofgod likes this.
  13. Venix

    Venix Maha Guru

    Messages:
    1,398
    Likes Received:
    523
    GPU:
    Palit 1060 6gb
    @Aura89 I would agree about the creation of a game, and that today 16GB vs 8GB will show no difference in most cases... BUT we have a prime example in the 780 3GB: those cards aged badly, and the limited amount of VRAM played a big role in that. On the other hand, people who get 2080-class or 3080-class cards normally replace them before this becomes a big factor. Personally, I do not think VRAM will be a limitation for the 3080, but we will see, I guess... Now, if they release the 3060 with 6GB of VRAM instead of 8, I will not like that, for sure.
     
  14. jbscotchman

    jbscotchman Ancient Guru

    Messages:
    4,868
    Likes Received:
    3,626
    GPU:
    MSI 1660 Ti Ventus
    This. Give us a $280 base model that can compete with a 2080.
     
    Maddness likes this.
  15. alanm

    alanm Ancient Guru

    Messages:
    9,826
    Likes Received:
    2,003
    GPU:
    Asus 2080 Dual OC
    Valid point. I thought I saw RT benchmarks somewhere with very high VRAM usage that choked 8GB cards (2080), but I can't seem to find them. It may have been an RT game reviewed on HardOCP before the site went offline. Also, my take from forum tech chatter is that the more complex the RT scene (rays bounced, global illumination, etc.), the higher the VRAM usage. Of course, watered-down RT implementations or settings may not pose an issue, but with upcoming titles that take advantage of the new hardware, I think VRAM usage may pose a problem, at least on lower-capacity cards. Traditional rendering is much less of an issue, of course.
     
    Last edited: Aug 15, 2020

  16. Dribble

    Dribble Member Guru

    Messages:
    170
    Likes Received:
    68
    GPU:
    Geforce 1070
    8K gaming is just 4K gaming + DLSS. I bet the top-end Nvidia card will be able to do that for a lot of games.
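    The pixel math backs that up (a quick sketch; the 2x-per-axis factor is just resolution arithmetic, not a confirmed DLSS mode):

        # 8K is exactly 4x the pixels of 4K, i.e. a 2x upscale per axis
        uhd_4k = 3840 * 2160
        uhd_8k = 7680 * 4320
        print(uhd_8k / uhd_4k)             # 4.0
        print(uhd_4k / 1e6, uhd_8k / 1e6)  # 8.2944 MP vs 33.1776 MP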
     
  17. Undying

    Undying Ancient Guru

    Messages:
    14,159
    Likes Received:
    3,347
    GPU:
    Aorus RX580 XTR 8GB
    In DF's Wolfenstein RTX test, the 2060 6GB was running out of VRAM in no time, forcing a reduction in in-game quality. The 2060 Super, on the other hand, was perfectly fine with its 8GB.
    Raytracing does increase VRAM usage, and having to play next-gen games with 12GB at 1440p-4K is questionable.
     
    Solfaur likes this.
  18. pharma

    pharma Ancient Guru

    Messages:
    1,553
    Likes Received:
    405
    GPU:
    Asus Strix GTX 1080
    Ampere RT cores and Tensor cores are supposedly different from, and more efficient than, Turing's. We will need to wait for the Ampere RTX reviews to surface before we can definitively say whether or not Ampere's VRAM allocations are sufficient.
     
    Gandul and Noisiv like this.
  19. H3llF1re2005

    H3llF1re2005 Member

    Messages:
    44
    Likes Received:
    2
    GPU:
    Titan X Pascal SLi
    12GB is disappointing... it definitely won't cut it for upcoming games at 4K. I really hope this rumour is off, or that the Ti variant of the new card has more...
     
  20. itpro

    itpro Master Guru

    Messages:
    668
    Likes Received:
    372
    GPU:
    Radeon Technologies
    12GB is OK if it's fast, for a variant targeting up to 4K resolution. Those with SLI who need 8K will suffer with 12GB of VRAM; 16, 20, or 24GB are likely the smooth sweet spot. Anything less than 12GB in 2021 is a scam for the mid-to-enthusiast range. 8-10GB of VRAM should be the new low range, like the old 4-5GB cards.
     
