Upcoming GeForce GTX cards use GDDR5X, not HBM2

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 16, 2017.

  1. rm082e

    rm082e Master Guru

    Messages:
    680
    Likes Received:
    226
    GPU:
    2070 Super
    TAA is what they use at the top end of Doom? That's probably the best AA I've ever seen in a game. It got rid of all the jaggies and shimmer, with almost no performance impact. I was very impressed with it.
     
  2. Ricepudding

    Ricepudding Master Guru

    Messages:
    771
    Likes Received:
    219
    GPU:
    RTX 3090FE
    Really, when we start to push 4K/5K/8K in games in the future, we won't need AA any more. It made a lot more sense in games at 1080p and below, when edges were still rough and blocky. These days, even running games at 1440p, I find that AA isn't really needed, and with the future heading towards higher resolutions such as 4K+, I think AA might just stop being a thing, or at least something that isn't needed as much.
     
  3. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,685
    Likes Received:
    611
    GPU:
    Inno3D RTX 3090
    This is most likely what's going to happen, and I can also see a Ryzen-level upset happening on the desktop with Navi, but that will have to wait at least until 2019. If we talk about the now, the memory controller will probably cause much less microstutter in open-world games, even when not pressed for VRAM. We'll have to wait and see though.
     
  4. ivymike10mt

    ivymike10mt Master Guru

    Messages:
    226
    Likes Received:
    12
    GPU:
    GTX 1080Ti SLI
    Screen size matters a lot. Even at 4K+, AA is good to have.
    There is 4K at 27", and there is 4K at 65". I can agree that 27" won't need any type of AA.
    But the same rule won't hold for, let's say, 50", because there are benefits to using 2x/4x MSAA there.
    8K is extreme sport right now :)
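    For what it's worth, a quick pixel-density calculation shows why panel size matters so much at the same resolution. This is just a back-of-the-envelope sketch using the sizes mentioned above:

    ```python
    import math

    def ppi(h_res: int, v_res: int, diagonal_in: float) -> float:
        """Pixels per inch of a panel, from its resolution and diagonal size."""
        return math.hypot(h_res, v_res) / diagonal_in

    # Same 3840x2160 resolution, very different pixel density:
    print(round(ppi(3840, 2160, 27)))  # ~163 PPI on a 27" monitor
    print(round(ppi(3840, 2160, 50)))  # ~88 PPI on a 50" TV
    print(round(ppi(3840, 2160, 65)))  # ~68 PPI on a 65" TV
    ```

    At 27" the pixels are small enough that stair-stepping mostly disappears; at 50"+ each pixel is roughly twice as large, so edge aliasing stays visible and 2x/4x MSAA still pays off.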
     

  5. Ricepudding

    Ricepudding Master Guru

    Messages:
    771
    Likes Received:
    219
    GPU:
    RTX 3090FE
    Yes, I agree, bigger screens might still make use of it, but I don't think many PC gamers run above 32" on their gaming screens... or if they do, they are often at sofa distance, making it much the same difference.
     
  6. kendoka15

    kendoka15 Member Guru

    Messages:
    131
    Likes Received:
    16
    GPU:
    EVGA RTX 3080 FTW3
    4k ssaa ftw :D
     
  7. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,090
    Likes Received:
    255
    GPU:
    RTX3090 GB GamingOC
    I was under the impression that 4K doesn't really need much AA. I'm sticking with 1080p/1440p for at least another 2 years anyway, so it doesn't matter for me.

    If I was Nvidia, I'd wait for GDDR6 for GeForce Volta cards, especially the xx80/xx70. I don't care if the rest have GDDR5/X, but the Volta enthusiast cards should use GDDR6. Why? Because I believe GDDR6 will be a good selling point.
     
  8. rm082e

    rm082e Master Guru

    Messages:
    680
    Likes Received:
    226
    GPU:
    2070 Super
    All anyone cares about is frame rates. If the reviews show a big jump in frame rate while temps remain manageable, that's all that matters. Only a handful of people care about the talking points, but they'll buy the new products anyway.

    They just need a similar step up in power over Pascal, like they delivered in the last two cycles. If they keep meeting expectations, they'll continue to own the market.
     
  9. Lane

    Lane Ancient Guru

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
    This will be a good selling point for the Ti / Titan versions (and the updated versions, as today with the xx80/xx70)...

    And indeed, especially if they can have GV104 parts out before AMD's new GPUs arrive (Vega 20, etc.)...
     
    Last edited: Jun 16, 2017
  10. Anarion

    Anarion Ancient Guru

    Messages:
    13,604
    Likes Received:
    378
    GPU:
    GeForce RTX 3060 Ti
    NVIDIA have had HBM2 cards for a while now. Currently it's very clear that it doesn't make much sense to use it on consumer cards. It's not worth it just yet.
     
    Last edited: Jun 16, 2017

  11. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,225
    Likes Received:
    180
    GPU:
    MSI RTX 2080
    MSAA by itself is useless with modern rendering. Unless it's used as part of a bigger technique, it should stay dead.
    At least FXAA doesn't introduce horrible artifacts into the image or smear it into oblivion as soon as there is movement, like many TAA solutions do. FXAA has a slight softness, while most TAA flat out makes the image smudgy without a ton of SSAA on top. With oversampling (SSAA/downsampling), the negative effects of FXAA are completely invisible.
    SMAA 1x does not really do a better job than FXAA at 1x resolution aside from a sharpness advantage; it is equally bad at subpixel information. And again, with SSAA on top there is almost no difference between the two.
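    For anyone curious what FXAA actually does, here is a rough NumPy sketch of the core idea: detect high-contrast luma edges and blend only there. It's purely illustrative (the function name, threshold, and box-blur blend step are my own simplifications), not the real FXAA 3.11 shader, which searches along the edge direction instead of blurring a whole neighbourhood:

    ```python
    import numpy as np

    def fxaa_like(image: np.ndarray, contrast_threshold: float = 0.05) -> np.ndarray:
        """Crude FXAA-style pass over an H x W x 3 float image in [0, 1]."""
        # Perceptual luma weighting (Rec. 601), similar to what FXAA uses.
        luma = image @ np.array([0.299, 0.587, 0.114])

        # Luma of the four axis neighbours via edge-padded shifts.
        pad = np.pad(luma, 1, mode="edge")
        n, s = pad[:-2, 1:-1], pad[2:, 1:-1]
        w, e = pad[1:-1, :-2], pad[1:-1, 2:]

        # Local contrast: only pixels on a high-contrast edge get blended,
        # which is why FXAA leaves flat areas alone and only softens edges.
        lo = np.minimum.reduce([luma, n, s, w, e])
        hi = np.maximum.reduce([luma, n, s, w, e])
        edge = (hi - lo) > contrast_threshold

        # Stand-in blend step: a plain 3x3 box blur. Real FXAA blends along
        # the detected edge direction, which keeps the image sharper.
        padded = np.pad(image, ((1, 1), (1, 1), (0, 0)), mode="edge")
        h_, w_ = luma.shape
        blurred = sum(padded[dy:dy + h_, dx:dx + w_]
                      for dy in range(3) for dx in range(3)) / 9.0

        return np.where(edge[..., None], blurred, image)
    ```

    The slight softness mentioned above comes from exactly that edge blend being applied to subpixel detail the filter can't reconstruct.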
     
    Last edited: Jun 16, 2017
  12. tsunami231

    tsunami231 Ancient Guru

    Messages:
    11,606
    Likes Received:
    816
    GPU:
    EVGA 1070Ti Black
    Meh, I thought we would see some dates for next-gen cards by now...
     
  13. GhostXL

    GhostXL Ancient Guru

    Messages:
    6,084
    Likes Received:
    54
    GPU:
    Merc 319 6800 XT
    No point in using anything but GDDR5X. May as well save money and costs for everybody, including Nvidia.

    If the memory is fast enough for the next series... may as well keep it. Just because the next card doesn't say GDDR6 on it does not mean it won't be worthwhile.
     
  14. Fierce Guppy

    Fierce Guppy Active Member

    Messages:
    92
    Likes Received:
    14
    GPU:
    2 x GTX 980 / 4GB
    I'm happy with whatever memory technology Nvidia runs with, provided it doesn't bottleneck the rest of the system.
     
  15. Lane

    Lane Ancient Guru

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
    You could ask why they use it on the high-end professional SKUs, then... Obviously, for Nvidia the choice came down to cost (their margin) and surely to securing availability...
     
    Last edited: Jun 18, 2017

  16. Backstabak

    Backstabak Master Guru

    Messages:
    716
    Likes Received:
    268
    GPU:
    Gigabyte Rx 5700xt
    Yeah, AMD put themselves in a corner with HBM. On one hand it's a great technology, but it still needs time for the cost to come down. On the other hand, if they, as its creator, ditched it altogether, I really doubt anyone would ever care for it. That's also why they don't collect royalties on it, even though they own most of the patents.

    Also, I don't think AMD has enough money to split their chip design across two different types of memory with different memory controllers. So it will probably be a few years until AMD really sees a benefit from HBM, at least with gaming cards.
     
  17. Exascale

    Exascale Banned

    Messages:
    390
    Likes Received:
    8
    GPU:
    Gigabyte G1 1070
    Memory has to get closer to the cores to get the pJ/bit down. AMD is ahead of the curve with HBM, even though using it isn't strictly necessary for some GPUs. Nvidia knows that, which is why their P100, V100 and Quadro GP100 all use it. It's also why every pre-exascale supercomputer architecture I can think of uses some form of advanced memory, be it HMC or HBM.

    Processor-in-memory (PIM) is also taking off.
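    To put the pJ/bit argument in numbers, a back-of-the-envelope sketch (the energy figures are my own illustrative assumptions, roughly in the range commonly cited for off-package GDDR5-class versus on-package HBM-class signalling, not measured values):

    ```python
    def memory_io_power_watts(bandwidth_gb_s: float, energy_pj_per_bit: float) -> float:
        """Power burned moving data at a given bandwidth and energy cost per bit."""
        bits_per_second = bandwidth_gb_s * 1e9 * 8          # GB/s -> bits/s
        return bits_per_second * energy_pj_per_bit * 1e-12  # pJ/bit -> watts

    # Assumed ~20 pJ/bit for GDDR5-class I/O vs ~7 pJ/bit for HBM-class I/O:
    print(memory_io_power_watts(500, 20))  # ~80 W at 500 GB/s
    print(memory_io_power_watts(500, 7))   # ~28 W at 500 GB/s
    ```

    That kind of difference is a big part of why HBM shows up first on power-constrained compute parts like P100/V100.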
     
  18. Backstabak

    Backstabak Master Guru

    Messages:
    716
    Likes Received:
    268
    GPU:
    Gigabyte Rx 5700xt
    I fully agree; my point is that AMD doesn't have the money to split their design between compute and gaming GPUs, and they don't want to ditch HBM because it would never take off otherwise.
     
  19. Emille

    Emille Master Guru

    Messages:
    785
    Likes Received:
    27
    GPU:
    1080 Ti Aorus Extreme
    Given that Nvidia have been smashing AMD in the performance department for about 15 years straight... I don't think this will be a problem.

    If the price-to-performance ratio is off, then I will wait longer to upgrade, but given that I am gaming at 4K with a 1080 Ti at 60 frames in almost all games, I don't think memory bandwidth will be a problem... although performance clearly responds better to memory overclock increases than to core clocks.
     
  20. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    9,802
    Likes Received:
    636
    GPU:
    Asus TUF 3080 OC
    Agreed. How often do we actually run out of VRAM? Even sitting on an ancient card with 3.5GB-4GB of usable VRAM, the amount is rarely ever the limiting factor; the brute force of the GPU core is usually the issue.
    That's definitely not what I remember. One of us has memory issues. :wanker:
     
