NVIDIA talks about Pascal - Will be fast and has 3D Stacked Memory

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 19, 2015.

  1. D4rKy21

    D4rKy21 Banned

    Messages:
    724
    Likes Received:
    0
    GPU:
    Gigabyte GTX 980 G1 SLI
    3.5GB VRAM king?

    Sorry, I can't resist :D
     
  2. eclap

    eclap Banned

    Messages:
    31,495
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
    I didn't know you were a girl. Let me apologise for all the hard times I put you through. Girls are more sensitive and I was out of line. Please accept my sincere apology.

    I sleep like a king, btw.
     
  3. D4rKy21

    D4rKy21 Banned

    Messages:
    724
    Likes Received:
    0
    GPU:
    Gigabyte GTX 980 G1 SLI
    I feel your 3.5GB VRAM pain, dude, I'm so sorry :(
    I'd also be angry if I paid for a 4GB video card when in reality it's just a rip-off.
     
  4. eclap

    eclap Banned

    Messages:
    31,495
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
    No pain here darling, I game at 1440p, no vram problems at all. Fancy a drink?
     

  5. D4rKy21

    D4rKy21 Banned

    Messages:
    724
    Likes Received:
    0
    GPU:
    Gigabyte GTX 980 G1 SLI
    Sure, but don't poison me, dude.
     
  6. Undying

    Undying Ancient Guru

    Messages:
    15,823
    Likes Received:
    4,843
    GPU:
    Aorus RX580 XTR 8GB
    Like I said, he has some issues; VRAM issues stuck in his head. A Titan X can help him. :D
     
  7. shamus21

    shamus21 Member Guru

    Messages:
    128
    Likes Received:
    21
    GPU:
    0
    Sounds like a lot of sales BS to me.
     
  8. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
    The HD 7970 has been in use for over 3 years in both 3GB and 6GB versions.
    The average benefit of the 6GB version is around 2% better performance.
    The price difference between the 3GB and 6GB versions? I guess people can remember that $100+.

    Take the GTX 980, which is about 40-50% stronger than the HD 7970: 4GB vs 8GB. And the benefit?
    The same (close to nothing). That alone is proof that we do not need more VRAM.

    Do you know when you will need that huge VRAM? Voxel-based raytracing.
    You will have an entire voxelized level stored as billions of small blocks in VRAM, and the card will do all sorts of calculations over them, like advanced destructive physics and that raytracing.
    Do you think we have that kind of compute performance in a GPU affordable for the average gamer? Can such a card cost around $300?

    I am sure that a 12-15 billion transistor 16nm chip with 16GB of stacked memory will cost more than the current Titan X.
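    A quick back-of-envelope sketch of the VRAM math behind that voxel claim (the grid resolution and bytes-per-voxel figures below are illustrative assumptions, not real engine numbers):

```python
# Rough estimate of the VRAM needed to hold a dense voxelized level.
# Assumed numbers for illustration: a 2048^3 grid (~8.6 billion voxels)
# with a single byte of material/occupancy data per voxel.

def voxel_vram_gib(dim: int, bytes_per_voxel: int) -> float:
    """VRAM in GiB for a dense dim x dim x dim voxel grid."""
    return dim ** 3 * bytes_per_voxel / 2 ** 30

# Even at 1 byte per voxel, a 2048^3 grid needs 8 GiB of VRAM.
print(f"{voxel_vram_gib(2048, 1):.1f} GiB")  # prints "8.0 GiB"
```

    Real voxel renderers use sparse structures (e.g. sparse voxel octrees) precisely because the dense figure blows past any consumer card's memory.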
     
  9. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,229
    Likes Received:
    4,413
    GPU:
    2080Ti @h2o
    Hehe, now see how people react to that: DX12 might bring SFR to Nvidia cards, which could double the usable 4GB of VRAM in SLI. What do you say about this?!?! z0mg! :eyes:

    On a more serious note, I've run into situations where I filled 3GB of VRAM with 4K textures on modded Skyrim. That's a special, uncommon scenario though. 4GB should be sufficient.
     
  10. Denial

    Denial Ancient Guru

    Messages:
    13,361
    Likes Received:
    2,868
    GPU:
    EVGA RTX 3080
    What about this whole thing sounds like sales? It sounds more like Nvidia has other markets than gaming and is designing its chips to cater to everything. They are doubling performance/watt again, which further helps mobile chips. They are doing the on-die RAM and on-the-fly FP16/FP32 switching for deep learning networks and car autopilot systems.

    One of the reasons Nvidia is in the financial position it's in today is because they took their GPU technology and applied it to markets that no one else had.

    I doubt a 32GB consumer Pascal card will ever launch, but as 4K becomes more of a thing, and soon after 8K, 12GB/16GB cards suddenly don't seem far-fetched. Especially when texture resolution will need to be increased further to maintain sufficient detail. I don't really get the point of 12GB on the Titan X, but by next year, with Pascal, it will probably be pretty nice to have.
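    Some illustrative arithmetic on the texture-size point (assuming uncompressed RGBA8 textures; a full mip chain adds roughly one third on top of the base level):

```python
# How texture memory grows with resolution. RGBA8 = 4 bytes per texel
# is an assumption for illustration; real games use block compression,
# which cuts these figures by 4-8x but keeps the same scaling.

def texture_mib(size: int, bytes_per_texel: int = 4, mips: bool = True) -> float:
    """Memory in MiB for a square size x size texture."""
    base = size * size * bytes_per_texel
    total = base * 4 / 3 if mips else base  # full mip chain ~= +1/3
    return total / 2 ** 20

for size in (2048, 4096, 8192):
    print(f"{size}x{size}: ~{texture_mib(size):.0f} MiB")
```

    Each doubling of texture resolution quadruples the memory, which is why assets authored for 4K and 8K displays push VRAM budgets so hard.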
     

  11. Fender178

    Fender178 Ancient Guru

    Messages:
    4,184
    Likes Received:
    207
    GPU:
    GTX 1070 | GTX 1060
    Up to 32GB of VRAM. Makes me wonder if Nvidia will make the cards powerful enough to use that much VRAM, heck, even 16GB for that matter. To me these will be Nvidia's 4K cards, because of the new architecture and memory technology.

    I read the other posts and I'm like, is this going to be another VRAM flame war topic again?
     
  12. D4rKy21

    D4rKy21 Banned

    Messages:
    724
    Likes Received:
    0
    GPU:
    Gigabyte GTX 980 G1 SLI
    Try Dying Light or Lords of the Fallen for example, then you'll know what I mean.
    Those games can reach my VRAM limit, and that's at full HD.
     
  13. Denial

    Denial Ancient Guru

    Messages:
    13,361
    Likes Received:
    2,868
    GPU:
    EVGA RTX 3080
    In games? Probably not -- but for compute applications, with FP16 & int8, 32GB will definitely be nice there.
     
  14. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,229
    Likes Received:
    4,413
    GPU:
    2080Ti @h2o
    Exactly my thinking: once textures get bigger (like modded Skyrim showed me, see above), you can make use of bigger VRAM reserves. On the Titan X, though, it's merely for show, I'm fairly sure :D




    Sorry, couldn't resist :D
     
  15. lanelor

    lanelor Member

    Messages:
    36
    Likes Received:
    0
    GPU:
    GTX1070 Stryx
    I have dual R9 280s, and there are just no games that:
    1) can run properly on them
    2) I care about

    So double-digit VRAM and so on is fine, but ultimately pointless if you end up playing Mark of the Ninja and Football Manager :(
     

  16. D4rKy21

    D4rKy21 Banned

    Messages:
    724
    Likes Received:
    0
    GPU:
    Gigabyte GTX 980 G1 SLI
  17. Hughesy

    Hughesy Master Guru

    Messages:
    357
    Likes Received:
    1
    GPU:
    MSI Twin Frozr 980
    Same, I've never run into any VRAM problems. Like I said, some games cache, which fills up VRAM, but it doesn't really do anything. Someone show me a game coming out this year that actually "uses" and needs more than 4GB of VRAM. I haven't seen any. Most games are only just about using 3GB from what I've played at 1440p. Loads still use less than that; the only way games will use more is if you use silly amounts of AA, which is pointless when playing at 1440p.

    Edit: What I don't get about the person who posted above me is why he has SLI 980s if he thought 4GB wasn't enough. It doesn't make sense to me; if I thought the same, I would've saved my money rather than waste it on something I thought wasn't good enough. :3eyes:
     
    Last edited: Mar 19, 2015
  18. D4rKy21

    D4rKy21 Banned

    Messages:
    724
    Likes Received:
    0
    GPU:
    Gigabyte GTX 980 G1 SLI
    Scroll up a bit.
    And again, maybe it's pointless for you, but for other folks it's not; even at 1080p, 4x or 8x MSAA will eat up my VRAM like it's nothing, lol.
    When I buy a new GPU I'll go 1440p or higher, but not with the cards I have now; that would be shooting myself in the foot.
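    For what it's worth, the MSAA cost is easy to sketch: multisampled color and depth targets store one value per sample, so framebuffer memory scales with the sample count (RGBA8 color plus a 32-bit depth/stencil buffer are assumed below; drivers add further overhead on top):

```python
# Rough MSAA framebuffer cost: one color sample (RGBA8, 4 bytes) plus
# one depth/stencil sample (D24S8, 4 bytes) per pixel per sample.
# These formats are assumptions for illustration.

def msaa_framebuffer_mib(width: int, height: int, samples: int) -> float:
    """Approximate color + depth render target size in MiB."""
    bytes_per_sample = 4 + 4  # RGBA8 color + D24S8 depth/stencil
    return width * height * samples * bytes_per_sample / 2 ** 20

for s in (1, 4, 8):
    print(f"1080p at {s}x MSAA: ~{msaa_framebuffer_mib(1920, 1080, s):.0f} MiB")
```

    At 8x that is roughly 127 MiB for a single render target pair, and engines with multiple render targets multiply that further, which is how MSAA can eat VRAM even at 1080p.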
     
    Last edited: Mar 19, 2015
  19. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
    Do you mean that Dying Light VRAM utilization issue, before it got patched?

    And Lords of the Fallen, which even 3 months after release has bugs like missing sound, and missions you have to repeat several times because a crucial enemy dies before you get the intel?
    That's a game which, within 2 months of release, has already been discounted 50% on Steam twice.

    Giving more resources to idiots who waste them will only encourage them to release even worse products.
     
  20. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,115
    Likes Received:
    1,690
    GPU:
    GTX 1080 Ti
    Alright, D4rKy21 is gone now, so no more posts directed at him or about what he brought up. Back on topic, people.
     

Share This Page