Nvidia Pascal Specs?

Discussion in 'Videocards - NVIDIA GeForce' started by Shadowdane, Mar 17, 2016.

  1. pimpernell

    pimpernell Master Guru

    Messages:
    506
    Likes Received:
    67
    GPU:
    Gigabyte RTX 4080
    Does anyone know of any confirmation that there will be a Ti version before Christmas?
    I'm starting to dislike SLI, and will go for a single card this time.
    But my current cards are good enough until the end of the year, if there's about a 90% chance of a Ti version.
    If not, I will just buy a standard 1080 (and regret it when the Ti version arrives).
     
    Last edited: Apr 6, 2016
  2. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    There is no confirmation on anything regarding the Geforce variants of Pascal. It's all conjecture at this point.
     
  3. TheF34RChannel

    TheF34RChannel Guest

    Messages:
    277
    Likes Received:
    3
    GPU:
    Asus GTX 1080 Strix
    Indeed it is. And if I were to guess, we won't see a Ti until 2017. That's why I got a Ti now, so I can enjoy my games now and hop on the next Ti whenever :)
     
    Last edited: Apr 6, 2016
  4. Paul5850

    Paul5850 Guest

    Messages:
    130
    Likes Received:
    0
    GPU:
    Gainward Phantom GTX 970
    So a 1070/1080 (x70/x80, or whatever they will call them...) might just come out in September-October, or simply in 2017 using GV104 chips, since Pascal might just skip GeForce?
    Either way, it's still disappointing: back when I bought Fermi in April 2010, I got a GF100 for €350, and then in Oct '14 I paid €370 for a GM204, because they made the formerly cheaper x00 chips cost as much as my whole system did back in 2010. :banana: :infinity:
     
    Last edited: Apr 6, 2016

  5. Ieldra

    Ieldra Banned

    Messages:
    3,490
    Likes Received:
    0
    GPU:
    GTX 980Ti G1 1500/8000
    What? The dies got way bigger.
     
    Last edited: Apr 6, 2016
  6. Fender178

    Fender178 Ancient Guru

    Messages:
    4,194
    Likes Received:
    213
    GPU:
    GTX 1070 | GTX 1060
    I agree that GDDR5X will be enough for gamers; look at what the 980 Ti has done against the Fury X in terms of performance, and that was GDDR5 memory vs HBM1. Also, whether the Pascal cards have GDDR5 or GDDR5X memory doesn't matter to me that much; however, I'd rather have the overclockability of the Maxwell cards in the Pascal cards.
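    Rough back-of-the-envelope on the bandwidth side, assuming the commonly quoted specs (the GDDR5X figure is the rumored 10 Gbps on a 256-bit bus for GP104, so treat that one as an assumption, not a confirmed spec):

    # Peak memory bandwidth (GB/s) = bus width in bytes * per-pin data rate (Gbps)
    def bandwidth_gbs(bus_width_bits, data_rate_gbps):
        return bus_width_bits / 8 * data_rate_gbps

    print(bandwidth_gbs(384, 7.0))    # 980 Ti, GDDR5:          ~336 GB/s
    print(bandwidth_gbs(4096, 1.0))   # Fury X, HBM1:            512 GB/s
    print(bandwidth_gbs(256, 10.0))   # rumored GP104, GDDR5X:   320 GB/s (assumption)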
     
  7. Ieldra

    Ieldra Banned

    Messages:
    3,490
    Likes Received:
    0
    GPU:
    GTX 980Ti G1 1500/8000
    Clocks are the really interesting question. I don't think you'll be seeing anything remotely like the overclocking headroom available on Maxwell, simply because it's an immature process; that Titan at 1480 MHz boost clocks is really pushing it, imo. I think 1700 is an acceptable conservative estimate, and about as high as GP104 will realistically go.
     
  8. Paul5850

    Paul5850 Guest

    Messages:
    130
    Likes Received:
    0
    GPU:
    Gainward Phantom GTX 970
    They were both fresh on the shelves the moment I acquired them, 4 1/2 years and 4 generations apart, so isn't it logical!?
     
  9. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    In an interview, AMD's Raja Koduri implied that there wasn't enough HBM2 to go around for mainstream/gaming cards, and that's why they would use HBM1. My guess is that NVIDIA will reserve HBM2 for Tesla or really high-end Titans. That also means they would need two memory controller designs, unless their memory controller can handle both HBM and GDDR5(X).
     
  10. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,696
    Likes Received:
    9,574
    GPU:
    4090@H2O
    If you mean 1700 as in MHz core OC with Pascal, that's actually better than most Maxwell cards can go without (hard)mods, no? Everything above 1600 I've seen was LN2 or hardmodded Maxwell2 cards iirc.
     

  11. Ieldra

    Ieldra Banned

    Messages:
    3,490
    Likes Received:
    0
    GPU:
    GTX 980Ti G1 1500/8000

    Yeah, but 1500 MHz on a 1200 MHz stock GPU is a 25% clock increase.

    1700 vs 1500 stock is only ~13%.

    Hence my saying we won't see Maxwell-level headroom.
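    Just the arithmetic behind those two figures (a trivial sketch using the 1200/1500 and 1500/1700 numbers above):

    # OC headroom as a percentage over the stock clock
    def headroom_pct(stock_mhz, oc_mhz):
        return (oc_mhz - stock_mhz) / stock_mhz * 100

    print(headroom_pct(1200, 1500))   # 25.0  -> typical Maxwell
    print(headroom_pct(1500, 1700))   # ~13.3 -> estimated GP104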
     
  12. TheF34RChannel

    TheF34RChannel Guest

    Messages:
    277
    Likes Received:
    3
    GPU:
    Asus GTX 1080 Strix
    That's what makes Volta all the more interesting: a matured process...
     
  13. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,696
    Likes Received:
    9,574
    GPU:
    4090@H2O
    Ah sorry, my bad, I somehow missed the 1500 stock clock.


    Yes, that's my personal upgrade target, if I can hold out until then :nerd:
     
  14. TheF34RChannel

    TheF34RChannel Guest

    Messages:
    277
    Likes Received:
    3
    GPU:
    Asus GTX 1080 Strix
    I know I can't hold out that long, especially since we might be looking at 2018. However, the Pascal Ti must be impressive, and hopefully without any of the DP stuff. I thought I upgraded every other generation, until I looked back and found I only skipped the 6 series lol. Anyway, I can't wait to find out more about GeForce Pascal. I don't really care anymore what type of VRAM it uses; I'm more interested in plain performance numbers.
     
  15. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    If Intel has shown us anything about 16/14nm, it's that there is a hard voltage/clock limit (depending on the design) that is quite close to previous generations. I actually don't believe that either NVIDIA's or AMD's products will clock much higher than before, if they keep their designs in the same vein.
    NVIDIA seems to be going with a "large Maxwell" for Pascal (that's how it looks for now; I'm waiting for an actual architecture digest), and AMD with another GCN iteration. My guess is that NVIDIA will have the clockspeed advantage again, unless AMD pulls an ace or something.
     

  16. Fender178

    Fender178 Ancient Guru

    Messages:
    4,194
    Likes Received:
    213
    GPU:
    GTX 1070 | GTX 1060
    If it is remotely close to what Maxwell cards have done I'll be happy.
     
  17. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    It will probably be higher, at higher thermals. Don't forget the 300W TDP for the 1480MHz boost clock.
     
  18. Ieldra

    Ieldra Banned

    Messages:
    3,490
    Likes Received:
    0
    GPU:
    GTX 980Ti G1 1500/8000
    Yeah but this is a DP card, DP = higher power consumption
     
  19. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    Is there any extra hardware there, or will the normal cards simply have it disabled via a driver switch? Are they actually going to have different hardware for the consumer cards? It would be interesting to see if GP104 exists.
     
  20. Ieldra

    Ieldra Banned

    Messages:
    3,490
    Likes Received:
    0
    GPU:
    GTX 980Ti G1 1500/8000
    No, I mean that DP computation consumes more power than SP, and TDP is a worst-case scenario.

    They're unlikely to use just a software limit; they'll probably laser-cut the DP units or remove them entirely.

    Basically, having DP units on the card won't increase power consumption by much; it's just that doing DP compute is more power intensive because you essentially have 2x the data.
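    To illustrate the "2x the data" part (a trivial sketch, nothing Pascal-specific, just the storage width of the two formats):

    import struct

    # A double-precision (FP64) value is twice the width of a single-precision (FP32) one,
    # so every DP operand moves twice as many bytes through the register/memory path.
    print(struct.calcsize('f'))   # 4 bytes per FP32 value
    print(struct.calcsize('d'))   # 8 bytes per FP64 value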
     
