NVIDIA Pascal GP104 Die Photo

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Apr 11, 2016.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    37,066
    Likes Received:
    6,125
    GPU:
    AMD | NVIDIA
    Last week Nvidia announced the Tesla P100 data-center GPU, and if you looked closely at the photos you could already clearly see that big 15B-transistor Pascal GPU being used. A new photo this time...

    NVIDIA Pascal GP104 Die Photo
     
  2. snight01

    snight01 Master Guru

    Messages:
    309
    Likes Received:
    15
    GPU:
    RTX 2080 Ti FE
    Warming up the wallets; time for some new GPUs, peeps.
     
  3. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,940
    Likes Received:
    2,293
    GPU:
    5700XT+AW@240Hz
    Not for this small guy. Big things are needed... Vega?
     
  4. Undying

    Undying Ancient Guru

    Messages:
    12,518
    Likes Received:
    1,963
    GPU:
    Aorus RX580 XTR 8GB
    So no more GTX brand for Nvidia after all these years; now it's just X.
     

  5. DW75

    DW75 Maha Guru

    Messages:
    1,161
    Likes Received:
    566
    GPU:
    ROG GTX1080 Ti OC
    I said it a few days ago, and still believe this. I think Polaris is going to come out on top at all price points this round. I think people are going to be surprised in the next couple months with what AMD will have to offer.
     
  6. Backstabak

    Backstabak Master Guru

    Messages:
    519
    Likes Received:
    197
    GPU:
    Gigabyte Rx 5700xt
    Seems quite underwhelming, not even GDDR5X memory.
     
  7. BlueRay

    BlueRay Master Guru

    Messages:
    271
    Likes Received:
    64
    GPU:
    EVGA GTX 1070 FTW
    I'm not excited about this. Seems way underwhelming. Sure, it will be faster than the 9xx series, but not by that much. And since HBM and GDDR5X are not yet ready for the mainstream, we have to wait until the next generation to see a good performance jump. They need to push 4K gaming on a single card to the mainstream.
     
  8. AlmondMan

    AlmondMan Master Guru

    Messages:
    501
    Likes Received:
    42
    GPU:
    5700 XT Red Dragon
    Hm, going to be interesting to see what this turns out to be. Seems a bit underwhelming?
     
  9. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    6,344
    Likes Received:
    730
    GPU:
    MSI GTX 1070
    Can't come soon enough; my 680 has served me well for 4 years.
     
  10. xIcarus

    xIcarus Master Guru

    Messages:
    941
    Likes Received:
    90
    GPU:
    1080 Ti AORUS
    I'll be disappointed if we don't see GDDR5X, but at the same time I'm questioning whether it's actually needed. Even an overclocked-to-hell 980 Ti (core OC only) won't get bandwidth-starved any time soon.

    Let's be serious, the hype around HBM was absurd - I cannot describe it any other way. Claims like '4GB of HBM = 8GB of GDDR5' or 'HBM will make 4K possible on a single card'. What the fek, seriously. Sounds like the same people who say DDR4 made their PCs much faster (with no IGP).

    I haven't seen conclusive proof that HBM actually helps the Fury X to a tangible degree. If anyone has such proof, please share it with me.

    What makes you say that? Genuinely curious.

    I believe it all depends on how they implement DX12 features.
    At the moment AMD has an advantage in DX12 because everybody seems happy to jump on async shading, but we haven't touched conservative rasterization yet, something GCN does not support.

    If these guys (AMD/Nvidia) don't start supporting conservative raster and async shading respectively, we will see a huge ****fest similar to what's going on right now DX12-wise. Devs will have to side with one party and ditch the other.
    Occasionally we might get the amazing dev who actually takes the time to optimize properly for both parties (for example, coding volumetric lighting to take advantage of either conservative raster or async shading at will). Like I said: sh!tfest.
     
    Last edited: Apr 11, 2016

  11. BD2015

    BD2015 Member

    Messages:
    46
    Likes Received:
    5
    GPU:
    Geforce GTX 1080 Ti x2
    I'd say this is just filler to make some sales. Not what we're waiting for (HBM2, 15B transistors).
     
  12. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    11,283
    Likes Received:
    3,327
    GPU:
    2080Ti @h2o
    I tend to agree.
     
  13. Denial

    Denial Ancient Guru

    Messages:
    12,580
    Likes Received:
    1,801
    GPU:
    EVGA 1080Ti
    I mean, it basically lines up with what Polaris 10 is. We aren't getting consumer-level 600mm2 cards first, and we've known that for over a year. I'm not sure why people are surprised by the specs.
     
  14. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    6,797
    Likes Received:
    119
    GPU:
    5700 XT UV 1950~
    I would think both AMD and Nvidia will have full DX12 support with the upcoming GPUs. Even Intel does.
     
  15. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,664
    Likes Received:
    504
    GPU:
    2070 Super
    This is probably the 2nd most important GPU in the NV lineup, after GP100,
    because of the volumes sold at hefty prices in desktop and mobile, in both pro and GTX variants.

    But how they'll manage it with GDDR5 and a 256-bit bus, I have no idea.
    Faster memory modules and better compression are obvious starting points.

    Hopefully it will make the 980 Ti completely obsolete. No worries, it will :p
    Because anything else would be a fail.
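    The back-of-the-envelope arithmetic behind that worry is simple: peak bandwidth is the bus width in bytes times the per-pin data rate. A quick sketch for anyone who wants the numbers (the 8 Gb/s GDDR5 figure for the new card is an assumption; the 980 Ti and GDDR5X numbers are published specs):

```python
def mem_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin data rate in Gb/s."""
    return bus_width_bits / 8 * data_rate_gbps

# GTX 980 Ti: 384-bit bus, 7 Gb/s GDDR5
print(mem_bandwidth_gbs(384, 7))    # 336.0 GB/s

# Assumed new card: 256-bit bus, 8 Gb/s GDDR5
print(mem_bandwidth_gbs(256, 8))    # 256.0 GB/s

# Same 256-bit bus with 10 Gb/s GDDR5X
print(mem_bandwidth_gbs(256, 10))   # 320.0 GB/s
```

    So on plain GDDR5 the 256-bit card starts with notably less raw bandwidth than a 980 Ti, which is exactly why faster modules and better delta compression matter here.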
     

  16. evilos

    evilos Member

    Messages:
    20
    Likes Received:
    1
    GPU:
    Gtx 970 G1
    Of course the 1070 will be well worth it (again): 980 Ti performance for about $350-400, combined with 8GB of GDDR5 onboard.
     
  17. warezme

    warezme Member Guru

    Messages:
    192
    Likes Received:
    21
    GPU:
    Evga 970GTX Classified
    I don't see the problem, as the GTX brand has been diluted to the point of being meaningless. It used to be that only the high-end models were GTX, like the *70 and *80 series cards. Now you have GTX 950s out the *ss. Seriously? It's a 950.
     
  18. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,005
    Likes Received:
    139
    GPU:
    Sapphire 7970 Quadrobake
    This is such complete speculation. It all depends on the clocks too. This is more or less a shrunk 980Ti. If it hits 2GHz then it's fine.
     
  19. xIcarus

    xIcarus Master Guru

    Messages:
    941
    Likes Received:
    90
    GPU:
    1080 Ti AORUS
    Word. The 950 should not get the GTX badge.
    But at least they dropped the GT vs. Ultra suffixes; those were a bit confusing.

    It should be as it was with the 500 series: 550 and under was GT, 560 and above was GTX.
    At least they're not overinflating their numbers like AMD does: 7850, 7870, 7950, 7970, 7990 versus 650, 660, 670, 680, 690. Seriously, it's very confusing, and even now I have to look up benchmarks every time I want to compare the cards.
     
    Last edited: Apr 11, 2016
  20. rl66

    rl66 Ancient Guru

    Messages:
    2,292
    Likes Received:
    167
    GPU:
    quadro K6000+Tesla M2090
    Yes, obviously it pushes to the top, but sadly for us AMD made the terrible choice of HBM1, which is limited to 4GB...

    So performance is crippled by the lack of RAM (and in some cases even the previous-gen high end does better).

    Now, about the "GDDR5 only" drama: in the picture I just see a prototype or sample; if it works with GDDR5 then it works with GDDR5X too, it's pin-to-pin compatible (I can already smell old GPU stock with GDDR5X and a new name... lol).

    And GDDR5X is clearly nice RAM, and it costs less than HBM2 (which will surely be used for the high end in both red and green flavors).

    "Don't sell the bear's skin before you've caught it": no GPU from either company is ready... let them come out and be tested :)
     
