Rumor: GeForce GTX 1080 8GB GDDR5X based on Pascal GPU in May

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 11, 2016.

  1. Netherwind

    Netherwind Ancient Guru

    Messages:
    8,813
    Likes Received:
    2,396
    GPU:
    GB 4090 Gaming OC
    Oh yea! WTS :D
     
  2. Denial

    Denial Ancient Guru

    Messages:
    14,201
    Likes Received:
    4,105
    GPU:
    EVGA RTX 3080
    Idk, I think it's easy to say "Nvidia is gimping cards" when there is no clear alternative. I think it's far more complicated than that, though.

    I wrote this in another thread but I guess I'll repeat it here. Let's go back to the 680 vs 7970. Both were released the same year, with the 680 generally outperforming the 7970 in most titles. In most games now, though, the 680 just falls so far behind the 7970 that it's not even in the same ballpark. Why? I'm not 100% sure, but I think the answer is more complicated than "Nvidia downgrading Kepler lel". The 680 was a 2.5Tflop card compared to the 7970's 3.7Tflops. The 680 also had 2GB vs 3GB on the 7970. It's pretty clear that the 7970's hardware was just completely underutilized when it launched. You'll find that this comparison applies across most of the GCN/Kepler and even GCN/Maxwell lineup. Look at the Fury X compared to the 980Ti in terms of tflop output, memory bandwidth, etc. The Fury X on paper should completely annihilate the Ti. I wouldn't be surprised if in 2-3 years from now, when 4K is the norm, the Fury X just wins in everything and the same argument is repeated then.
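    For reference, here's a minimal sketch of where such Tflop figures come from (my own illustration, not part of the post; the exact numbers depend on which clock you assume, and reference-clock math puts the 680 a bit higher than the rough figure quoted above, but the 7970 still has the clearly larger raw-compute budget, which is the point of the comparison):

    Code:
        # Theoretical FP32 throughput is usually quoted as shaders * 2 FLOPs
        # (one fused multiply-add per cycle) * clock. Reference clocks assumed.
        def peak_tflops(shaders, clock_ghz):
            return shaders * 2 * clock_ghz / 1000.0  # GFLOPs -> TFLOPs

        print(peak_tflops(2048, 0.925))  # HD 7970 (925 MHz): ~3.79 TFLOPs
        print(peak_tflops(1536, 1.006))  # GTX 680 (1006 MHz base): ~3.09 TFLOPs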

    In fact, I think all of GCN has been completely underutilized for a while now. You have to remember that GCN was a radical departure from the VLIW architecture before it, whereas Kepler was just a refined Fermi. So from a developer perspective it kind of makes sense that, out of the gate, Kepler had better performance overall.

    Then you have the console argument. Most AAA titles launching now are the first games whose entire development process was focused on the current-generation consoles, which, as you probably know, feature GCN-based graphics. So obviously developers are going to target features that are beneficial to that architecture type. For example, AMD cards lack the geometry hardware to do tessellation quickly. So as a developer trying to push the best graphics on console hardware, you're going to tend to stay away from using tessellation, shadow volumes, etc. as much as possible, and instead focus on, say, vertex-shader-based effects where GCN shines. The other thing that's becoming increasingly popular is compute-based stuff -- not just async mixed graphics/compute but compute in general. Outside of Civ it's hard to find many compute-based graphics benchmarks that aren't in OpenCL, but even with all the drivers up to date, a 780Ti performs worse than a 280X in Luxmark, whereas all the Maxwell cards perform far better than Kepler. As I already posted here, a 950 is almost 3x as fast as a 770. So either Nvidia never bothered getting Kepler's OpenCL compute going, or Maxwell is just much faster in compute than Kepler is. And with more games utilizing compute in general, it could help explain why Kepler isn't keeping up with GCN/Maxwell.
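    To make the "compute in general" point concrete, here's a minimal OpenCL micro-benchmark sketch (purely illustrative, not something from the thread; it assumes pyopencl is installed and an OpenCL-capable GPU is present) of the kind of raw compute workload that Luxmark-style comparisons boil down to:

    Code:
        import time
        import numpy as np
        import pyopencl as cl

        ctx = cl.create_some_context()   # pick an OpenCL device (GPU if available)
        queue = cl.CommandQueue(ctx)

        n = 1 << 22
        a = np.random.rand(n).astype(np.float32)
        b = np.random.rand(n).astype(np.float32)

        mf = cl.mem_flags
        a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
        b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
        out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

        # Trivially parallel kernel: per-element multiply-add, just to exercise the ALUs.
        prg = cl.Program(ctx, """
        __kernel void madd(__global const float *a, __global const float *b, __global float *out) {
            int i = get_global_id(0);
            out[i] = a[i] * b[i] + a[i];
        }
        """).build()

        prg.madd(queue, (n,), None, a_buf, b_buf, out_buf)  # warm-up, triggers lazy transfers
        queue.finish()

        start = time.perf_counter()
        prg.madd(queue, (n,), None, a_buf, b_buf, out_buf)
        queue.finish()                                      # wait for the kernel to complete
        print(f"kernel time: {(time.perf_counter() - start) * 1000:.3f} ms")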

    And yeah, I do think that part of it is also probably Nvidia just not focusing on Kepler as much overall. Are they deliberately doing it? I doubt it. They have a certain amount of resources that they are going to distribute throughout their driver team, and Kepler is obviously going to get less. I just don't think it's the sole reason, and I think it does a disservice to AMD honestly. They get a **** ton of flak, especially here on Guru3D, about the console stuff. They also got a ton of flak about Mantle effectively being discarded, before the Vulkan stuff. And yet it's those two things that I honestly think are the main reasons why we are seeing this shift in performance.
     
    Last edited: Mar 11, 2016
  3. Singleton99

    Singleton99 Maha Guru

    Messages:
    1,071
    Likes Received:
    125
    GPU:
    Gigabyte 3080 12gb
    That made a good read, Denial, and a lot of what you have explained makes perfect sense. From the AMD side, their cards certainly are standing the test of time compared to Nvidia, and it will be very interesting to see how the Fury cards perform a year from now, when Maxwell may very well be on its knees.

    I just hope that Maxwell will do better than Kepler has in terms of longevity, as I'm fed up with upgrading graphics cards every year; it's enough already. Had I gone for a couple of R9 290Xs back in the day, I could still be using them now with good fps.

    I will not be buying this new card from Nvidia, not a chance, and like I said earlier, if in six months to a year Maxwell is where Kepler is now, then I'm done with Nvidia.
     
  4. nhlkoho

    nhlkoho Guest

    Messages:
    7,755
    Likes Received:
    366
    GPU:
    RTX 2080ti FE
    I'm sure that both sides were working with Microsoft to get as much info about DX12 as possible. I also don't believe that AMD thought of Mantle on their own. There were rumors about the capabilities of DX12 before Mantle was released, and when it finally was released it was broken and didn't work, which leads me to believe it was rushed to try and beat DX.
     

  5. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,855
    Likes Received:
    442
    GPU:
    RTX 3080
    Well that makes sense! (not sarcasm!)
     
  6. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,702
    Likes Received:
    1,843
    GPU:
    EVGA 1070Ti Black
    Looks interesting, wonder what the asking price is gonna be. $400+ is my guess...
     
  7. Denial

    Denial Ancient Guru

    Messages:
    14,201
    Likes Received:
    4,105
    GPU:
    EVGA RTX 3080
    Low-level APIs have been around forever, and discussions about them coming to PC go back as far as DX10. Most of the concepts for Mantle were already essentially built into the modified version of DX11 used in the consoles. It was like DX11 with low-level extensions. Then Johan Andersson from DICE basically told AMD he wanted that functionality on PC. I'm sure that initial DX11 bring-up for Xbox heavily influenced Microsoft's decisions with DX12, as I'm sure they were heavily involved.

    I do think both worked closely with Microsoft, as evidenced by Maxwell's DX12_1 features. Whether or not Pascal was too late into the design phase to fix the async stuff is a good question, but if it was, I'm sure Nvidia will just figure out a way around it.
     
  8. Corrupt^

    Corrupt^ Ancient Guru

    Messages:
    7,270
    Likes Received:
    600
    GPU:
    Geforce RTX 3090 FE
    Same here. I have literally seen my GTX 770's framerate dip bit by bit over the past 1.5 years in BF4.
     
  9. Davud

    Davud Member

    Messages:
    41
    Likes Received:
    16
    GPU:
    MSI 5700XT Gaming X
    I'm guessing a 25-30% improvement over the same category today...
     
  10. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,855
    Likes Received:
    442
    GPU:
    RTX 3080
    That's weird; each time I update my graphics driver I run game benchmarks and write the results into a table (geeky, I know, but I want to be sure I'm not downgrading!), and there have been no changes apart from rare slight increases. I test F1 2012, Batman: Arkham Knight, Batman: Arkham Origins, BioShock Infinite, Metro: Last Light, Tomb Raider, and Shadow of Mordor. (Batman: Arkham Origins saw a significant decrease in framerate upon moving from Windows 7 to Windows 10, but that was the only one.) (I've been tracking this since about June 2013.)
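    If anyone wants to keep a similar table, here's a tiny sketch of what I mean (purely illustrative; the file name, driver version, and fps values are made up) that appends one result per row to a CSV so averages can be compared across driver versions later:

    Code:
        import csv
        from datetime import date

        def log_result(driver_version, game, avg_fps, path="driver_benchmarks.csv"):
            # Append one row: date, driver version, game, average fps.
            with open(path, "a", newline="") as f:
                csv.writer(f).writerow([date.today().isoformat(), driver_version, game, avg_fps])

        log_result("364.51", "Bioshock Infinite", 142.3)  # example values only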
     
    Last edited: Mar 11, 2016

  11. nhlkoho

    nhlkoho Guest

    Messages:
    7,755
    Likes Received:
    366
    GPU:
    RTX 2080ti FE
    It's not that you are losing performance that people are whining about, it's that performance doesn't improve over time like AMD's cards do.
     
  12. ScoobyDooby

    ScoobyDooby Guest

    Messages:
    7,112
    Likes Received:
    88
    GPU:
    1080Ti & Acer X34
    Wake me up when the 1080TI has some specs and a release date out in the wild.

    Not interested in any other card really.
     
  13. rgothmog

    rgothmog New Member

    Messages:
    6
    Likes Received:
    1
    GPU:
    Zotac 980 TI Amp Extreme
    Oh Great :(

    Oh great. Now I've just bought a Zotac 980 Ti and the next-gen cards are coming out :bang::bang:
     
  14. LimitbreakOr

    LimitbreakOr Master Guru

    Messages:
    620
    Likes Received:
    158
    GPU:
    RTX 4090
    They're going to call it the GTX 1440 lol, and the Ti will be called the GTX 2160.
     
  15. Kaotik

    Kaotik Guest

    Messages:
    163
    Likes Received:
    4
    GPU:
    Radeon RX 6800 XT
    AMD did develop Mantle by themselves (with some influence from DICE etc.). Mantle was also under development for several years, not just the time it was public.

    MS was one of the companies that had access to everything Mantle-related since day one, and considering the timeframe, the similarities, and the fact that there's GCN in the XB1, there's no doubt that Mantle kicked off DX12 development and that both GCN and Mantle affected how DX12 turned out a LOT.

    Also, there were never rumours of DX12 being low level before the Mantle details came public.
     

  16. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    IMHO AMD GPU drivers were simply closing the DX11 performance gap they had (and still have) in relation to Nvidia drivers.

    DX11 API draw-call performance increased by around 50% in AMD drivers during the first 6 months of 2015, but it's still around 50% lower than with Nvidia drivers.
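    The rough arithmetic behind that claim, as an illustrative sketch (normalized numbers, not measurements):

    Code:
        # If AMD's DX11 draw-call throughput rose ~50% but still sits at about half of
        # Nvidia's, then Nvidia's throughput is roughly 3x AMD's pre-improvement figure.
        amd_before = 1.0                 # hypothetical normalized baseline
        amd_after = amd_before * 1.5     # "increased by around 50%"
        nvidia = amd_after * 2.0         # "still around 50% lower" => Nvidia ~2x AMD now
        print(nvidia / amd_before)       # -> 3.0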
     
  17. Netherwind

    Netherwind Ancient Guru

    Messages:
    8,813
    Likes Received:
    2,396
    GPU:
    GB 4090 Gaming OC
    Indeed. The 980Ti is the best card I've ever had so the successor must be just as good.
     
  18. Fender178

    Fender178 Ancient Guru

    Messages:
    4,194
    Likes Received:
    213
    GPU:
    GTX 1070 | GTX 1060
    If a 1070 or a 1080 can give me a 20-40% increase in performance over my R9 290 card, I'll be happy.
     
  19. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    I believe that May is way too early for a chip that they didn't even have a fake die to show off at its "unveiling". Also, the complexity and signaling requirements of GDDR5X will probably put its cost on par with HBM, but without those annoying AMD interposer patents to deal with. That, of course, will probably kill smaller form-factor cards, not that they matter that much.
    In short, I expect "full" Pascal (not the initial 680/780/980 trap, nor the idiotic Titan) at around November.
     
  20. jinxjx

    jinxjx Member Guru

    Messages:
    110
    Likes Received:
    0
    GPU:
    AMD-MSI 260x 2gb
    Does anyone have a good guess what this new card will be equivalent to, as in a 970 or an AMD 390? ...Thanks
     
