GCN replaced with Polaris

Discussion in 'Videocards - AMD Radeon' started by Barry J, Dec 31, 2015.

  1. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    hey, let's stick with Polaris and GPUs;
    before some newbie comes in with a wall-of-text NV/AMD analysis that begins with "My first gfx card was a Riva128 [...]"
     
  2. RexOmnipotentus

    RexOmnipotentus Guest

    Messages:
    796
    Likes Received:
    4
    GPU:
    Vega 64
    So you were eager to buy another flagship card from a company that (deliberately) downgraded your previous card?
     
  3. Undying

    Undying Ancient Guru

    Messages:
    25,472
    Likes Received:
    12,879
    GPU:
    XFX RX6800XT 16GB
    Take a guess what will happen if the same thing happens with his Maxwell card?

    Pascal baby! :D
     
  4. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,742
    Likes Received:
    9,635
    GPU:
    4090@H2O
    I wonder if they will go to Polaris 1.0, Polaris 1.1 and so on, as they did with GCN, and introduce only optimisations over the next years.

    On topic, I guess this architecture will heavily rely on parallel computing. I'm curious to see if they make it full-fledged DX12 with no or very weak (as in little improvement over GCN) DX11 performance, or if they will find a way to make it considerably faster in less heavily multi-threaded scenarios.
     

  5. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    No, you're right. But the whole point of a shrink is to stuff more transistors in there in the end. Intel had similar problems with their CPUs when they went from 32nm to 22nm; voltage leakage increases a lot. So in the end, a transistor shrink doesn't necessarily mean lower power consumption for the new product as a whole. You also have the issue of heat concentrating in a smaller area, which makes cooling much more challenging.
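
    To put rough numbers on that heat-concentration point (all figures below are made up for illustration, not real die specs):

```python
# Back-of-envelope sketch with assumed numbers: if a shrink roughly halves
# the die area but total power only drops ~30% (leakage eats part of the
# win), the power *density* the cooler has to handle still goes up.

def power_density(watts: float, area_mm2: float) -> float:
    """Heat flux in W/mm^2 across the die."""
    return watts / area_mm2

old = power_density(watts=250, area_mm2=600)   # hypothetical 28nm big die
new = power_density(watts=175, area_mm2=300)   # hypothetical 14/16nm part

print(f"old: {old:.2f} W/mm^2, new: {new:.2f} W/mm^2")
# old: 0.42, new: 0.58 -> ~40% more heat per mm^2 despite lower total power
```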

    AMD is comparing their new stuff with the most efficient stuff of the previous generation. That stuff being Maxwell. What should they compare it to?

    I'm such a big fanboi that I almost got banned from the forum for a couple of threads about AMD. What are AMD's "broken promises"? That is what I don't get. They don't deliver at the driver feature level (there is barely any parity, and they still don't offer enforceable vsync/triple buffering), but what have they really "promised" and then "broken"? As far as I can see, the Fury X, for example, is $100 cheaper than the 980 Ti and is as fast or faster with recent drivers (15.9.1; I couldn't find any Crimson comparisons, if you have any please share).

    It also has its scheduler on hardware, which means that it doesn't really need a new driver for every single game going out. The 980Ti is a great card, but seeing what has happened with Kepler, where it was obvious that NVIDIA were straining their resources to write a driver for every single architecture they have, you have to wonder what will happen if Pascal comes out and it too has (finally) a hardware scheduler.

    GCN is not going anywhere as long as the current generation of consoles is still on the market. The whole strategy from AMD is to have titles optimized for their cards "out of the box", since the consoles would have the same architecture as they do. GCN was a huge investment, and to be honest the performance is just fine. Thermals could be better, but it seems that the new process nodes are taking care of that.

    What I'm guessing we're getting is a tweaked GCN with better tessellation/ROP performance, better thermals due to the new process, and all the DX12 tickboxes for marketing reasons. All in all, the top part will probably have 8192 GCN cores; my guess is 256 ROPs / 1024 TMUs (I hope, at least).
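
    For scale, the standard peak-throughput formula (2 FMA ops per shader per clock; the clocks below are my assumptions, not announced specs) puts that guess at roughly double a Fury X:

```python
# Peak FP32 throughput = shader_count * 2 ops/clock (FMA) * clock.
# Clocks are my assumptions for illustration.

def peak_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000.0

print(f"guessed top part: {peak_tflops(8192, 1.0):.1f} TFLOPS")   # ~16.4
print(f"Fury X for scale: {peak_tflops(4096, 1.05):.1f} TFLOPS")  # ~8.6
```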
    All modern GPU architectures since Fermi/GCN are properly parallel and fully programmable. The differences are smaller than they seem on the marketing papers.
     
  6. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    13,146
    Likes Received:
    1,096
    GPU:
    MSI 2070S X-Trio
    The 2X improvement was against their own cards, not Nvidia's 950.
     
  7. Doom112

    Doom112 Guest

    Messages:
    204
    Likes Received:
    1
    GPU:
    MSI GTX 980 TF V
    First of all, PR, you need to get your facts straight or not post at all.

    AMD promised that the Fury X would be an overclocker's dream. Was it?
    AMD promised that the Fury X would be the world's fastest single-GPU card. Was it?

    As a matter of fact, nobody should buy a reference GTX 980 Ti from now on, and anyone who does and keeps it at stock is either not thinking straight or an AMD user. Maxwell is not designed to run at stock; it is meant for overclocking. Moreover, even a +230 MHz core overclock, which is achievable by anyone, will make a GTX 980 Ti 15% to 20% faster than a Fury X.
     
    Last edited: Jan 4, 2016
  8. Doom112

    Doom112 Guest

    Messages:
    204
    Likes Received:
    1
    GPU:
    MSI GTX 980 TF V
    "Jen-Hsun claims that Pascal
    will achieve over 2x the performance per watt of Maxwell in Single
    Precision General Matrix multiplication" These are architecture improvement not base on new node.
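
    For what it's worth, the GEMM half of that claim is the easy part to measure; the watts are not. A rough sketch (NumPy on CPU, with my own toy sizes) of what the "performance" in that quote means:

```python
# Measure single-precision general matrix multiply (SGEMM) throughput.
# Perf/W would also need a power meter, which is the part nobody can
# verify from a press slide.
import time
import numpy as np

n = 2048
a = np.random.rand(n, n).astype(np.float32)
b = np.random.rand(n, n).astype(np.float32)

t0 = time.perf_counter()
c = a @ b                       # the SGEMM itself
dt = time.perf_counter() - t0

flops = 2 * n**3                # n^3 multiply-adds = 2n^3 flops
print(f"{flops / dt / 1e9:.1f} GFLOPS")
# perf/W = (flops / dt) / measured_watts -- the watts are the missing piece.
```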

    Even if Nvidia just put Maxwell on the new node, it could compete with GCN 4.0.
     
  9. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Unless you refer to something else, in the slide you have quoted they specifically mention a GTX 950.

    The only thing I can agree with you on is the "overclocker's dream", which I heard from their own lips during the presentation. Even if it's not exactly a dream, that doesn't mean there is no overclocking. Most people seem to be getting at least 12% on their core frequencies. That's nothing to write home about, but nothing really bad either. Still, no overclocker's dream.

    For the same price (actually 7 euros more), you can get a KFA² GeForce GTX 980 Ti with a reference cooler, or a PowerColor Radeon R9 Fury X with a watercooler.
    If you want a custom-cooled/overclocked 980 Ti with a good triple-fan cooler from a good brand like ASUS/MSI etc., it seems you're looking at a minimum of 700 euros. So it's again more or less within the $100 difference. If you believe you get your money's worth from it, go for it. Keep in mind that the 980 Ti is not really that much cooler than the Fury X, since Maxwell is more or less at its configuration limit on these cards.
    To be honest, apart from the extra money, which is always nice, what really bothers me with a card like that is the software scheduler and the driver that will always be needed. Also, AMD is staying on GCN while NVIDIA seems to once more be moving on with Pascal. I try to buy my hardware and keep it for as long as possible. If I didn't do that, I would probably get an overclocked 980 Ti with a watercooler. If I wanted to get something and hold onto it longer, I would get the AMD card.

    Jen-Hsun says a lot of things more carefully these days. None of us is a developer, so none of us knows how important what he mentions really is. Or maybe the new card can do that but be memory bottlenecked, or who knows what else. At the end of the day, we give too much credence to marketing ****. Until Hilbert and the other websites can test them, the point is a bit moot. At least AMD gave specific numbers with specific hardware.
     
  10. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,512
    Likes Received:
    18,814
    GPU:
    AMD | NVIDIA

  11. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Nice!

    These look interesting.

    An overview of key 4th-gen GCN architecture changes relative to current products:
    • Primitive discard accelerator
    • Hardware schedulers
    • Instruction pre-fetcher
    • Improved shader efficiency
    • Memory compression


    I don't even know what the primitive discard accelerator does; I suspect it deals with triangles. It was a bit of a surprise to see the hardware schedulers mentioned there, since I feel those have always been the strong point of GCN. Maybe they made them more power efficient, or added more capabilities.
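
    If I had to guess at the idea, it's something like this: throw away triangles that can't produce any pixels before they cost rasterizer or shader time. A toy sketch of my assumption, not AMD's actual logic:

```python
# Discard degenerate (zero-area) screen-space triangles before rasterization.
# Real hardware presumably handles more cases (off-screen, between samples),
# but this is the flavor of test such a unit would do.

def signed_area_2x(p0, p1, p2):
    """Twice the signed area of a screen-space triangle (2D cross product)."""
    return (p1[0] - p0[0]) * (p2[1] - p0[1]) - (p2[0] - p0[0]) * (p1[1] - p0[1])

def should_discard(p0, p1, p2) -> bool:
    return signed_area_2x(p0, p1, p2) == 0.0   # degenerate: covers nothing

print(should_discard((0, 0), (4, 0), (0, 3)))  # False -- real triangle
print(should_discard((0, 0), (2, 2), (4, 4)))  # True  -- collinear, zero area
```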

    All the rest seem like iterative changes with a marketing gloss over them. The big parts will probably be giant GCN (6000+ core) processors with better tessellation and better shader IPC. I have no problem with that. I'm starting to get really curious about Pascal now.
     
  12. Barry J

    Barry J Ancient Guru

    Messages:
    2,803
    Likes Received:
    152
    GPU:
    RTX2080 TRIO Super
    Let's hope AMD does this right, and that includes the drivers. NVidia needs competition, and AMD has been a little slow lately.
     
  13. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    That's my main hope too. Their DX12 driver (and I suspect the Vulkan one) looks stellar, and the DX11 one has gotten a lot of improvement this year. They should keep improving, and it looks OK for them. It's still interesting that NVIDIA hasn't said anything about Pascal. If TSMC has problems with the 16nm process, expect a pricey new generation :S
     
    Last edited: Jan 5, 2016
  14. ThunderForce

    ThunderForce Guest

    Messages:
    29
    Likes Received:
    9
    GPU:
    GTX 980 / 4 MB
    Last edited: Jan 4, 2016
  15. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,128
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    "Completely new architecture". Please, terminate the people doing your PR AMD. Thankfully the explanations after were a bit more enlightening. Apparently they tried quite a lot to improve single threaded performance. That sounds like an "overhead" improvement to me somehow.
     

  16. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    Definitely misleading there.

    It would be nice if they talked more about gaming performance and not just power efficiency.
     
    Last edited: Jan 4, 2016
  17. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    Power efficiency is 90% of the whole deal:
    you can lose 10% and be OK, but lose 20% and you're gone from laptops.

    A 30W yet capable GPU? I'd like that even in my desktop.
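
    The laptop math as I see it (made-up numbers): at a fixed power envelope, performance is just efficiency times watts, so an efficiency deficit comes straight off your frame rate.

```python
# At a fixed power budget, fps = (fps per watt) * watts.
# Efficiency figures below are illustrative, not vendor data.

budget_w = 30.0                                   # thin-and-light GPU budget
fps_per_watt = {"vendor A": 2.0, "vendor B (-20% efficiency)": 1.6}

for name, eff in fps_per_watt.items():
    print(f"{name}: {eff * budget_w:.0f} fps at {budget_w:.0f} W")
# vendor A: 60 fps, vendor B: 48 fps -- a 20% efficiency gap is the whole
# difference between playable and not in the same chassis.
```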
     
  18. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    Naturally they must advance in power efficiency as well as performance, but they should showcase both, to cater to low-end HTPC/gaming and high-end GPU users.
     
    Last edited: Jan 4, 2016
  19. Dch48

    Dch48 Guest

    Messages:
    1,821
    Likes Received:
    1
    GPU:
    Sapphire Nitro+ RX 470 4g
    Exactly, GCN is not dying or being replaced. It is just being upgraded and renamed.
     
  20. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    Is power efficiency really 90% of the whole deal?

    In laptops maybe it is for some, but Intel's iGPU is the king of power efficiency in laptops.

    Now, speaking about desktop GPUs.

    Nobody in his right mind is going to spend 500-800 US$ on a GPU to play GAMES and be happy to save some bucks on the power bill at the expense of performance... in GAMES.

    IMHO the top priority for people buying a GPU FOR GAMING is:

    GAMING PERFORMANCE

    And, very far behind that, power savings.

    People buying this new Polaris thing will download the next ClockBlock 2.0 even before the mandatory GPU drivers.

    From what I see in the video, the most important aspect of Polaris is power efficiency.

    AMD compares two computers with exactly the same components except the GPUs.


    I guess, for the sake of fairness, both GPUs deliver similar graphics performance, and the only measured difference is power efficiency:

    - New Polaris

    - Nvidia GTX 950

    So, what is the message?

    If you are about to buy a GTX 950, don't; buy a new Polaris instead, as it will draw fewer watts at the same gaming performance.
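
    How I read the demo methodology (placeholder numbers below, not the slide's): cap both systems to the same fps, read wall power, and the common platform cancels out of the difference.

```python
# Estimate the GPU-only power gap from wall readings of two otherwise
# identical systems capped to the same frame rate. Readings are hypothetical.

def gpu_power_gap(wall_a_w: float, wall_b_w: float) -> float:
    """Same platform on both sides, so it cancels out of the difference."""
    return wall_b_w - wall_a_w

polaris_system_w = 85.0    # hypothetical wall reading at the fps cap
gtx950_system_w = 140.0    # hypothetical wall reading, same cap

print(f"GPU power gap: ~{gpu_power_gap(polaris_system_w, gtx950_system_w):.0f} W "
      f"at identical gaming performance")
```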

    Is Polaris a power-efficiency breakthrough in the mid-to-low-end GPU market?


    :3eyes:
     
    Last edited: Jan 4, 2016
