GCN replaced with Polaris

Discussion in 'Videocards - AMD Radeon' started by Barry J, Dec 31, 2015.

  1. Twiddles

    Twiddles Maha Guru

    Messages:
    1,155
    Likes Received:
    11
    GPU:
    MSI 2080 2190-7550
    Wait until you pay your own bills. Those little extras add up quite fast.
     
  2. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    That was funny.

    Not everybody in here is a teenager; I'm in the 40+ range.

    I've been paying my own power bills for many years, so I'm fully aware of what GPU performance costs.

    Thanks for caring about my expenses.

    :)
     
  3. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    It is because the same GPU has to serve mobile and desktop.
    And pro.
    And if they had the resources to bifurcate, it would be for pro.
    Because except for the power envelope, mobile and desktop have the exact same workloads and usage.

    Not to mention efficiency is important on the desktop too.
    Casual overclockers like me will be limited both by TDP and by common sense when overclocking that 250W product.

    You think I am running the clocks from my sig for anything other than benchmarking?
    Like hell I am.
    Even though I have excellent temperatures and virtually silent fans, courtesy of my big@ss ACIV mod,
    dafuq do I need an additional 200W sipping away under my desk for.
    Even if I didn't care about the electric bill, it's just ugly, inconvenient, stressful and stupid: the total opposite of nice and relaxed gaming.
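
    Back-of-the-envelope, just to illustrate the bill side (the hours per day and the per-kWh rate below are made-up assumptions, not anyone's actual numbers):

        # rough sketch of what an extra 200 W of GPU draw costs per year
        # all inputs are assumed, illustrative values
        extra_watts = 200        # additional draw from running overclocked
        hours_per_day = 3.0      # assumed daily gaming time
        price_per_kwh = 0.30     # assumed electricity price (EUR/kWh)

        kwh_per_year = extra_watts / 1000 * hours_per_day * 365
        cost_per_year = kwh_per_year * price_per_kwh
        print(f"{kwh_per_year:.0f} kWh/year, about {cost_per_year:.0f} EUR/year")
        # prints: 219 kWh/year, about 66 EUR/year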

    But that's beside the point, which is the need to address mobile and desktop with the same GPU.
     
  4. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    Power efficiency improvements are great to show off, but performance improvements are quite important as well :)
     
    Last edited: Jan 4, 2016

  5. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    I suspected this but thanks for confirming:


    VSYNC-CAP

    :wanker:
     
  6. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    IMHO power efficiency is great, but it's a distant second priority if you're buying a dedicated GPU for gaming performance.

    I like silence: a 1080 rad (MO-RA3) and GPU+CPU water blocks here.

    1175/1600.

    If the new Polaris is only able to beat a GTX 950 in power efficiency, this GPU is not for me.
     
  7. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    If they have the perf/W, extracting more performance is super easy.
    Just build a bigger GPU, overclock it, whatever.
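
    As a rough sketch of that scaling argument (the unit counts and power figures below are invented placeholders, not real Polaris specs):

        # sketch: if perf/W holds roughly constant, a bigger power budget
        # buys proportionally more performance; every number is a placeholder
        small_perf = 100.0        # arbitrary performance units for a small, efficient chip
        small_watts = 90.0        # roughly a GTX 950-class board power
        perf_per_watt = small_perf / small_watts

        big_watts = 250.0         # enthusiast-class power budget
        big_perf = perf_per_watt * big_watts   # assumes near-linear scaling with more units at similar clocks
        print(f"~{big_perf:.0f} units, about {big_perf / small_perf:.1f}x the small chip")
        # prints: ~278 units, about 2.8x the small chip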

    But did you see that cheap trick with Vsync ON?
    Instead of using... uh, I dunno, a Skylake-U in a 20W-idle rig and properly demonstrating Polaris' perf/W superiority, they cap the fps and undervolt Polaris.
    These guys are hopeless. Their benchmarks are indicative of nothing.
    I could count on my fingers and get a better estimate of their next product.
     
    Last edited: Jan 4, 2016
  8. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    The AMD ad does show something about Polaris: it indicates the performance range (GTX 950) that AMD wants Polaris compared against in order to show off its power efficiency.

    Maybe AMD wants Polaris to be the new power-efficiency king in the mid-to-low GPU range.

    If the goal was to show the best power savings in a best-performance scenario, a 4K/60 ultra-settings comparison against a Titan X would have been the one to make.
     
    Last edited: Jan 4, 2016
  9. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    At this point the top GPUs are pretty strong. The next generation does not require a huge performance boost, as at best it can eat into the SLI/CF market, and that is not exactly huge.
    Focus no. 1 is power efficiency. That's what will boost AMD's sales.
    Focus no. 2 should be delivering the same performance with fewer transistors, to increase profits.
     
    Last edited: Jan 4, 2016
  10. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    How many 2015 AAA games can you play at 144 FPS on your 144 Hz monitor at high/ultra settings with your Fury X?

    Or was your goal simply to play all games at 72 FPS?

    Next-generation GPUs will need more performance to handle at least 4K/60 or 1080p/144 gaming.

    To play games at 1080p/30 with medium-low settings I can use my PS4 or wait for the next Intel iGPU.

    That would be a real power saving!

    SLI/CFX users already know that adding more GPUs is not exactly the perfect solution to cover a single GPU's lack of power for 1080p/144 or 4K/60.

    You can only trade performance for power efficiency once you have a performance surplus, which is not the case right now.

    New monitors will keep increasing in size, refresh rate and resolution, and that will need more GPU power, not less, to sustain the same fps at the same graphics settings (see the rough numbers below).

    My next monitor will be a 21:9 3440x1440 100 Hz panel, and I expect to buy a single GPU able to deliver 60-100 FPS at high/ultra settings on it.
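
    As a rough way to see how much the pixel throughput jumps (rendering cost scales with more than raw pixel count, so treat this as a lower bound; the target monitor figures are from the post, the 1080p/60 baseline is an assumption):

        # sketch: raw pixel throughput of the target monitor vs an assumed 1080p/60 baseline
        base_pixels_per_s = 1920 * 1080 * 60          # assumed current baseline
        target_pixels_per_s = 3440 * 1440 * 100       # 21:9 3440x1440 at 100 Hz
        print(f"{target_pixels_per_s / base_pixels_per_s:.1f}x the pixels per second")
        # prints: 4.0x the pixels per second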
     

  11. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    I can't help but suspect this is a scenario they concocted to make unrealistic claims. Remember the Fury X beating the Titan X in all their slides? While I do expect this card to be more efficient, that slide looks like the Nano efficiency trick all over again.
     
  12. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    You can check how stuff was running back then in the Fury X thread, post #5.
    I think it is quite nice that some good games run at over 120 fps maxed out at 3840x2160.

    As for new AAA titles, I am afraid they are mostly pretty sh*tty, and I enjoy lower-budget games much more. I am currently in the middle of "Life is Strange". The game is about story and a particular play style, not about graphics. I love it and I hate it, but I guess everyone who has played it feels the same about it.

    When you look at a game like Hawken, you can max out the graphics with average HW, and it looks like those "AAA" titles that sell for $60~90.

    TL;DR: Badly written shader code will bring the best GPU to its knees, even 10 years from now.
     
  13. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    Tetris is a good game; I guess a Tetris HD version at 4K/120 will be very possible once some manufacturer builds a 4K 120 Hz monitor with DisplayPort 1.3.

    The next Samsung S7 phone is rumored to have a 4K/60 screen; maybe it will be possible to play it there too without much trouble.

    Seriously, we don't need powerful or efficient GPUs, or even dedicated GPUs at all, to play games that are only moderately demanding graphically.

    "Life is Strange" is a good game; I will play it on one of my old-gen consoles (360/PS3). They are more power efficient and I'll get similar graphics to any Telltale-style episodic game.

    If power efficiency with low graphics performance is your priority, iGPUs or consoles are the right solution.

    I bought dedicated GPUs to get something better than that, and I think you did too when buying a Fury X and a 144 Hz monitor.

    For me, when a GPU can't deliver enough performance in games (fps and/or image quality), the solution is not to play older or less demanding games... it is to buy a faster GPU.
     
  14. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    And 64KB of RAM will be more than enough for anybody, forever and ever and :infinity:
    No, it won't be. :puke2:

    Multi-GPU has always been an abomination. Whenever I happened to have a multi-GPU system for some reason, I was always happy when I could upgrade to a next-gen single-GPU card with comparable performance and get rid of the multi-GPU horror.

    It certainly helps if your cards can be relatively silent and 'not too warm' even if your cooling sucks. I think this was a big problem with the reference Hawaii/Grenada cards (way, way too loud and still warm enough to cause pain on touch).
    Better power efficiency can help, but it has its limits. The optimal setup depends on the target performance (e.g. are you ready to give up raw performance for better power efficiency? Only if you don't need that performance anyway, but "a GPU is never fast enough" :) ).

    Or use the same amount of transistors but "make it go faster". It's the same thing from different angles.
    In PC hardware, the trend has always been UP in terms of transistors. That's why they need die shrinks, so the GPU can leave the gates of the factory and you don't need a truck to carry it home. :banana:
    AMD won't change that, even on paper (and especially not in physical reality). You always need more transistors in the end. Tricks and optimizations yield orders of magnitude less than brute force (especially if you have been doing those optimizations for quite some time already).
     
    Last edited: Jan 5, 2016
  15. thatguy91

    thatguy91 Guest

    I believe that is the same as the other information I saw. In that case, the competition is a GTX 950. The comparison wasn't meant to be about outright performance, which is quite obvious going by the power ratings, just efficiency at the same level of performance. A key issue here is the drivers used: the AMD driver is no doubt not as optimised for performance and efficiency as it will be at release and afterwards, while the drivers for the GTX 950 would be mostly mature by now. Secondly, there could even be further minor hardware tweaks before the final release, so the result is quite impressive.

    The driver AMD used belongs to the new 16.x series and is likely quite 'alpha' in nature.
     

  16. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    You love your Tetris, and you can run Tetris in 4K already.
    At 120+ fps as a bonus, on screens that won't exist for the next few years.
    Tetris: Dead Island Edition
    Tetris: Warframe Edition
    Tetris: CS:GO Edition
    Tetris: Remember Me Edition
    ...

    But if you have a 4K screen available running a mere 60 Hz, you can just as well run the following Tetris games at a constant 60+ fps:
    DoTa 2 Tetris, Payday 2 Tetris, WH 40k: Space Marine - Tetris, BF3 Tetris, Darksiders II Tetris, TES 5: Skyrim Tetris, ...

    And if you are not a perfectionist and can survive an occasional dip to 58 fps, then you can play 64-man largest maps in Battlefield 4 Tetris, again in 4K. (Like many others, since judging games by minimum fps is...)

    And it is not that I have cherry-picked a few gems here. Actually, the only problematic games at 4K I could find in my list are The Witcher 2 and The Witcher 3, but those are not Tetris games, or are they?
     
    Last edited: Jan 5, 2016
  17. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    Played "Tetris: Black Mesa" a few days ago with GeDoSaTo, downsampled from 8K to 1080p, all in-game settings maxed except MSAA at x4.

    Still on a 1080p monitor.

    I don't have the pocket money to buy the new 8K TVs... or a GPU with adequate output.

    :D

    Edit: of course not, TW2 and 3 are not Tetris games.
     
  18. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,449
    Likes Received:
    3,128
    GPU:
    PNY RTX4090
    No point in using a CPU that's nearly 4 years old now; even AMD knows this. They focused too much on APU technology, which is still a great platform in my mind and could scale well in the future if all this "efficiency" talk pays off.

    They may as well use the standard, common CPU rather than their own old technology that barely anyone uses to show performance results.

    It also shows that they probably used identical systems to give more accurate results. If they used an AMD system for their GPU and an Intel system for the Nvidia GPU, the results would be much more varied and skewed; the same could be said if it was the other way around.
     
  19. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,754
    Likes Received:
    9,647
    GPU:
    4090@H2O
    They should have just used an AMD CPU for both GPUs. That should be enough to feed a 950 or whatever they used at medium settings at 1080p, no? It would still be a comparable system. Or is there any real reason behind avoiding their own components?
     
  20. deathroned

    deathroned Guest

    Messages:
    47
    Likes Received:
    0
    GPU:
    950
    Ahem, sir, it does. Stock vs stock against the 980 Ti was unfair, since many of them boosted to 1400 MHz+.
     
