AMD Radeon 400 series is based on Polaris microarchitecture

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 1, 2016.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,392
    Likes Received:
    18,564
    GPU:
    AMD | NVIDIA
  2. Kloet075

    Kloet075 Guest

    Messages:
    30
    Likes Received:
    0
    GPU:
    Gigabyte Gaming G1 970
    Photoshopped?

    If this is a quick photo, then tell me why the word "for" and the word "every" are not in line with the rest of the text. It looks photoshopped, perhaps just so it could be read, but this is certainly not the original. :question:
     
  3. somemadcaaant

    somemadcaaant Master Guru

    Messages:
    454
    Likes Received:
    65
    GPU:
    Red Devil 6900XT LE
    Yeah that image just looks all kinds of wrong.
     
  4. Backstabak

    Backstabak Master Guru

    Messages:
    860
    Likes Received:
    347
    GPU:
    Gigabyte Rx 5700xt
    That image has probably just been edited to show what would otherwise not be visible. You can even see that the "n" in "inspiration" is shorter than it should be.
     

  5. PinchedNerve

    PinchedNerve Guest

    Messages:
    232
    Likes Received:
    1
    GPU:
    Gigabyte AORUS 1080 Ti XE
    I've been thinking about getting an Asus R9 390 Strix. Does anyone have an idea how long it will be before the 400 series is released?
     
  6. Undying

    Undying Ancient Guru

    Messages:
    25,339
    Likes Received:
    12,747
    GPU:
    XFX RX6800XT 16GB
    TBA 2016, just like Pascal.

    The R9 390 is a very good card. If you need a card now, there's no need to wait for something we still don't know anything about.
     
  7. DeskStar

    DeskStar Guest

    Messages:
    1,307
    Likes Received:
    229
    GPU:
    EVGA 3080Ti/3090FTW
    Maaaaaaaan I love the sound of new hardware on the horizon....!!

    Either company could release their big dogs next and I think I'll scoop up at least one from each company this go around. Going to be nice rocking something new in a year or so....
     
  8. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,975
    Likes Received:
    4,342
    GPU:
    Asrock 7700XT
    I have the R9 290 (basically the same thing) and for 1080p gaming it's probably the best-value GPU you can get at the moment, with or without the bonus games it may come with. I too am interested to see where the 400 series goes, but I don't think you'd regret getting a 290/390.
    Just something to keep in mind though - I don't know if you like doing BIOS mods, but many of these GPUs have their BIOS totally locked down. Also, seeing as you're in Florida, I hope your AC works during the summer, because these GPUs exhaust a lot of heat.
     
  9. xIcarus

    xIcarus Guest

    Messages:
    990
    Likes Received:
    142
    GPU:
    RTX 4080 Gamerock
    Didn't Nvidia say a while back that Pascal would be released in Q1? What happened to that?
     
  10. PinchedNerve

    PinchedNerve Guest

    Messages:
    232
    Likes Received:
    1
    GPU:
    Gigabyte AORUS 1080 Ti XE
    AC works well. :) The Asus Strix looks to OC very well, should I want to do that, so I wouldn't be looking to mod the BIOS. I'm running a Dell U2715H at 2560x1440.
     

  11. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    Nvidia never gave a time frame other than 2016, but it's usually about 9-10 months from tape-out to a chip launch, which would put the launch around April. This isn't a normal chip launch, though: it's the first FinFET GPU, it's the first HBM GPU for Nvidia, it's a new architecture, and they are most likely going to use a mezzanine connector to a PCIe bridge and ship the chip in modules to OEMs. All of these things will definitely complicate the timetable for the launch.
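    A minimal sketch of that back-of-the-envelope timeline, assuming a mid-2015 tape-out as was rumoured at the time (the exact date isn't stated in this thread), would look something like this:

    ```python
    from datetime import date

    def add_months(d: date, months: int) -> date:
        # Shift a date forward by a whole number of months (day clamped to the 1st).
        total = d.month - 1 + months
        return date(d.year + total // 12, total % 12 + 1, 1)

    # Assumption: Pascal taped out around June 2015 (rumoured, not confirmed in this thread).
    tape_out = date(2015, 6, 1)

    for months in (9, 10):
        print(f"tape-out + {months} months -> {add_months(tape_out, months):%B %Y}")
    # tape-out + 9 months  -> March 2016
    # tape-out + 10 months -> April 2016
    ```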
     
  12. TDurden

    TDurden Guest

    Messages:
    1,981
    Likes Received:
    3
    GPU:
    Sapphire R9 390 Nitro
    Why would AMD want to scrap GCN? GCN is well known to AMD's software engineers, early tests show it performs well in DX12, it's the same architecture used in the consoles, and 14nm should make it power efficient too. Plus it's cheaper to improve on GCN than to create a completely new arch.
    Polaris is probably just a name for a new GCN version :) Why should Nvidia be the only one with fancy names? :)
     
  13. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    As said, Polaris is most likely GCN 1.3/2.0
     
  14. Dazz

    Dazz Maha Guru

    Messages:
    1,010
    Likes Received:
    131
    GPU:
    ASUS STRIX RTX 2080
    GCN 2.0, I believe it will be, since AMD recently said GCN wasn't going anywhere. It works well enough. I want to upgrade my R9 290 but don't feel compelled to, as it plays everything I own @ 2560x1080 without breaking a sweat.
     
  15. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,660
    Likes Received:
    593
    GPU:
    RTX3090 GB GamingOC
    Everything in the near future is light based. I think this is just a reference to its power, though.

    As in, Polaris is 2.5 times faster than GCN. I doubt that is totally true - not 2.5x faster than their fastest GPU, more like 2.5x faster than GCN 1.1 or even 1.0.

    I've been reading up on these light-based processors of the near future, where they just replace the copper traces in the PCB with fiber instead.

    Since we are getting close to the end of Moore's Law (with silicon-based tech), they are looking for ideas for a new way to use current silicon but with better/faster latency and throughput using light-based internal connections.

    I very much doubt AMD will get this new tech with Polaris, but light-based CPUs/GPUs will start making an appearance in 2016 - that I have no doubt about.

    It's very clever tech because they can stretch out silicon usage for longer while they look for alternatives.

    I've read that they might not even bother aiming for 8nm, and that 10nm will be a huge risk as well. So they are looking into ways of using 12/14nm for much longer, and that means they need to find new ways of using 12/14nm.

    These light-based processors have already been given the green light, so expect half silicon, half quantum real soon. :)
     

  16. Lavcat

    Lavcat Master Guru

    Messages:
    552
    Likes Received:
    44
    GPU:
    Radeon 7900 XTX
  17. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    As I posted in the AMD subforum, I believe it means 2.5x more energy efficient (watt/transistors packed), and the 169bce reference probably means 16.9 billion transistors for the top part.
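    To put hypothetical numbers on what a 2.5x performance-per-watt claim would mean, here is an illustrative sketch; the baseline TFLOPS and board-power figures are placeholders chosen for the example, not official AMD specs:

    ```python
    # Illustrative only: what a claimed 2.5x performance-per-watt uplift would imply.
    # The baseline figures below are hypothetical placeholders, not AMD specifications.
    baseline_tflops = 5.5   # hypothetical 28nm GCN part (FP32 throughput)
    baseline_watts = 275.0  # hypothetical typical board power

    baseline_eff = baseline_tflops / baseline_watts  # TFLOPS per watt
    claimed_eff = 2.5 * baseline_eff                 # the rumoured 2.5x uplift

    for budget_watts in (110, 150, 250):  # hypothetical power budgets for new parts
        print(f"{budget_watts:>3} W at 2.5x perf/W -> ~{claimed_eff * budget_watts:.1f} TFLOPS")
    ```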
     
  18. Moegames

    Moegames Guest

    Just get the 390, man... it is the best bang for the buck right now by far, and it'll flex more muscle as drivers evolve and DirectX 12 games begin to come out. It is more than enough for 1080p and 1440p, and the extra VRAM gives more room to grow over time; it's great for downsampling resolutions too.

    I just bought the Gigabyte G1 390 and love it. It OCs to 1100MHz on the core with no voltage increase. Of course, I knew before buying the Gigabyte version that the voltage would be locked on this brand, but I really don't care, as 1080p is my resolution of choice since I use a Panasonic plasma HDTV as my monitor.

    Any of the new 400 series cards from AMD will take a bit of time to mature as well. I never buy the first line of graphics cards of any new generation. I usually wait for the second wave/iteration of new-gen cards before I even begin to think about buying. Like with all new tech, I am very patient, as I like to wait for the more efficient, better-optimized versions that usually come after the first round of released technology.

    Besides, you can always sell your 390 down the road when you actually have a good enough reason to upgrade to a newer generation of graphics cards. Pretty much none of the performance-level cards of the next gen will be a huge increase over the 390s. It's safe to say the biggest gains will come with the very high-end enthusiast cards first; later on you will see much better bang-for-the-buck cards down the road.

    Polaris is simply GCN 2/1.3; there is no way AMD is going to toss out GCN before it has even been utilized properly in games with DirectX 12. The amount of R&D needed to start a brand new approach, a true replacement for GCN, would be staggering. Like others have said already, all Polaris is is a rename of GCN; it does not take rocket science to figure out what is going on here with AMD renaming GCN to Polaris. Heck, I would bet my last dollar that the current AMD Fiji lineup of cards will be "ported over" to the new, smaller process node coming in 2016 and renamed once again as a "true" next-gen card. Anyone wanna take any wagers? :)

    Remember, both the red and green teams rely heavily on rebranding these days, not just the lineup of cards but the technology inside them. So just think about it... Polaris may have new bits of technology added to the existing GCN technology, but make no mistake about it, it will be GCN 2/1.3 rebranded. Don't let these mega companies dupe you like a rag doll. We don't need to go through that sh1t like the mainstream folks do! lol
     
  19. Tree Dude

    Tree Dude Guest

    Messages:
    532
    Likes Received:
    3
    GPU:
    Radeon R9 270X 2GB
    Silicon is an element; quantum is a type of computing. You would compare binary computing to quantum computing. Quantum isn't becoming mainstream any time soon, as we still don't fully understand how to use it outside of encryption algorithms.

    Graphene, carbon nanotubes, black phosphorus and a few others are being looked at to replace silicon. There is no clear winner yet, so silicon will stay for a while longer.
     
  20. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,660
    Likes Received:
    593
    GPU:
    RTX3090 GB GamingOC
    Controlling electrons using silicon is quantum physics; I never said it was going to be quantum computing.

    Directing atoms to travel down tiny fiber optics to remove some of the resistance copper has will double or even triple the current speed of CPUs.

    It's like putting a tiny internet web inside the silicon and removing the need for copper traces. The bandwidth would improve immensely and caching would be near instant. This would also make silicon last god knows how long, because they could just keep improving 10nm forever.
     
