Computex 2015 Exclusive: AMD Fiji GPU Die Photo

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 3, 2015.

  1. Evildead666

    Evildead666 Guest

    Messages:
    1,309
    Likes Received:
    277
    GPU:
    Vega64/EKWB/Noctua
    From what I read about DX12, Win10 will allow much better rendering for x2 (edit: x3, x4, ...) card setups: much reduced latency, SFR instead of AFR, and RAM stacking or unified memory across the GPUs (finally!).

    I can see the PCIe Gen3 x16 bus being used a hell of a lot more.

    I would expect both sides are putting 110% into the new OS's drivers, especially an OS almost aimed at gamers, with DX12.
    Almost everyone is expected to upgrade, since it's free.
    All Win7/8/8.1 users then move over to Win10.
    They then only need one driver team for Win10, and a 'legacy' driver team (as Nvidia will also have) for Win7/8.1.
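    For anyone wondering what the 'explicit' multi-GPU side of DX12 actually looks like to a developer, here's a minimal sketch (just my own illustration, not AMD's or Microsoft's code) that enumerates adapters and checks how many GPU nodes a D3D12 device exposes; it's that per-node control that makes SFR and per-GPU memory management possible:

        #include <dxgi1_4.h>
        #include <d3d12.h>
        #include <wrl/client.h>
        #include <cstdio>
        #pragma comment(lib, "dxgi.lib")
        #pragma comment(lib, "d3d12.lib")

        using Microsoft::WRL::ComPtr;

        int main()
        {
            // Enumerate every physical adapter the OS exposes to D3D12.
            ComPtr<IDXGIFactory4> factory;
            if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
                return 1;

            ComPtr<IDXGIAdapter1> adapter;
            for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
            {
                DXGI_ADAPTER_DESC1 desc;
                adapter->GetDesc1(&desc);
                if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
                    continue; // skip the WARP software rasterizer

                ComPtr<ID3D12Device> device;
                if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                                IID_PPV_ARGS(&device))))
                {
                    // GetNodeCount() > 1 means the driver exposes linked GPUs through
                    // one device; the app can then target each node's queues and
                    // memory explicitly, which is what enables SFR instead of
                    // driver-managed AFR.
                    printf("%ls: %u node(s), %llu MB local VRAM\n",
                           desc.Description,
                           device->GetNodeCount(),
                           (unsigned long long)(desc.DedicatedVideoMemory >> 20));
                }
            }
            return 0;
        }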
     
    Last edited: Jun 3, 2015
  2. LimitbreakOr

    LimitbreakOr Master Guru

    Messages:
    621
    Likes Received:
    158
    GPU:
    RTX 4090
    Do you realize that a dual-link DVI adapter costs $100 by itself? I can't believe AMD is forcing me to buy a 980 Ti... I really wanted to go with AMD this round; I even sold my 980 as it can't handle my new monitor for 4K gaming.
     
  3. Texter

    Texter Guest

    Messages:
    3,275
    Likes Received:
    332
    GPU:
    Club3d GF6800GT 256MB AGP
    Yeah, but their estimate completely discounts the AIB 980 Ti's which are also launching right now. So that would probably mean a 'reference' hybrid-cooled Fury vs a bunch of 1.1-1.3 GHz 980 Ti's with big air coolers that allow for 1.5 GHz clocks without even hitting the Fury's reference TDP. Combine that with any driver issues on AMD's side and it's a lost cause to want to claim a performance crown.
     
  4. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    There is still a need to have current games render faster.

    As for DX12, another interesting part of it is using the best of a mixed GPU setup; the question is whether AMD and NVidia will let us have that. It could be cool to have the best of both worlds.
    As for the rest, we have to wait and see. Too much of it is still up to game developers; it would be much better for the consumer if it was a native part of the whole rendering process.
     

  5. Toss3

    Toss3 Guest

    Messages:
    202
    Likes Received:
    17
    GPU:
    WC Inno3D GTX 980 TI
    A passive DisplayPort to dual-link DVI adapter can be had for around $20.
    Linky
     
  6. Evildead666

    Evildead666 Guest

    Messages:
    1,309
    Likes Received:
    277
    GPU:
    Vega64/EKWB/Noctua
    Win10, July 29th.
    I don't expect there will be many driver problems (on either side) if every gamer upgrades to Win10/DX12. :)

    If a new OS that everybody was supposed to upgrade to was coming, I'd be working on it 24/7 with the entire team, up to, and after, the release date.
     
  7. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    4K monitor without DP??

    As for older monitors this will do (1920x1200 @60Hz):
    http://www.amazon.com/BuyCheapCable...433336686&sr=1-1&keywords=dp+to+dvi+dual+link
     
    Last edited: Jun 3, 2015
  8. Evildead666

    Evildead666 Guest

    Messages:
    1,309
    Likes Received:
    277
    GPU:
    Vega64/EKWB/Noctua
    Yeah, that would be REALLY cool :)
    I wouldn't mind having two cards if it gave me the best of both worlds, at the same time no less!!!
    Let's hope the OS helps keep that stuff vendor-independent.
     
  9. SHOCKTRUPPEN

    SHOCKTRUPPEN Banned

    Messages:
    89
    Likes Received:
    0
    GPU:
    GTX 980 Strix
    AMD Fiji

    Is it possible that we have come to a fork in the road with graphics cards? One card cannot carry out everything it needs to anymore? The price points nowadays are outrageous. To buy a new high-end card you have to decide: do I want a whole new rig, or just a new graphics card? Why isn't anybody talking about this?

    I think you are all lemmings for paying these prices to, what, play games? You all throw barbs at each other, telling yourselves you're stupid if you like Nvidia, no, you're stupid for liking AMD. No, you are ALL stupid for paying the high prices! And really arguing about 28nm and 14nm, and oh my god it needs water cooling, and oh my god it's just slightly slower than the 980 Ti (which costs the same as a nice used 1994 BMW 318 convertible that you can drive around for the summer with the top down). You are all out of your minds, a nit-picky, holier-than-thou, rude, clueless bunch of whiny washerwomen, all about what? Graphics cards!!!!???? Really?

    STFU, wait for the cards to come out. Too expensive? Don't fkn buy it!! Not up to snuff on performance? Don't fkn buy it!!! 5MHz slower than the other brand but $100 cheaper, yet it still costs $700? Don't fkn buy it!!! Just stfu for once, you are all dumb, and all your tries at a witty comeback will be for naught because I will no longer read your silly comments. So impress your friends, er wait, your girlfriends, er wait, impress yourself with your witty comebacks. You are the only one that cares what you say, trust me!! :banana::flip2::finger::flip::booty::finger2::eatme:
     
  10. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    ...

    Lol?
     

  11. Evildead666

    Evildead666 Guest

    Messages:
    1,309
    Likes Received:
    277
    GPU:
    Vega64/EKWB/Noctua
    Well, the interposer does allow chips made on different processes to be placed side by side on it.
    It would mean that the external chips wouldn't have to be on the same expensive process.

    I don't know, there might be some case where an external chip on such a fast bus would be useful. Only memory really comes to mind lol :)

    16/14nm GPUs won't cost less, not at the beginning.
    It will depend on yields. A 4096-shader GPU like Fiji on 16/14nm would be a LOT smaller, and a lot better in the power consumption dept.
    It looks like the Fiji GPU only just fits on the interposer, which means it is HUGE.
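    Rough napkin math on that (assuming the commonly cited ~600 mm² die for Fiji and roughly double the logic density going from 28nm planar to 14/16nm FinFET, so treat it as an estimate, not a spec): ~600 mm² / 2 ≈ 300 mm², so the same 4096-shader layout would go from pushing the interposer's limits down to ordinary large-GPU territory.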
     
  12. MBTP

    MBTP Member Guru

    Messages:
    143
    Likes Received:
    11
    GPU:
    Sapphire RX590
    4GB, this is not good. Well, let's wait and see the results.
     
  13. Evildead666

    Evildead666 Guest

    Messages:
    1,309
    Likes Received:
    277
    GPU:
    Vega64/EKWB/Noctua
    Tro-lol-loll. :)

    That said, the 28nm/14nm issue is important, since that is the reason this gen's cards are so expensive.
    Large GPUs on 28nm, when we should technically have been on 20/16/14nm by the time this gen's chips were thought up and designed.
     
  14. xIcarus

    xIcarus Guest

    Messages:
    990
    Likes Received:
    142
    GPU:
    RTX 4080 Gamerock
    It looks to me like they extrapolated the percentage increase in core count straight into a percentage increase in FPS, which is most likely not accurate.
    I mean, by that logic the Titan X should be 50% faster than the 980 by core count alone, which should translate into 50% more FPS. Except it doesn't work that way. Know what I mean?
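    Quick napkin math on why (an Amdahl-style estimate, and the 70% figure is purely an assumption for illustration): if only ~70% of frame time actually scales with shader count and the rest is geometry/driver/CPU bound, then 50% more cores gives about 1 / (0.3 + 0.7/1.5) ≈ 1.3x, i.e. roughly 30% more FPS, not 50%.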

    Also, I question the validity of the benchmarks. Titan X avg FPS at 1440p is 104 FPS and at 4K it's 104 FPS as well? Yeah..

    But it's true, we don't know anything about how HBM affects performance. That makes the performance really difficult to predict.
    But if true, it would be a huge slap to Nvidia. Which is what we're all waiting for. Or at least what I'm waiting for.

    And how would VRAM bandwidth help if the content the card needs is not in the VRAM? Seriously, that makes no sense.

    And you have two cards that render 4K maxed out at more than 30 FPS in pretty much every game. They're called the Titan X and the GTX 980 Ti.
     
  15. Asgardi

    Asgardi Guest

    Messages:
    248
    Likes Received:
    14
    GPU:
    MSI GTX 980 Ti OC
    Hmm...

    Your reasoning doesn't make ANY sense. It is the FPS in games which determines better performance, not memory bandwidth. And based on the information in this story and Nvidia's pricing, it's starting to look like the 980 Ti will give you more FPS. And it does so while running cooler and with less power, hence making less noise if air-cooled. And as mentioned by someone before, the 980 Ti also has quite a nice amount of headroom for overclocking.

    As for "innovation", the HBM memory is made by Hynix, not AMD. They just buy the components and use them in their card, just like Nvidia will do in the future.

    I hope Fiji will be super fast so that Nvidia has pressure to drop prices, but it's the FPS in benchmarks which makes a card better, and there is no proof of such a thing as of now.
     
    Last edited: Jun 3, 2015

  16. Evildead666

    Evildead666 Guest

    Messages:
    1,309
    Likes Received:
    277
    GPU:
    Vega64/EKWB/Noctua
    I would agree that 4GB is too little for a next-gen card.
    I already come up to 3.6GB usage at 1080p in GTA V (FXAA).
    They couldn't really do much though, HBM2 isn't available yet.
     
  17. LimitbreakOr

    LimitbreakOr Master Guru

    Messages:
    621
    Likes Received:
    158
    GPU:
    RTX 4090
    No, I have a 1080p@120Hz monitor and also a 4K monitor. I still need my 120Hz as it is very fast and smooth, as opposed to my 4K, which has noticeable input lag even though Samsung claims 2ms...
     
  18. Hughesy

    Hughesy Guest

    Messages:
    357
    Likes Received:
    1
    GPU:
    MSI Twin Frozr 980
    Really? I use less than that at 2560x1440 in GTA V.
     
  19. Evildead666

    Evildead666 Guest

    Messages:
    1,309
    Likes Received:
    277
    GPU:
    Vega64/EKWB/Noctua
    That's what comes up as GPU memory usage in GPU-Z after a while of gaming.
    I have FXAA on, motion blur off, depth of field off, MSAA reflections off, vsync on, shadow quality on softer, shadows on high (biggest gain in FPS from this), grass on high/highest, extended distance on full, and the rest of the normal options on their highest settings IIRC, with the other extended options off.
    It shows less in the in-game count though.
     
    Last edited: Jun 3, 2015
  20. waltc3

    waltc3 Maha Guru

    Messages:
    1,445
    Likes Received:
    562
    GPU:
    AMD 50th Ann 5700XT
    Agreed...;) You'd think some of them had never seen a GPU launched with a new architecture before....! (And it is a new architecture whether or not it internally resembles a current GPU architecture--HBM changes everything.) No DVI? I haven't used that in several years--the new DP & HDMI 2.0 are more than adequate--DVI is really long in the tooth by comparison.

    I don't get the gossip about the power usage, either, as cards that have two 8-pin connectors generally don't draw the maximum possible. Additionally, I'm sure AMD still has a lot of work to do on the drivers. The water cooling is simply because the GPU surface area is so *small* compared to current products--it's always tougher to adequately cool smaller chips that run very fast (the "huge" die notwithstanding).

    Theoretically, anyway, this product compared to current products is like comparing a card with a 64-bit RAM bus to a card with a 256-bit RAM bus... I would think the product would shine at extremely high resolutions. At any rate, it's just the first step in a brand-new direction... Nvidia has already talked up similar products itself--it's just that AMD is beating them to it...
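    Putting numbers on that from the published specs: Fiji's HBM is a 4096-bit interface at 1 Gbps per pin, so 4096 x 1 / 8 = 512 GB/s, versus the 980 Ti's 384-bit GDDR5 at 7 Gbps, 384 x 7 / 8 = 336 GB/s--roughly 50% more bandwidth at a far lower memory clock.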
     
    Last edited: Jun 3, 2015
