Computex 2015 Exclusive: AMD Fiji GPU Die Photo

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 3, 2015.

  1. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    gddrBandW = 32;   //GDDR5 bus width in bits (per 32-bit chip)
    hbmBandW = 1024;  //HBM bus width in bits (per 1024-bit stack)

    The test isn't really accurate. He's only factoring in bus width and not speed. While GDDR5 has a much narrower bus, it runs at a far higher clock. A fairer test would use 512/1024 for those parameters, and an actual bandwidth comparison would be more like 320 vs 640 GB/s or something (290X vs Fiji XT). That's not to mention that games don't remove all texture elements from memory after they're rendered.
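
    Rough sketch of the point in C (per-pin rates are assumed figures for illustration; the post guessed 640 GB/s for Fiji, while HBM1 at 1 Gbps/pin works out to 512):

    #include <stdio.h>

    /* bandwidth (GB/s) = bus width in bits / 8 * per-pin data rate in Gbps */
    static double bandwidth_gbs(int busBits, double gbpsPerPin) {
        return busBits / 8.0 * gbpsPerPin;
    }

    int main(void) {
        /* 290X: 512-bit GDDR5 at 5 Gbps per pin -> 320 GB/s */
        printf("290X: %.0f GB/s\n", bandwidth_gbs(512, 5.0));
        /* Fiji XT (assumed): 4096-bit HBM at 1 Gbps per pin -> 512 GB/s */
        printf("Fiji: %.0f GB/s\n", bandwidth_gbs(4096, 1.0));
        return 0;
    }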
     
    Last edited: Jun 4, 2015
  2. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    A script doesn't reflect actual gameplay, same as when people tested their 970s with the RAMGATE scripts and others.
    I got exactly the same results (headless mode ofc.) as those who had stuttering issues, yet I had neither stuttering nor big framerate jumps.
    Besides that, game rendering is much more complex than drawing squares and filling them into memory.

    Much more interesting to see some real-world performance, especially the minimum framerate achieved in each game.
    All other tests are worth ****.
     
    Last edited: Jun 4, 2015
  3. seaplane pilot

    seaplane pilot Guest

    Messages:
    1,295
    Likes Received:
    2
    GPU:
    2080Ti Strix
    A bit of info for the peeps telling people with Korean or similar panels that can overclock to just get a DVI converter. Problem is it needs to be active, they cost about $120-plus, and in my experience the converters don't play well above an 85Hz refresh rate at 1440p. Unless StarTech has made a new, improved active converter that will do 100Hz-plus refresh at 1440p?
     
  4. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    Active adapters are only recommended for up to:
    1920x1200 @120Hz
    2560x1600 @60Hz

    The question is whether it's AMD that sucks for not offering DVI or the panel vendors for not offering DP.
     

  5. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    My Korean monitor is DVI-only and overclockable to 100Hz. They may have to be DVI to overclock, maybe? Not sure on that, I haven't read about them in a long time.
     
  6. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    Dual-link DVI is needed for the overclockable Korean monitors and older 1080p 120Hz monitors. DisplayPort to dual-link DVI active adapters are expensive ($70+) and most of them don't really allow pixel clocks above 330MHz. One exception is mentioned by ToastyX on his forums (http://www.monitortests.com); it's capable of going up to 400MHz, which is enough for about 1440p 104Hz, give or take (with tightened timings).

    Dropping the DVI port is a *very* bad move, making that card at least $70 more expensive for anyone running it with a monitor that lacks DisplayPort and is anything beyond 1080p 60Hz. Any savings from buying this card over the 980Ti are eliminated, especially if this card is slower than the 980Ti.

    I think AMD are trying hard to drive Korean monitor users over to Nvidia. The monitors were selling like hotcakes from 2012 to 2015, and anyone who had a taste of 1440p 120Hz before the Asus ROG Swift and the Acer Predator is on one of these monitors...

    Anyone on a Catleap, Overlord, or older Qnix will have to lower their overclock to 100Hz (from up to 110-120Hz, a few at 130Hz), provided they're able to find and buy the adapter that goes up to 400MHz (which I'm betting is $90).

    It's too early to get rid of DL-DVI.

    DVI is cheaper, and it's so far the only interface available for a special variant of monitor PCBs that have a display controller (EP269) capable of a 450MHz pixel clock (~1440p 120Hz) and come without an OSD or any artificial limitations (hence why the PCB is able to overclock without issue). Most monitors on the market have chips capable of less: 330MHz for 1080p 120Hz or 1440p 60Hz, and 165MHz for 1080p 60Hz (OC to ~76Hz). Most OSDs give an error message above certain limits, and some firmware variants even refuse to display.
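
    To put rough numbers on those pixel clock limits (a sketch only; the blanking totals assume CVT reduced-blanking-style timings, and tightened timings bring the figures down):

    #include <stdio.h>

    /* pixel clock (MHz) = total horizontal pixels * total lines * refresh */
    static double pixel_clock_mhz(int hTotal, int vTotal, double refreshHz) {
        return hTotal * vTotal * refreshHz / 1e6;
    }

    int main(void) {
        /* 2560x1440 with ~CVT-RB blanking: 2720 x 1481 totals (assumed) */
        printf("1440p  60Hz: %.0f MHz\n", pixel_clock_mhz(2720, 1481, 60.0));
        printf("1440p  96Hz: %.0f MHz\n", pixel_clock_mhz(2720, 1481, 96.0));
        printf("1440p 104Hz: %.0f MHz\n", pixel_clock_mhz(2720, 1481, 104.0));
        printf("1440p 120Hz: %.0f MHz\n", pixel_clock_mhz(2720, 1481, 120.0));
        return 0;
    }

    That prints roughly 242, 387, 419, and 483 MHz, which lines up with the 330MHz chips topping out around 1440p 60Hz and a 400MHz adapter reaching ~104Hz unless the blanking is tightened.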
     
    Last edited: Jun 5, 2015
  7. eclap

    eclap Banned

    Messages:
    31,468
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
    Yep, agreed with Yasa. AMD just rendered themselves out of the frame for me. Like many others I'm running a Qnix here at 96Hz, and that equals a 393MHz pixel clock with standard timings; I haven't gone reduced because I don't need to at that refresh rate. I'm not willing to spend the extra £££ on an expensive adaptor. It's a pretty boneheaded move by AMD and I honestly don't understand why they did away with DL-DVI.
     
  8. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    Yeah, lame. Loads of people have monitors that would need the adapter.
     
  9. eclap

    eclap Banned

    Messages:
    31,468
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
    Another thing AMD don't need right now is backlash from angry customers who buy their GPUs and find they don't have compatible monitors. There will be a fair few of those, and some of them will take to the internet and bash AMD. It's a bad move.

    Has there been any official word on why AMD went down this route, btw? Anyone know?
     
  10. tiff_lee

    tiff_lee Guest

    Messages:
    7
    Likes Received:
    0
    GPU:
    HD6950 2GB
    Display connectivity is a right mess and definitely needs some unification so everyone is on the same page.

    With regards to the whole active/passive dongle debate, I'm referencing this AMD video on YouTube. Apologies, but due to my low post count I've had to break the link up (not looking to get an infraction, but the video is relevant to what I'm blabbering about):

    www youtube com/watch?list=PL6CFEA6C71BB14979&t=363&v=Jf0X0lNFmgw

    Now, the video states (and as I'm sure most are aware) that the 5/6/7xxx series support two displays over legacy connections before requiring the use of an 'active' DP dongle.
    The legacy connections are VGA/HDMI/DVI, but among them is also DisplayPort to DVI, whereby a digital signal is passed through from the DP port on the card without the need for an active dongle, i.e. the native DisplayPort signal does not need to be converted (around the 4-minute mark in the vid).
    To me that suggests using the DisplayPorts as legacy connections is in effect the same as having two DVI ports on the card, although I could well be misunderstanding that (I have no idea if this method of connection restricts the pixel clock or not). However, fast-forward to around the 6-minute mark and it shows that when using a DVI connection for a third display, you would need to convert the native DisplayPort signal.

    Also, it's possible that board partners could integrate active DisplayPort chipsets onto the cards, negating the purchase of active dongles, as has been the case with previous cards.
     

  11. StarvinMarvinDK

    StarvinMarvinDK Maha Guru

    Messages:
    1,374
    Likes Received:
    119
    GPU:
    Inno3D 4070Ti 12GB
    Maybe they are bundling an adapter along with the gfx?

     
  12. xIcarus

    xIcarus Guest

    Messages:
    990
    Likes Received:
    142
    GPU:
    RTX 4080 Gamerock
    Yep, that's exactly my point. Stuttering is far worse than lower fps in my opinion.
    Maybe AMD have a special trick for this, but I'm not sure. A very advanced compression algorithm could be really good for them.
    But fitting 6GB into 4GB... that's a pretty tall order, although at this point it seems necessary.

    I can relate. Testing with those tools showed that the card indeed has a problem, but really, I haven't seen stuttering in any of my games.
    People made that issue look far worse than it actually is.
    The only thing I was pissed about was Nvidia lying to us.
     
  13. mR Yellow

    mR Yellow Ancient Guru

    Messages:
    1,935
    Likes Received:
    0
    GPU:
    Sapphire R9 Fury
    They usually bundle adapters.
     
  14. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    I'd imagine they feel DVI is on its way out -- which at this point it should be. Aside from a brief resurgence fueled by cheap Korean overclockable monitors, I doubt many gamers who are going to buy a ~$700 GPU don't also have an HDMI/DP screen to go with it.

    Yeah, it's definitely going to come with one for regular DVI monitor users. The only people this change affects are the Korean monitor people, and they represent something like 1/1000th of the gaming community. It probably wasn't even a thought in AMD's minds, and they probably don't care beyond that.
     
    Last edited: Jun 5, 2015
  15. tiff_lee

    tiff_lee Guest

    Messages:
    7
    Likes Received:
    0
    GPU:
    HD6950 2GB
    I highly doubt the removal of DVI was a deliberate attempt to alienate the Korean monitor users; trying to alienate customers, no matter what you're selling, is just bad business practice.
     

  16. MADOGRE

    MADOGRE Guest

    Messages:
    11
    Likes Received:
    0
    GPU:
    Gigabyte 980ti G1
    I don't care for AMD; I've never had a good result from the ones I have owned (4), mainly driver problems. I do hope they are competitive, because we have all seen what happens when they are not: we get a 660 labeled 680 for $499 and a 960 as a 980 for $549-599.
    So for everyone's wallet, let's hope they have something that competes well and keeps prices down.
     
  17. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    They're definitely not 0.1% of those who buy such GPUs. Look at this forum, you'll see just how many have Korean monitors. The Qnix OCN club is also one of the biggest and most-viewed threads there.
     
  18. Humanoid_1

    Humanoid_1 Guest

    Messages:
    959
    Likes Received:
    66
    GPU:
    MSI RTX 2080 X Trio
    Must admit I was tempted by one of those screens for the obvious reasons too. Even so, I hadn't thought so many people went for them!
     
  19. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    OK, so let's pretend that every single post in the Qnix OCN club is a different user who owns the monitor, then let's multiply that by five, because why not.

    That's 117,850 Qnix monitors. Sounds impressive, right? Nvidia sold ~10,000,000 GTX 680/670s alone.
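
    In code, using the post's own numbers (a throwaway sketch; both figures are the generous estimates above, not hard data):

    #include <stdio.h>

    int main(void) {
        double qnixEstimate = 117850.0;   /* generous Qnix owner count from above */
        double gtx670_680 = 10000000.0;   /* ~GTX 670/680 units sold, per the post */
        printf("Qnix share: %.1f%%\n", qnixEstimate / gtx670_680 * 100.0); /* ~1.2% */
        return 0;
    }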

    Like I get that Korean monitors are popular on these tech forums, but these forums are a very minor subset of GPU buyers, and Korean monitor owners are a subset of forum goers. For AMD, I'm willing to bet that "Korean monitor users" never came up in any of their internal conversations about the outputs on this card.

    Don't get me wrong -- I own a Qnix 2710, I use it as a secondary and have it clocked at 96Hz. But I think there comes a time when some of this older stuff needs to be retired, and I don't think that should be postponed based on one monitor, let alone the Qnix.
     
    Last edited: Jun 5, 2015
  20. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    That makes sense my man.

    My objection is that AMD need every single customer. When deciding which interface to get rid of, in this case DVI, they should have considered which monitors rely exclusively on DVI. An AMD driver fixed an issue related to a Korean monitor more than a year ago, so they're well aware of such monitors, especially since those were the only option for 1440p 120Hz for more than two years.

    I would have advocated they get rid of DVI eventually, but definitely not now; it's still 2015. DVI-only monitors were being produced just 2-3 years ago, it's not like VGA. In a couple of years it would have been fine.

    We need cheap DP to DL-DVI adapters to offset the loss, but I'm not holding my breath.
     
