ASIC Quality

Discussion in 'MSI AfterBurner Application Development Forum' started by Hootmon, Oct 14, 2016.

  1. Hootmon

    Hootmon Guest

    Messages:
    1,231
    Likes Received:
    6
    GPU:
    XFX THICC III Ultra
    I hope this is the right place to make this thread...

    I just became aware of an 'ASIC quality' feature in TechPowerUp's GPU-Z. I'm running 1.9.0.

    If you go to the Validation tab and click the three stacked lines icon in the upper right, the GPU Settings dialog has an ASIC Quality tab. Mine is 76.9%.
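    (Side note: GPU-Z can also publish what it reads into a small shared-memory block so other tools can pick the values up. Below is a minimal C sketch that just dumps whatever records GPU-Z exposes; the "GPUZShMem" mapping name and the GPUZ_SH_MEM layout used here are the commonly circulated ones, and whether ASIC quality actually appears among those records is an assumption, so treat this as a sketch rather than a reference.)

        /* Minimal sketch: dump the key/value records GPU-Z publishes via shared memory.
           Assumes the commonly circulated GPUZ_SH_MEM layout; field names and sizes are
           not guaranteed here. Build as a normal Win32 console program. */
        #include <windows.h>
        #include <stdio.h>

        #define MAX_RECORDS 128

        #pragma pack(push, 1)
        typedef struct { WCHAR key[256]; WCHAR value[256]; } GPUZ_RECORD;
        typedef struct { WCHAR name[256]; WCHAR unit[8]; UINT32 digits; double value; } GPUZ_SENSOR_RECORD;
        typedef struct {
            UINT32 version;                          /* structure version */
            volatile LONG busy;                      /* non-zero while GPU-Z updates the block */
            UINT32 lastUpdate;                       /* GetTickCount() of last refresh */
            GPUZ_RECORD data[MAX_RECORDS];           /* static info (card name, BIOS, possibly ASIC quality) */
            GPUZ_SENSOR_RECORD sensors[MAX_RECORDS]; /* live sensor readings */
        } GPUZ_SH_MEM;
        #pragma pack(pop)

        int main(void)
        {
            /* GPU-Z must be running for the mapping to exist. */
            HANDLE h = OpenFileMappingW(FILE_MAP_READ, FALSE, L"GPUZShMem");
            if (!h) { fprintf(stderr, "Could not open GPU-Z shared memory (is GPU-Z running?)\n"); return 1; }

            const GPUZ_SH_MEM *mem = MapViewOfFile(h, FILE_MAP_READ, 0, 0, sizeof(GPUZ_SH_MEM));
            if (!mem) { CloseHandle(h); return 1; }

            /* A robust reader would also respect the 'busy' flag; skipped here for brevity. */
            for (int i = 0; i < MAX_RECORDS; i++) {
                if (mem->data[i].key[0] == L'\0') continue;
                wprintf(L"%ls: %ls\n", mem->data[i].key, mem->data[i].value);
            }

            UnmapViewOfFile(mem);
            CloseHandle(h);
            return 0;
        }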

    Not sure what that means in the 'real world', but I have never seen it before outside of a few comments by Unwinder. That is why I put the thread here. Hope you don't mind, Alexey.
     
    Last edited: Oct 14, 2016
  2. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Potential "silicone quality" in terms of electrical resistance if I understood any of what I've read in some amount of detail, there's been various conflicting reports even from GPU vendors it seems (some AMD engineer explained it a while back it seems.) also something on leakage and as a result potential overclocking limits but that's above my understanding.

    EDIT:
    https://linustechtips.com/main/topic/27079-what-is-asic-quality/

    Has some info. :)

    EDIT: Also not sure if it still holds, but supposedly lower-quality silicon goes into lower-end GPU models, which makes sense in a way.
    http://www.geeks3d.com/20120122/test-asic-quality-of-geforce-gpus/
    (Similar to binning chips for use in lower-end GPUs to avoid waste: disable this or that unit to match the spec of a lower model and re-test; if it passes, it now ships as a Fury chip instead of a Fury X, as an example.)
     
    Last edited: Oct 14, 2016
  3. Hootmon

    Hootmon Guest

    Messages:
    1,231
    Likes Received:
    6
    GPU:
    XFX THICC III Ultra
    I will read that, Jonas, but...

    I'm curious to see what GPU-Z has to say about yours.

    ETA: Moral: reported 'quality' doesn't necessarily equal actual quality.
    Does it mean anything?
     
    Last edited: Oct 14, 2016
  4. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Personally I actually haven't used GPU-Z in a long time. The last time I tried was when I had just built this system with its 6970-based GPU, I think, and it caused a BSOD whenever I ran it (even after an updated version came out), so I haven't used it since. It's probably fine now, but I'm still a bit wary as a result, and I'm not too bothered by it anyway. I'd imagine this particular Fury model would be above average in quality since it's a later-introduced model from when the fabs were better, though that doesn't necessarily follow. Besides, Sapphire EOL'ed it, I believe, in favor of an even newer refresh (the model is called "Nitro" now) with a slightly tweaked cooling design (the "butt plate" part of the heatsink that sticks out over the PCB has a different shroud cover :p ) and some PCB changes, whatever those are for; stability perhaps, but that's all buried in marketing speak.

    It's also a Fury GPU and not a Fury X (though, similar to how Nvidia does not allow custom versions of their Titan models, there are no custom Fury X models to compare against directly, short of people successfully flashing Fury GPUs to the X specs by unlocking them), so it won't be top quality, I'd imagine. Particularly because, judging by the BIOS flashes that enable the disabled compute units (or clusters, whatever they're called), this GPU has a high chance of having actually faulty units rather than ones that are merely locked or disabled.
    (Supposedly also shown by how Fury X models can generally be clocked higher, and with less of a voltage increase required, even when temperatures are kept under control compared to the stock cooling on the regular Fury.)

    Perhaps I'll give it a try though; just because it bugged out years ago (and on Windows 8.1 then, or maybe plain 8 without that update) doesn't mean it'll be a problem now, it's just something that has stuck with me, so to speak. :)


    EDIT:
    Also, not that you can wait forever for the next thing, but had I waited a few more months the 4000 CPU series would have been available with improved functionality such as proper PCI-E 3.0 support instead of whatever it is now. (And probably an upwards of 50% or so cost increase, ha ha.)
    (Not that that's likely what caused the problem above, or that bandwidth is an issue yet, even with only "pseudo" 3.0 support.)

    But at the moment there's nothing worth upgrading to yet for what it would cost (RAM, motherboard and CPU at the least, probably a PSU too, and maybe this time something less overkill just because I could) and what it would amount to in performance. X99 seems like a "dead end" from what I've read, but it's still the only fully high-end motherboard and CPU combo from Intel, as even the upcoming CPUs and the 200-series boards will be a bit in-between, with higher-end models planned for some CPU lineup in maybe 2018, so yeah, heh.
    (Though other improvements besides CPU clock speed or core count are useful too, of course: lower energy consumption for one thing, or actually having native USB 3.x and PCI-E 3.x support, along with an M.2 slot for high-speed SSDs without using up PCI-E slots, and then there's DDR4 too with whatever improvements it has over DDR3 beyond higher speeds and also higher timings, heh.)

    Which leaves AMD, but it's a bit uncertain how their upcoming CPUs will compare, though maybe they'll be a surprise hit. It would be fun to have some competition again at least, even if it might not be at the absolute top end just yet. :D
    (But that's all better suited to a separate topic, I suppose. Heh, guess I'm rambling on again, which means I'm probably pretty tired, sorry. :p )


    EDIT: Going back to ASIC quality again, I almost forgot that even for the same GPU model (e.g. the Sapphire Tri-X Fury, to use mine) the reported quality can still vary a lot from card to card. Every chip is unique because, well, I don't fully understand this either, but the silicon is never truly perfect.
    (They all still work of course, and are binned and tested to certify this, but as a result it's a bit of a "lottery", as it's called, in how well your particular GPU will handle overclocking and/or voltage increases.)

    (or CPU for that matter, I think it's about the same for that.)
     
    Last edited: Oct 14, 2016

  5. Hootmon

    Hootmon Guest

    Messages:
    1,231
    Likes Received:
    6
    GPU:
    XFX THICC III Ultra
    The variance between specific GPUs was sort of my point.

    Thanks again for even trying to address it.
     
