MSI Afterburner .NET Class Library

Discussion in 'MSI AfterBurner Application Development Forum' started by stangowner, Feb 28, 2011.

  1. stangowner

    stangowner Guest

    Messages:
    607
    Likes Received:
    11
    GPU:
    2xMSI N550GTX-Ti Cy II OC
    Yes, I receive 0 too. I think Afterburner only returns that value for some cards. I have a pair of 550 Tis and they return 0 too. If memory serves, it should show in the information dialog in Afterburner. If it's missing there, then you'll get 0 from this API too.

    Using C# code that dumps the complete .NET class:
    Code:
    ***** MSI AFTERBURNER GPU 0 *****
    GpuId = VEN_10DE&DEV_1244&SUBSYS_26101462&REV_A1&BUS_2&DEV_0&FN_0
    Family = GF116
    Device = GeForce GTX 550 Ti
    Driver = 306.97
    BIOS = 70.26.18.00.00
    MemAmount = 0
    
    ***** MSI AFTERBURNER GPU 1 *****
    GpuId = VEN_10DE&DEV_1244&SUBSYS_26101462&REV_A1&BUS_3&DEV_0&FN_0
    Family = GF116
    Device = GeForce GTX 550 Ti
    Driver = 306.97
    BIOS = 70.26.18.00.00
    MemAmount = 0
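    The dump code is essentially just a loop over GpuEntries. A minimal sketch of it (assuming the property names match the labels above):
    Code:
    using System;
    using MSI.Afterburner;
    
    class DumpGpus
    {
        static void Main()
        {
            // Connect to Afterburner's shared memory (throws if it is not running).
            HardwareMonitor mahm = new HardwareMonitor();
    
            for (int i = 0; i < mahm.GpuEntries.Length; i++)
            {
                var gpu = mahm.GpuEntries[i];
                Console.WriteLine("***** MSI AFTERBURNER GPU {0} *****", i);
                Console.WriteLine("GpuId = " + gpu.GpuId);
                Console.WriteLine("Family = " + gpu.Family);
                Console.WriteLine("Device = " + gpu.Device);
                Console.WriteLine("Driver = " + gpu.Driver);
                Console.WriteLine("BIOS = " + gpu.BIOS);
                Console.WriteLine("MemAmount = " + gpu.MemAmount);
            }
    
            mahm.Disconnect();
        }
    }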
    And using your .ps1 code:
    Code:
    MSI Afterburner (API version 2.0)
    GPU #1
      Device: GeForce GTX 550 Ti
      Driver Version: 306.97
      Memory: 0
    GPU #2
      Device: GeForce GTX 550 Ti
      Driver Version: 306.97
      Memory: 0
    - Nick
     
  2. timiman

    timiman Guest

    Messages:
    42
    Likes Received:
    1
    GPU:
    2xR9-290 Asus DCU-II
    No luck here. I've just tried that and I'm still getting the "dead shared memory" message. I also removed the "= Nothing" and got the same results.

    I forgot to mention this in my post: I have an issue with "HardwareMonitorMSIAft" because, when no datatype is set, it is declared as Object. When I use "HardwareMonitorMSIAft = New HardwareMonitor" and then read "HardwareMonitorMSIAft.GpuEntries.Count", I get no values at all. I think this has to do with bad automatic casting from Object to HardwareMonitor. So I left this approach out from day 1.
     
  3. stangowner

    stangowner Guest

    Messages:
    607
    Likes Received:
    11
    GPU:
    2xMSI N550GTX-Ti Cy II OC
    OK, I'll try to look into it further in the next few days.

    Even if you Dim it As HardwareMonitor? Sorry, I haven't really used VB since VB6.
    Code:
    Dim HardwareMonitorMSIAft As HardwareMonitor
     
  4. timiman

    timiman Guest

    Messages:
    42
    Likes Received:
    1
    GPU:
    2xR9-290 Asus DCU-II
    OK. Thanks. I'm available if you want any help with tests etc.

    If I use "Dim HardwareMonitorMSIAft As HardwareMonitor" with MSI Afterburner closed, I get the "MSI Afterburner is not running" exception without being able to handle it. That's why I'm using the Try..Catch.
    Just an idea: I will try to use an "AddHandler" on HardwareMonitorMSIAft when I get home and see if this helps at all.
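    For reference, the same guard in C# terms (a sketch; the constructor is assumed to throw when Afterburner is not running, as described above):
    Code:
    using System;
    using MSI.Afterburner;
    
    class Probe
    {
        static HardwareMonitor TryConnect()
        {
            try
            {
                // Throws when MSI Afterburner is not running.
                return new HardwareMonitor();
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
                return null;
            }
        }
    }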
     

  5. danielkza

    danielkza Guest

    Messages:
    8
    Likes Received:
    0
    GPU:
    Sapphire HD4830
    @stangowner

    Thanks for the confirmation. I resorted to going through the video controllers in WMI and matching them with the device IDs provided by Afterburner. I don't have a setup with two identical cards to confirm it works, but it should if my assumptions are right.


    $abMonitor is an instance of HardwareMonitor, and $abControl is an instance of ControlMemory. The final result is an array of memory sizes in MB whose indexes correspond to the Afterburner GPU indexes.


    Code:
    Function Get-WmiGPUMem($abMonitor)
    {
        $gpuEntries = $abMonitor.GpuEntries
        $gpuMemories = @(0) * $gpuEntries.Length
      
        $videoControllers = Get-WmiObject "Win32_VideoController"
    
        # Look through the video controllers, convert their device IDs to a format comparable to Afterburner's,
        # then select the one that matches the selected GPU
    
        foreach($vc in $videoControllers)
        {
            $driver = Get-WmiObject "Win32_PnPSignedDriver" | ? {$_.DeviceID -eq $vc.PNPDeviceID}
            if($driver.Location -match 'PCI Bus (?<bus>\d+), device (?<device>\d+), function (?<function>\d+)')
            {
                $bus = $Matches['bus']
                $dev = $Matches['device']
                $fn = $Matches['function']
    
                if($vc.PNPDeviceID -match 'PCI\\([^\\]+)')
                {
                    $abDevID = $Matches[1] + ('&BUS_{0}&DEV_{1}&FN_{2}' -f $bus,$dev,$fn)
    
                    for($i=0; $i -lt $gpuEntries.Length; $i++)
                    {
                        if($gpuMemories[$i] -ne 0)
                        {
                            continue
                        }
    
                        if($abDevID -eq $gpuEntries[$i].GpuID)
                        {
                            $gpuMemories[$i] = $vc.AdapterRAM / (1024 * 1024)
                            break
                        }
                    }
                }
            }
        }
    
        return $gpuMemories
    }
    
    Function Get-GPUMem($abMonitor, $abControl)
    {
        $gpuMemories = @(0) * $abMonitor.GpuEntries.Length
        $wmiGpuMemories = $null
    
        for($i=0; $i -lt $abMonitor.GpuEntries.Length; $i++)
        {
            $gpuMonitor = $abMonitor.GpuEntries[$i]
            $gpuControl = $abControl.GpuEntries[$i]
    
            if($gpuMonitor.MemAmount -gt 0) {
                $gpuMemories[$i] = $gpuMonitor.MemAmount / 1024
            } else {
                if($WmiGPUMemories -eq $null) {
                    #Write-Warning "Failed to detect GPU RAM from Afterburner. Reverting to WMI method."
                $WmiGPUMemories = Get-WmiGPUMem $abMonitor   # pass the monitor object itself; the function reads .GpuEntries from it
                }
    
                $gpuMemories[$i] = $WmiGPUMemories[$i]
            }
        }
    
        return $gpuMemories
    }
    
    And if I'm not asking too much, I noticed you have two GTX 500-series cards. Would you mind testing the full script? I only have one GTX 600-series card, and I would like to know that everything works as expected before launching it 'officially'. The script is here, and the needed files are here. You also need MSI.Afterburner.NET.dll, obviously (everything in the same folder).

    The script has a bunch of options, but you don't need to change any of them if you don't want to, except -gpuIndex and possibly -testIterations (so you don't waste too much of your time). By default the script starts at the default clocks, raises them 200MHz at a time, and halves the step each time a test fails.
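    The intended search is basically "climb while stable, halve the step on failure". A rough C# sketch of that logic (the script itself is PowerShell; RunMemTest here is a stand-in for one MemTestG80 run):
    Code:
    using System;
    
    class OcSearch
    {
        // Stand-in for one MemTestG80 pass at the given memory clock (MHz).
        static bool RunMemTest(int clockMhz) { return clockMhz <= 2800; }
    
        static void Main()
        {
            int clock = 2150;   // default memory clock (MHz)
            int step = 200;     // initial increment; halved on every failure
    
            while (step >= 25)  // stop once the step is too small to matter
            {
                if (RunMemTest(clock + step))
                    clock += step;   // stable at the higher clock: keep it
                else
                    step /= 2;       // unstable: retry with a smaller step
            }
    
            Console.WriteLine("Highest stable clock: {0}MHz", clock);
        }
    }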

    Thanks in advance.
     
    Last edited: Jan 24, 2013
  6. stangowner

    stangowner Guest

    Messages:
    607
    Likes Received:
    11
    GPU:
    2xMSI N550GTX-Ti Cy II OC
    Hi,

    I've started this twice, once with some parameters and once without. The results are
    Code:
    PS E:\temp\powershell> .\MemTestG80_AutoOC.ps1
    GPU #0
      Device: GeForce GTX 550 Ti
      Driver: 306.97
      Memory Clock (default): 2150MhzMhz
      Memory Clock (current): 2150MhzMhz
      Memory Size: 1023.6875MB
    GPU #1
      Device: GeForce GTX 550 Ti
      Driver: 306.97
      Memory Clock (default): 2150MhzMhz
      Memory Clock (current): 2150MhzMhz
      Memory Size: 1023.6875MB
    False
    Trying 2350Mhz with 1023.6875MB and 50 iterations: SUCCESS
    Trying 2550Mhz with 1023.6875MB and 50 iterations: SUCCESS
    Trying 2750Mhz with 1023.6875MB and 50 iterations: SUCCESS
    Trying 2950Mhz with 1023.6875MB and 50 iterations: clock_too_high
    SUCCESS
    Trying 3150Mhz with 1023.6875MB and 50 iterations: clock_too_high
    SUCCESS
    Trying 3350Mhz with 1023.6875MB and 50 iterations: clock_too_high
    SUCCESS
    Trying 3550Mhz with 1023.6875MB and 50 iterations: clock_too_high
    SUCCESS
    ......
    Trying 50550Mhz with 1023.6875MB and 50 iterations: clock_too_high
    SUCCESS
    Trying 50750Mhz with 1023.6875MB and 50 iterations: clock_too_high
    SUCCESS
    Trying 50950Mhz with 1023.6875MB and 50 iterations: clock_too_high
    SUCCESS
    Trying 51150Mhz with 1023.6875MB and 50 iterations: clock_too_high
    SUCCESS
    Trying 51350Mhz with 1023.6875MB and 50 iterations: clock_too_high
    SUCCESS
    Trying 51550Mhz with 1023.6875MB and 50 iterations: clock_too_high
    PS E:\temp\powershell>
    In both instances, I pressed Ctrl+C to stop it. I have no idea why the clock keeps climbing 200MHz each test. I was under the assumption it was going to climb 200, 200, 200, fail, -100, 50, 25, fail, -12, etc. until it found the most stable frequency. Am I not understanding this properly?

    Also, Afterburner said an external profile was applied when I started the script, but I did not notice any changes, and it only said it once. Just so you are aware: when applying changes via the API, it takes one second for the changes to be applied, but the script processes the iterations faster than that.
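    In other words, the script should pace itself after each change, something like this (a sketch; ApplyMemoryClock is a placeholder, not a real library call):
    Code:
    using System.Threading;
    
    class PacedApply
    {
        // Placeholder for whatever call pushes the new clock through the API.
        static void ApplyMemoryClock(int clockMhz) { /* commit via ControlMemory */ }
    
        static void Main()
        {
            ApplyMemoryClock(2350);
            Thread.Sleep(1000); // give Afterburner ~1 second to actually apply it
            // ...only then start the next memory test iteration...
        }
    }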

    Let me know your thoughts. I'll test it for you again if you want.
    - Nick
     
  7. danielkza

    danielkza Guest

    Messages:
    8
    Likes Received:
    0
    GPU:
    Sapphire HD4830
    @stangowner:

    You got it right: it was supposed to go up 200 MHz and then slowly back off on failure, but it seems I haven't tested my logic enough. Thanks very much for your help, I'll work on it a bit more. At least I know it's not blowing up completely with two GPUs, which is already a start!
     
  8. stangowner

    stangowner Guest

    Messages:
    607
    Likes Received:
    11
    GPU:
    2xMSI N550GTX-Ti Cy II OC
    Timiman, mind doing me a favor? I've been trying to track down this "dead shared memory" handle issue, and I think I may have found the problem.

    I created a simple app using your snippet as a base to reproduce the problem. Fortunately I can reproduce it quite easily.
    Code:
    Imports System.Threading
    Imports MSI.Afterburner
    
    Module Module1
    
        Sub Main()
            Dim i As Integer
    
            While i < 40
                Try
                    Dim mahm = New HardwareMonitor
                    Console.Write(mahm)
                    mahm.Disconnect()
                    mahm = Nothing
                Catch ex As Exception
                    Console.WriteLine(ex.Message)
                End Try
                i = i + 1
                Thread.Sleep(800)
            End While
    
        End Sub
    
    End Module
    However, after stepping through the code several times and trying several things, I could not for the life of me figure out why it was happening. So, thinking it might be a garbage collection issue, I changed the sleep timer in the loop. Sure enough, anything under 1000-1500ms can cause the issue, while anything over 1500ms does not. Simply increasing that delay makes the issue go away.

    So my question to you is: can you insert an artificial delay of 2 seconds into your application wherever you disconnect/dispose one of these objects? Please let me know if this solves it for you too.

    I have placed a GC.Collect() at the end of the Disconnect() method of the HardwareMonitor class. This forces the garbage collector to run, and it seems to resolve the issue. I can now run my test code in a loop at 100ms, or even comment out the delay and let it run at full speed, and I no longer see the issue. I can update the classes with this change once you confirm and I do more testing.
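    Roughly, the change looks like this (a sketch; the real Disconnect() of course does more than shown here):
    Code:
    using System;
    
    // Sketch only; the rest of the HardwareMonitor internals are omitted.
    public partial class HardwareMonitor
    {
        public void Disconnect()
        {
            // ...existing cleanup that closes the shared-memory mapping...
    
            // Force a collection so the handle to the shared memory is
            // released promptly instead of lingering until the next GC,
            // which appears to cause the "dead shared memory" error on
            // quick reconnects.
            GC.Collect();
        }
    }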

    Thanks,
    Nick
     
  9. timiman

    timiman Guest

    Messages:
    42
    Likes Received:
    1
    GPU:
    2xR9-290 Asus DCU-II
    stangowner,

    I've added a 2000ms delay (even 4000ms) to my application and the problem remains as it was.
    On the other hand, your snippet works without any "dead shared memory" message at all.

    Maybe my sensor-reading logic needs some changes, too.
    For example, the init of the available sensors runs every second (during the test above, this timer was set to 10000ms).
    Also, every second I call GC.Collect() to clear memory, knowing that GC.Collect() is not always fired right away, but is "pushed" to fire sooner.

    To get more realistic results with your idea, it wouldn't be a bad idea to send me a test version of your .dll. If that's OK with you, of course.
     
  10. stangowner

    stangowner Guest

    Messages:
    607
    Likes Received:
    11
    GPU:
    2xMSI N550GTX-Ti Cy II OC
    Sure, just let me update the ControlMemory class too and test it, and then I'll send you a link to download it.

    Thanks for testing with me.
     

  11. LongbowX

    LongbowX Guest

    Messages:
    1
    Likes Received:
    0
    GPU:
    Kingston 8GB
    Hi stangowner,
    I emailed you last night using the address on your code documentation page. Posting here to get your attention; I'm not sure if you check that address.
    -I
     
  12. jwdenotter

    jwdenotter Guest

    Messages:
    1
    Likes Received:
    0
    GPU:
    290 4gb
    Hi,

    First of all, many thanks for the great work on this library!

    I just have one question: is it also possible to configure Afterburner's own settings using this library from C#?

    For instance, there is a general setting that unlocks voltage control (disabled by default) and one that synchronizes properties (like the fan speed) across all cards of the same type.

    I would like to control those settings. Any idea how this can be done?

    Regards,

    Jan Willem
     
  13. stangowner

    stangowner Guest

    Messages:
    607
    Likes Received:
    11
    GPU:
    2xMSI N550GTX-Ti Cy II OC
    This library was updated to resolve an issue where a large number of monitors (40+) would result in an 'Access Denied' error. Simply replace the previous version of the .dll file with the new one.

    The MSI Afterburner Android & iOS apps also use this library. To resolve the issue for those as well:
    • exit the remote server if it is currently running
    • replace the .dll file in the remote server application folder which you downloaded from MSI (same folder as the MSIAfterburnerRemoteServer.exe file)
    • restart the remote server.

    Links in the first post have been updated.
     
    The1, Andy_K and Unwinder like this.
  14. Unwinder

    Unwinder Ancient Guru Staff Member

    Messages:
    17,127
    Likes Received:
    6,691
    Good job, Nick!
     
  15. ragesaq

    ragesaq Member

    Messages:
    29
    Likes Received:
    6
    GPU:
    MSI Sea Hawk EK X 1070 8g
    Fixed my issues! You rock, Nick! Is there an upper limit on the number of monitors? I'm going to load it up :)
     

  16. stangowner

    stangowner Guest

    Messages:
    607
    Likes Received:
    11
    GPU:
    2xMSI N550GTX-Ti Cy II OC
    I had it up to 66, but I don't think there is a limit now.
     
  17. Lino

    Lino Guest

    Messages:
    2
    Likes Received:
    0
    GPU:
    1080 Ti 11gb
    First of all, thank you for the DLL.
    I have problems making it work, though.
    My configuration:
    x64 Windows 7, dual GPU (1080 Ti)
    Afterburner (4.4.2) is open and working.
    The DLL connects to Afterburner, and the ControlMemory class gives Min/Current/Max values for fan, temp limit and so on... but all the other readings are zero.
    Any idea why?
    (screenshot here: https://imgur.com/a/yxSUK )

    This is the full dump of the first GPU:

    ? CM.GpuEntries[0]
    AuxVoltageBoostCur: 0
    AuxVoltageBoostDef: 0
    AuxVoltageBoostMax: 0
    AuxVoltageBoostMin: 0
    AuxVoltageCur: 0
    AuxVoltageDef: 0
    AuxVoltageMax: 0
    AuxVoltageMin: 0
    CoreClockBoostCur: 0
    CoreClockBoostDef: 0
    CoreClockBoostMax: 1000000
    CoreClockBoostMin: -400000
    CoreClockCur: 0
    CoreClockDef: 0
    CoreClockMax: 0
    CoreClockMin: 0
    CoreVoltageBoostCur: 0
    CoreVoltageBoostDef: 0
    CoreVoltageBoostMax: 100
    CoreVoltageBoostMin: 0
    CoreVoltageCur: 0
    CoreVoltageDef: 0
    CoreVoltageMax: 0
    CoreVoltageMin: 0
    FanFlagsCur: None
    FanFlagsDef: AUTO
    FanSpeedCur: 70
    FanSpeedDef: 0
    FanSpeedMax: 100
    FanSpeedMin: 33
    Flags: FAN_SPEED | CORE_VOLTAGE_BOOST | POWER_LIMIT | CORE_CLOCK_BOOST | MEMORY_CLOCK_BOOST | THERMAL_LIMIT | SYNCHRONIZED_WITH_MASTER
    Index: 0
    IsMaster: true
    MemoryClockBoostCur: 890000
    MemoryClockBoostDef: 0
    MemoryClockBoostMax: 1000000
    MemoryClockBoostMin: -502296
    MemoryClockCur: 0
    MemoryClockDef: 0
    MemoryClockMax: 0
    MemoryClockMin: 0
    MemoryVoltageBoostCur: 0
    MemoryVoltageBoostDef: 0
    MemoryVoltageBoostMax: 0
    MemoryVoltageBoostMin: 0
    MemoryVoltageCur: 0
    MemoryVoltageDef: 0
    MemoryVoltageMax: 0
    MemoryVoltageMin: 0
    PowerLimitCur: 150
    PowerLimitDef: 100
    PowerLimitMax: 150
    PowerLimitMin: 50
    ShaderClockCur: 0
    ShaderClockDef: 0
    ShaderClockMax: 0
    ShaderClockMin: 0
    ThermalLimitCur: 84
    ThermalLimitDef: 84
    ThermalLimitMax: 90
    ThermalLimitMin: 65

    If I use the HardwareMonitor class instead, I get the correct values... o_O
    var HME = HM.GetEntry(0, MONITORING_SOURCE_ID.CORE_CLOCK);

    ? HME
    Data: 1784.5
    Flags: None
    GPU: 0
    LocalizedSrcName: "GPU1 Frequenza core"
    LocalizedSrcUnits: "Mhz"
    MaxLimit: 2100
    MinLimit: 100
    RecommendedFormat: "%.0f"
    SrcId: 32
    SrcName: "GPU1 core clock"
    SrcUnits: "MHz"
     
    Last edited: Mar 6, 2018
  18. makar

    makar Guest

    Messages:
    3
    Likes Received:
    0
    GPU:
    GTX 960m
    @stangowner

    Can you make the properties of Header and GPUEntry not read-only? For example, I want to set different settings for different GPUs, and to do that I have to remove the MACM_SHARED_MEMORY_FLAG_SYNC flag from the header's flags and modify MasterGPU.
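    To illustrate, this is what I'd like to be able to write (a hypothetical sketch; these members are read-only today, and the flag name is just the one from my request above):
    Code:
    using MSI.Afterburner;
    
    class PerGpuSettings
    {
        static void Main()
        {
            ControlMemory cm = new ControlMemory();
    
            // What I'd like to be able to write (hypothetical; read-only today):
            //   cm.Header.Flags &= ~MACM_SHARED_MEMORY_FLAG_SYNC; // stop syncing GPUs to the master
            //   cm.Header.MasterGpu = 0;                          // choose which GPU is the master
            //   cm.GpuEntries[1].CoreClockBoostCur = 50000;       // then set per-GPU values
        }
    }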

    Thanks
     
    Last edited: Mar 6, 2018
  19. Lino

    Lino Guest

    Messages:
    2
    Likes Received:
    0
    GPU:
    1080 Ti 11gb
    By the way, the MONITORING_SOURCE_ID enum lacks values 113 (power limit), 112 (temp limit) and 114 (voltage limit), and since
    public HardwareMonitorEntry GetEntry(uint gpuIndex, MONITORING_SOURCE_ID id)
    checks for enum validity, I can't use those sources even though the corresponding SrcIds are present in shared memory...

    Edit: I worked around it with:
    var TLM = HM.Entries.First(e => e.GPU == idx && e.SrcId == 112);
    var PLM = HM.Entries.First(e => e.GPU == idx && e.SrcId == 113);
    var VLM = HM.Entries.First(e => e.GPU == idx && e.SrcId == 114);
     
    Last edited: Mar 6, 2018
  20. ragesaq

    ragesaq Member

    Messages:
    29
    Likes Received:
    6
    GPU:
    MSI Sea Hawk EK X 1070 8g
    I've got it up to 134, no problems here!
     
    stangowner likes this.
