Guru3D.com Forums

MSI AfterBurner Overclock Application Discussion forum: this forum is intended for MSI customers with questions about the AfterBurner overclock utility, which is based on RivaTuner. In this section users help each other out with answers, alongside support staff from MSI.



 
(#176) Unwinder | Moderator
10-21-2016, 15:05 | posts: 13,053 | Location: Taganrog, Russia

Sadly, the Pascal SLI VRAM usage monitoring issue is still not fixed in NVAPI in the recently released 375.57 drivers, but I remind you that you can configure MSIAB to use the alternate D3DKMT VRAM usage monitoring source:

http://forums.guru3d.com/showpost.ph...&postcount=126
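In short (the exact steps are also quoted in post #186 later in this thread), it is a one-value edit in MSIAfterburner.cfg in the AB root folder, followed by an AB restart. A minimal sketch of the relevant section, with all other keys omitted (the default value is 1; 0 switches AB to the alternate D3DKMT / DX performance counter source):

Code:
[NVAPIHAL]
GenericMemoryUsageMonitoring=0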


Alexey Nicolaychuk aka Unwinder, RivaTuner creator
   
(#177) XcroN | Master Guru
Videocard: GTX 1080Ti Aorus Xtreme
Processor: i7 6900K 4.0 GHz | H100i V2
Mainboard: Rampage V Edition 10
Memory: 64GB HyperX 3000MHz
Soundcard: HD 650 + Essence STX II
PSU: Seasonic Platinum 860W
10-21-2016, 18:30 | posts: 326 | Location: Israel

RTSS latest beta.
I don't know if I'm doing something wrong, but ROTTR simply crashes when I try to inject the OSD.

I tried to edit the global config (RivaTuner Statistics Server\Profiles) and added this:
InjectionDelay=10000
InjectionDelayTriggers=steam_api.dll,steam_api64.dll,dxgi.dll
Then I relaunched AB + RTSS.
I start the game launcher, the game runs for about 10 seconds, and then it crashes.
I also tried editing ProfileTemplates; same issue.
ROTTR is the only game that refuses to work, even with just the Steam overlay.
Black Ops 3 works fine with and without the Steam overlay.
ROTTR without the Steam overlay is broken for me on both DX11 and DX12.
I'm probably doing something wrong here, as beta 6 works well in every game except ROTTR.
I've tried everything for the past 4 hours to fix this.
   
(#178) Unwinder | Moderator
10-21-2016, 18:48 | posts: 13,053 | Location: Taganrog, Russia

Quote:
Originally Posted by XcroN
RTSS latest beta.
I don't know if I'm doing something wrong, but ROTTR simply crashes when I try to inject the OSD.

I tried to edit the global config (RivaTuner Statistics Server\Profiles) and added this:
InjectionDelay=10000
InjectionDelayTriggers=steam_api.dll,steam_api64.dll,dxgi.dll
Then I relaunched AB + RTSS.
I start the game launcher, the game runs for about 10 seconds, and then it crashes.
I also tried editing ProfileTemplates; same issue.
ROTTR is the only game that refuses to work, even with just the Steam overlay.
Black Ops 3 works fine with and without the Steam overlay.
ROTTR without the Steam overlay is broken for me on both DX11 and DX12.
I'm probably doing something wrong here, as beta 6 works well in every game except ROTTR.
I've tried everything for the past 4 hours to fix this.
No need to clone the same post in multiple threads; I read everything and comment on what I find important without such "bumping". Also, I have already recommended that you AVOID touching the delayed injection engine and use the default RTSS config instead. Putting a lot of unrelated libraries there is a BAD idea.


Alexey Nicolaychuk aka Unwinder, RivaTuner creator
   
(#179) Thunk_It | Newbie
Videocard: 2 GTX 1080's in SLI
Processor: i7 4820 @ 3.7 GHz
Mainboard: ASUS P9X79
Memory: 16 GB DDR3 2133 MHz
Soundcard: Creative ZX
PSU: EVGA 1200 Watt Platinum
10-21-2016, 19:51 | posts: 33 | Location: South GA, USA

Quote:
Originally Posted by Unwinder
Sadly, the Pascal SLI VRAM usage monitoring issue is still not fixed in NVAPI in the recently released 375.57 drivers, but I remind you that you can configure MSIAB to use the alternate D3DKMT VRAM usage monitoring source:

http://forums.guru3d.com/showpost.ph...&postcount=126
Many thanks, Unwinder, for posting this comment, especially the link; happily, I was able to follow your directions. The only thing I notice now on my LCD display (you are probably aware of this) is that the memory usage for GPU 2 is not displayed.
   
(#180) Unwinder | Moderator
10-21-2016, 20:00 | posts: 13,053 | Location: Taganrog, Russia

That's supposed to be that way on SLI/CF; D3DKMT memory usage is available for the primary card only.


Alexey Nicolaychuk aka Unwinder, RivaTuner creator
   
(#181) gedo | Master Guru
Videocard: XFX 280X DD 3GB
Processor: Intel Core i5-4690K @ 4.3
Mainboard: Gigabyte Z97X-Gaming 3
Memory: 8GB DDR3 2133 CL9 Ares
Soundcard: Asus Xonar DX + ATH-AD700
PSU: Seasonic G-550
10-21-2016, 22:43 | posts: 225 | Location: Finland

Quote:
Originally Posted by Unwinder
Try this build to see if it works better on your system. I've added one more switch to the delayed injection engine (InjectionDelayDirect3D12), which may forcibly disable delayed injection for D3D12-specific functions.

I'm no longer getting the CTD. I tested once with the above build (no CTD), then went back to the previous one (MSIAfterburnerSetup430Build9267_16_10_2016) to confirm, but still got no CTD. I tested multiple times (~7) and only got a CTD once, which I couldn't reproduce. (Getting a CTD on OSD initiation happens every once in a while anyway, usually once every one to three months or so.)

The trial period has ended and I'm now running the full, purchased game. Maybe there's some other component of Origin (or function of Origin In Game) that only runs when you have timed access to a game, and that is the cause of the CTD that always happens? It's conceivable that such a function would be heavily protected against tampering by outside processes, so you couldn't alter the available gaming time.

Also, the full game uses a different .exe than the trial (bf1.exe instead of bf1Trial.exe). My tests were done without an RTSS profile for either.

(I reactivated Origin In Game for testing purposes, BTW.)

Last edited by gedo; 10-21-2016 at 22:54.
   
(#182) Unwinder | Moderator
10-22-2016, 10:28 | posts: 13,053 | Location: Taganrog, Russia

Thanks for testing, that's good info.


Alexey Nicolaychuk aka Unwinder, RivaTuner creator
   
(#183) XcroN | Master Guru
Videocard: GTX 1080Ti Aorus Xtreme
Processor: i7 6900K 4.0 GHz | H100i V2
Mainboard: Rampage V Edition 10
Memory: 64GB HyperX 3000MHz
Soundcard: HD 650 + Essence STX II
PSU: Seasonic Platinum 860W
10-22-2016, 11:14 | posts: 326 | Location: Israel

Quote:
Originally Posted by Unwinder
No need to clone the same post in multiple threads; I read everything and comment on what I find important without such "bumping". Also, I have already recommended that you AVOID touching the delayed injection engine and use the default RTSS config instead. Putting a lot of unrelated libraries there is a BAD idea.
I guess the custom Direct3D support option is not good for this game. Without it, the game works well.
   
(#184) Unwinder | Moderator
10-22-2016, 12:52 | posts: 13,053 | Location: Taganrog, Russia

Quote:
Originally Posted by XcroN
I guess the custom Direct3D support option is not good for this game. Without it, the game works well.
The context help for the custom Direct3D support option strongly recommends that you avoid enabling it globally for all games. The proper usage scenario is to enable it via game profiles, and only for those games where you use modded Direct3D libraries.


Alexey Nicolaychuk aka Unwinder, RivaTuner creator
   
(#185) Unwinder | Moderator
10-22-2016, 13:09 | posts: 13,053 | Location: Taganrog, Russia

The MSI Afterburner 4.3.0 RC package has been updated. MSI Afterburner itself is exactly the same build 9267 as in the previous RC, with no changes, but the bundled RTSS 6.5.0 package has been upgraded to the newer build 8709 (also available as a separate download a few posts above). I'm rather satisfied with the testing results, so most likely this RC will be submitted to MSI for release on Monday.


Alexey Nicolaychuk aka Unwinder, RivaTuner creator
   
(#186) Shadowdane | Maha Guru
Videocard: EVGA 1080Ti SC
Processor: i7-6700K @ 4.7 GHz
Mainboard: Asus Maximus VIII Hero
Memory: GSkill 32GB DDR4-3200 C14
Soundcard: SBZ/ProMonitor800/M8 Sub
PSU: Seasonic X-1050
10-22-2016, 15:55 | posts: 1,262 | Location: Virginia

Quote:
Originally Posted by Unwinder
BTW, considering that I just installed a 1070 SLI system, I reproduced the Pascal SLI VRAM usage bug myself and submitted some important info on it directly to key contacts at NV to help them fix it faster.
And I'm not sure if you guys realize it, but while it is temporarily broken in NVAPI, you may still force AB to use an alternate source to report video memory usage. For AMD/Intel graphics cards there are no dedicated VRAM usage reporting interfaces inside the AMD/Intel driver API, so for AMD/Intel cards AB uses internal DX performance counters to track how much video memory is currently allocated. You may force AB to use the same codepath for NV cards as well: edit MSIAfterburner.cfg in the root AB folder, change GenericMemoryUsageMonitoring from 1 to 0 in the [NVAPIHAL] section, then restart AB. This can be done on any version of AB, including the stable 4.2.0 release or the new 4.3.0 betas.
Thanks for this bit of info! That memory usage bug had been driving me mad!

Why not make that a switch in the configuration settings?
   
(#187) Unwinder | Moderator
10-22-2016, 16:24 | posts: 13,053 | Location: Taganrog, Russia

Because it would be a waste of development resources to do so just to provide a temporary solution which will become completely useless once a fixed driver is released.


Alexey Nicolaychuk aka Unwinder, RivaTuner creator
   
(#188) jststojc | Maha Guru
Videocard: Gigabyte GTX 285
Processor: AMD 940BE @ 3.4 GHz 1.3125 V
Mainboard: Asus M3A79-T Deluxe
Memory: 2x2GB OCZ Reaper 1066
Soundcard: onboard thingy
PSU: Tagan Piperock 800W
10-26-2016, 10:00 | posts: 1,312 | Location: Slovenia

Hi, I searched around the forums but couldn't find anything similar.
I have a GTX 1060 6GB (Windforce OC 2X) running on Win 8.1 with MSI Afterburner 4.3.0 beta 14 and drivers 372.70.
I have overclocked/undervolted the card with the curve editor. I found really stable curves for different goals (e.g. the lowest voltage for 1800 MHz, or the highest frequency at the lowest possible voltage) by thoroughly testing them with MSI Kombustor, Doom, and Call of Duty. Then I saved each of the curves into one of the 5 slots. Now, often when I restart my PC and then try to apply one of the profiles, it automatically shifts the whole curve a few MHz higher, so that the whole thing becomes unstable; even reloading the preset and applying it again does the same thing. The same happens even if I have it set to load the config at startup.
For example, I have one curve that allows a max frequency of 1800 MHz, and the lowest voltage for it is 0.8 V, which is perfectly stable. But sometimes (not always, and I can't figure out why, although it is randomly reproducible) it raises the frequency to 1817 MHz by itself at the same 0.8 V, and then it gets unstable.
My question is: is this a bug or intended behavior, and how can one get around it? Otherwise I always have to load the profile, apply it, then readjust the frequency table and apply it again. Thanks
   
(#189) Unwinder | Moderator
10-26-2016, 10:43 | posts: 13,053 | Location: Taganrog, Russia

It is a bug in the fundamental understanding of GPU Boost. It has been repeated many times that you CANNOT expect to see exact, fixed resulting clock values at any point of the curve. In GPU Boost 3.0 you can only control the offset applied to each point and expect it (i.e. the offset) to be fixed, but you cannot expect the base clock at each point to remain static. It is supposed to change dynamically depending on thermal and power conditions.


Alexey Nicolaychuk aka Unwinder, RivaTuner creator
   
(#190) jststojc | Maha Guru
Videocard: Gigabyte GTX 285
Processor: AMD 940BE @ 3.4 GHz 1.3125 V
Mainboard: Asus M3A79-T Deluxe
Memory: 2x2GB OCZ Reaper 1066
Soundcard: onboard thingy
PSU: Tagan Piperock 800W
10-26-2016, 11:31 | posts: 1,312 | Location: Slovenia

Quote:
Originally Posted by Unwinder
It is a bug in the fundamental understanding of GPU Boost. It has been repeated many times that you CANNOT expect to see exact, fixed resulting clock values at any point of the curve. In GPU Boost 3.0 you can only control the offset applied to each point and expect it (i.e. the offset) to be fixed, but you cannot expect the base clock at each point to remain static. It is supposed to change dynamically depending on thermal and power conditions.
Thanks,
I did read your other post on the forum about Boost 3.0, but as far as I understood it, I expected fixed behavior when using the curve function. Apparently Boost 3.0 sets the base frequency of the lower steps, which we can't(?) control. I'm not sure how the power or thermal envelope would play a role here, at under 50% TDP and under 50 degrees Celsius.
I guess I'll have to set my curves a bit less aggressively.
   
(#191) LocoDiceGR | Master Guru
Videocard: Gigabyte R9 380 4G
Processor: Intel Core i5 750 2.67 GHz
Mainboard: ASUS P7P55-M
Memory: HyperX Fury @ 1866
Soundcard: SoundblasterZ
PSU: XFX TS 550W
10-26-2016, 11:56 | posts: 683 | Location: Greece

Quote:
Originally Posted by Unwinder
I'm rather satisfied with the testing results, so most likely this RC will be submitted to MSI for release on Monday.
Any news?
   
(#192) Unwinder | Moderator
10-26-2016, 12:07 | posts: 13,053 | Location: Taganrog, Russia

The final build of 4.3.0 was submitted to MSI on Monday; it is the same build that I provided here a few posts above. It normally takes them about a week to prepare the PR material and publish the update officially on their servers.


Alexey Nicolaychuk aka Unwinder, RivaTuner creator
   
(#193) Unwinder | Moderator
10-26-2016, 12:15 | posts: 13,053 | Location: Taganrog, Russia

Quote:
Originally Posted by jststojc
I'm not sure how the power or thermal envelope would play a role here, at under 50% TDP and under 50 degrees Celsius.
I guess I'll have to set my curves a bit less aggressively.
Base clocks move up or down by half of the crystal clock step (27/2 = 13.5 MHz) with each 5 C of temperature change. So if you adjust the curve at some static temperature and your card then becomes 5 C cooler, you can see higher clocks than you would expect if you assume the base clock is static.
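To illustrate the arithmetic, here is a rough sketch only (toy numbers; the real GPU Boost 3.0 behaviour also depends on power headroom and the per-point offsets, not just temperature):

Code:
// Toy illustration of the rule above: the base clock shifts by half a
// crystal clock step (27 / 2 = 13.5 MHz) for every full 5 C of temperature
// change. This is not a model of the complete GPU Boost 3.0 algorithm.
#include <cmath>
#include <cstdio>

int main()
{
    const double stepMHz          = 27.0 / 2.0; // 13.5 MHz per 5 C bin
    const double curveClockMHz    = 1800.0;     // clock dialed in while adjusting the curve
    const double calibrationTempC = 50.0;       // GPU temperature when the curve was adjusted
    const double currentTempC     = 45.0;       // the card is now running 5 C cooler

    // One 13.5 MHz step upwards for every full 5 C the card is cooler than
    // it was when the curve was adjusted (and downwards when it is hotter).
    const double bins             = std::floor((calibrationTempC - currentTempC) / 5.0);
    const double expectedClockMHz = curveClockMHz + bins * stepMHz;

    std::printf("Expected clock at %.0f C: %.1f MHz\n", currentTempC, expectedClockMHz);
    return 0;
}

With the numbers from your example (1800 MHz dialed in at 0.8 V, card 5 C cooler on the next run), this prints roughly 1813 MHz, which is close to the shift you are seeing.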


Alexey Nicolaychuk aka Unwinder, RivaTuner creator
   
(#194) jststojc | Maha Guru
Videocard: Gigabyte GTX 285
Processor: AMD 940BE @ 3.4 GHz 1.3125 V
Mainboard: Asus M3A79-T Deluxe
Memory: 2x2GB OCZ Reaper 1066
Soundcard: onboard thingy
PSU: Tagan Piperock 800W
10-26-2016, 12:34 | posts: 1,312 | Location: Slovenia

Quote:
Originally Posted by Unwinder
Base clocks move up or down by half of the crystal clock step (27/2 = 13.5 MHz) with each 5 C of temperature change. So if you adjust the curve at some static temperature and your card then becomes 5 C cooler, you can see higher clocks than you would expect if you assume the base clock is static.
Thank you, this is very informative.
Might it be possible/reasonable to implement logic in Afterburner that compensates for this, so that a frequency set once at a voltage point is always held? (As in my example: stay at 1800 MHz at 0.8 V instead of going to 1813 MHz at 0.8 V when the starting temperature is lower, and likewise with several frequency/voltage points.) It might make overclocking easier and more stable for many people.
   
(#195) Unwinder | Moderator
10-26-2016, 12:37 | posts: 13,053 | Location: Taganrog, Russia

No, sorry. There won't be any dynamic OC logic in AB. GPU Boost 3.0 is complex even without it.


Alexey Nicolaychuk aka Unwinder, RivaTuner creator
   
(#196) shuieryin | Newbie
Videocard: 16g
Processor: i7 3960x
PSU: 1300w
01-14-2017, 17:32 | posts: 1

Quote:
Originally Posted by Unwinder
Yep, that's definitely not supposed to be that way, thanks for reporting. I'll add the RTSSHooksTypes header to the SDK as well.
Dear Unwinder,

I was getting the errors below when building the sample project, and I couldn't find "RTSSHooksTypes.h" anywhere in the local sample code. Could you kindly advise how to create this file? Thanks a lot!

d:\program files (x86)\rivatuner statistics server\sdk\include\rtsssharedmemory.h(9) : fatal error C1083: Cannot open include file: 'RTSSHooksTypes.h': No such file or directory
RTSSSharedMemorySampleDlg.cpp
d:\program files (x86)\rivatuner statistics server\sdk\include\rtsssharedmemory.h(9) : fatal error C1083: Cannot open include file: 'RTSSHooksTypes.h': No such file or directory
Generating Code...
Error executing cl.exe.
   