I just said I read it... "RivaTuner Statistics Server can be incompatible with some third-party On-Screen Display software (e.g. XFire or Steam In-Game Chat). The limitation is not specific to our product; many third-party products are designed to be the only On-Screen Display product in the system and to block their own functionality when any other On-Screen Display software is running." That doesn't explain at all why every single version up until now worked fine, nor does it explain why it stops actual hardware from working. I asked a question nicely, I didn't insult you, and yet all I got was a passive-aggressive response. I thought the whole point of a beta was to ask questions, give feedback and get helpful responses. I'll leave this now so as not to clutter a thread made for talking about it.
And I just said that you didn't read it at all. The revision history and the 6.5.0 changes list contain the answer, and your posts will be ignored until you finally bother to read it. I spend about a DAY before each new version launch documenting all changes and creating the context help with useful info for users. If someone thinks that he is too busy to read the documentation supplied with the software, and at the same time expects the developer to give him personal quotes of documented things, then I'm sorry, but I don't find such questions "nice, genuine and politely put", because they are just stealing my work time.
Thank you for the quick response. I'm still a bit confused, however. I understand that the GPU Boost 3.0 voltage percentage slider doesn't map to a direct offset in millivolts, but rather to a maximum vBIOS-controlled voltage offset influenced by power and thermal conditions. But in my case the voltage does not react at all after setting the voltage slider to +100%, and this under conditions where the temps are below 50 degrees Celsius and power draw is at roughly 90-100% (120% max applied). I also watched the YouTube clip you linked in the other thread. In the clip I saw the guy get an instant +50mV increase after setting the voltage percentage to 100. When I try this, the voltages of both my 1080s stay identical to what they were. I also tried setting up a manual curve, adjusting the voltage offset for each frequency, and even locking a particular frequency and voltage offset using Ctrl+L. It appears I can only lock voltages lower than the BIOS would allow me. The limit appears to be 1.025 V for GPU1 and 1.065 V for GPU2. Am I still held back by something other than temps or power draw? A BIOS limitation perhaps? Thank you!
You must carefully examine the vector of performance-limit-related graphs (i.e. "Temp limit", "Power limit", "Voltage limit" and "No load limit"), as they provide important dynamic feedback from the GPU Boost 3.0 voltage controller, allowing you to see what is currently holding back your clocks/voltages. In my video, starting from 0:26, I moved the mouse cursor to highlight the OSD line with the current performance limits to show that GPU Boost 3.0 will increase the voltage ONLY when the "Voltage limit" graph is signaled.
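For anyone who wants to watch similar limiter flags outside of RTSS, here is a minimal sketch using NVIDIA's public NVML library (assumed installed; link with -lnvidia-ml). Note that NVML only exposes a subset of the limiter flags RTSS reads through NVAPI, so the "Voltage limit" flag itself may not show up here; this is just an illustration of the polling technique, not RTSS's actual code:

/* Sketch: poll NVML's "clocks throttle reasons" bitmask to see which
 * performance limiter is currently active on GPU 0. */
#include <stdio.h>
#include <nvml.h>

int main(void)
{
    nvmlDevice_t dev;
    unsigned long long reasons = 0;

    if (nvmlInit() != NVML_SUCCESS)
        return 1;
    if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS &&
        nvmlDeviceGetCurrentClocksThrottleReasons(dev, &reasons) == NVML_SUCCESS) {
        if (reasons & nvmlClocksThrottleReasonSwPowerCap)
            printf("Power limit active\n");
        if (reasons & nvmlClocksThrottleReasonHwSlowdown)
            printf("HW slowdown (thermal/power brake) active\n");
        if (reasons & nvmlClocksThrottleReasonGpuIdle)
            printf("GPU idle (no load)\n");
        if (reasons == nvmlClocksThrottleReasonNone)
            printf("No limiter active\n");
    }
    nvmlShutdown();
    return 0;
}

Running this in a loop while a 3D load is active gives you roughly the same dynamic feedback the RTSS limit graphs provide.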
^ Why wouldn't it be? I just tested on a 980 Ti and it works great in the BF1 beta. And I just installed over the old version, no fresh install. For a second I was also excited about this, but then I saw mine uses a higher number, 1983 :nerd:
If you saw the earlier build threads, Unwinder preferred that only people with Pascal cards use them, for some reason.
Many thanks for updating this excellent utility! I have only one ongoing "fly in the ointment": since upgrading to Windows 10, RTSS states the following: "Some system components cannot be hooked right now. It is strongly recommended to restart application." I read that this can be caused by security software (antivirus, etc.) blocking the program. With that in mind, I went through Kaspersky Internet Security (which I currently use) and gave RTSS total access to my system. Nevertheless, when I boot up or restart my computer the same message pops up on my screen. Does anyone have a helpful suggestion? If so, I would greatly appreciate it. Also, let me mention that while I was using Windows 7 and then Windows 8, I never saw this message; it only started after I upgraded to Windows 10 (my version of 10 is Pro 64-bit).
That's because he wanted feedback from Pascal users. There were no new features added for Maxwell and below, so it was pretty pointless for us to report anything that wasn't bug-related.
Brilliant bit of software, but I'm curious as to why there is an expiration date built into the software? Shouldn't it be up to the individual whether they want to move on to newer software?
I think all beta versions of Afterburner have used that sort of timer; of course the non-beta release won't have one. (Might be a bit annoying to some, but it does make sure people upgrade from the beta version to the release one, I suppose.) EDIT: Huh, so it's possible to check; I did not know that.
I have a problem with this version. I have 2x 1070 SC and the first card doesn't go down to idle frequencies; it stays at 1050/4000 MHz with my 144 Hz monitor. At 120 Hz the frequencies go down to 240/800 MHz. Same problem with the latest PrecisionX OC 6.0.5.
AB compilation and expiration dates (if set) can always be seen in the AB info window (the "i" button). There is no expiration date defined for this version.
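For the curious, beta expiry timers of this kind are usually nothing more than a hard-coded cutoff date compared against the system clock at startup. Here is a minimal sketch of the technique, purely hypothetical and not Afterburner's actual code, with an assumed cutoff date:

/* Sketch: refuse to run past a hard-coded expiration date. */
#include <stdio.h>
#include <time.h>

int main(void)
{
    struct tm expiry = {0};            /* assumed cutoff: 1 Oct 2016 */
    expiry.tm_year  = 2016 - 1900;     /* tm_year counts from 1900 */
    expiry.tm_mon   = 10 - 1;          /* tm_mon is 0-based */
    expiry.tm_mday  = 1;
    expiry.tm_isdst = -1;              /* let mktime resolve DST */

    if (difftime(time(NULL), mktime(&expiry)) > 0) {
        printf("This beta has expired; please download a newer build.\n");
        return 1;
    }
    printf("Beta still valid.\n");
    return 0;
}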
It is not a "problem with this version". Neither AB nor Precision controls the clocks in real time or defines the idle clock's dependency on the refresh rate; that behavior is defined by the driver.
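A simple way to verify this yourself: close AB/Precision entirely and read the clocks through NVIDIA's NVML library (assumed installed; link with -lnvidia-ml). A minimal sketch:

/* Sketch: read current core/memory clocks for GPU 0 with no
 * overclocking tools running. */
#include <stdio.h>
#include <nvml.h>

int main(void)
{
    nvmlDevice_t dev;
    unsigned int core = 0, mem = 0;

    if (nvmlInit() != NVML_SUCCESS)
        return 1;
    if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS) {
        nvmlDeviceGetClockInfo(dev, NVML_CLOCK_GRAPHICS, &core);
        nvmlDeviceGetClockInfo(dev, NVML_CLOCK_MEM, &mem);
        printf("GPU0: core %u MHz, memory %u MHz\n", core, mem);
    }
    nvmlShutdown();
    return 0;
}

If the card still reports 1050/4000 MHz at 144 Hz with no tweaking tools running, the behavior is coming from the driver's high-refresh-rate clock policy, not from AB or Precision.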