Guru3D.com Forums

MSI AfterBurner Overclock Application Discussion forum
  (#51)
Simplex
Member Guru
 
Videocard: MSI GTX 770 TwinFrozr
Processor: Core i7-2600K@4.6 GHz
Mainboard: ASUS P8Z68 DELUXE
Memory: 2x4GB GoodRAM 1333MHz
Soundcard: SB Zx, Z-5500 THX
PSU: Corsair RM750
Default 04-27-2012, 02:03 | posts: 58 | Location: Poland

Thanks!
   
  (#52)
Unwinder
Moderator
 
Default 04-27-2012, 06:31 | posts: 11,192 | Location: Taganrog, Russia

Quote:
Originally Posted by bdub5886 View Post
MSI Afterburner -> RTSS -> Options -> Frame limiter
It is the OSD server's own framerate limiting technology, supported on both AMD and NVIDIA cards. The technology quoted above and adjustable via EVGA Precision X is an NVIDIA driver feature; Afterburner doesn't control it.


Alexey Nicolaychuk aka Unwinder, RivaTuner creator
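A limiter implemented in the OSD server works at the application/present level rather than inside the GPU driver, which is why it can work on both AMD and NVIDIA cards. The sketch below is only a minimal, generic illustration of that idea (pacing frames by sleeping away the leftover frame time); it is not RTSS code, and the function names are made up for the example.

```python
import time

def run_frame_limited(render_frame, fps_cap=60.0):
    """Call render_frame() at most fps_cap times per second by sleeping
    away the leftover frame time (busy-waits the last ~1 ms for accuracy)."""
    frame_budget = 1.0 / fps_cap
    next_deadline = time.perf_counter()
    while True:
        render_frame()                      # perform one frame of work
        next_deadline += frame_budget
        remaining = next_deadline - time.perf_counter()
        if remaining > 0.001:
            time.sleep(remaining - 0.001)   # coarse sleep, OS timer resolution permitting
        while time.perf_counter() < next_deadline:
            pass                            # short spin to hit the deadline precisely
        if remaining < -frame_budget:       # fell far behind: reset instead of bursting
            next_deadline = time.perf_counter()

if __name__ == "__main__":
    frames = 0
    start = time.perf_counter()
    def fake_frame():
        global frames
        frames += 1
        if frames == 120:                   # stop the demo after ~2 s at 60 fps
            print(f"{frames} frames in {time.perf_counter() - start:.2f} s")
            raise SystemExit
    run_frame_limited(fake_frame, fps_cap=60.0)
```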
   
  (#53)
Simplex
Member Guru
 
Videocard: MSI GTX 770 TwinFrozr
Processor: Core i7-2600K@4.6 GHz
Mainboard: ASUS P8Z68 DELUXE
Memory: 2x4GB GoodRAM 1333MHz
Soundcard: SB Zx, Z-5500 THX
PSU: Corsair RM750
Default 04-27-2012, 11:59 | posts: 58 | Location: Poland

Quote:
Originally Posted by Unwinder View Post
It is the OSD server's own framerate limiting technology, supported on both AMD and NVIDIA cards. The technology quoted above and adjustable via EVGA Precision X is an NVIDIA driver feature; Afterburner doesn't control it.
Thanks for the clarification. That is what I suspected, which is why I asked this question on the forums.

Are there any "real" (practical) differences between these two implementations? Will I be better off in some way if I use the dedicated NVIDIA frame limiter on NVIDIA hardware? Should I switch from Afterburner to Precision X?

Since you are the author of both Precision and Afterburner, will you also be implementing this NVIDIA feature in Afterburner?

As a side note, it seems absolutely bizarre to me that NVIDIA is introducing and advertising a feature of their cards which is not available through their drivers, but only through third-party software!
   
  (#54)
TFL Replica
Master Guru
 
 
Videocard: NVIDIA GTX 570
Processor: Intel Core i5 2500K
Mainboard: ASUS P8P67 Deluxe Rev 3
Memory: 2x2GB Corsair 1600MHz
Soundcard: X-Fi XtremeGamer Fatal1ty
PSU: PCP&C Silencer 750w
Default 04-27-2012, 13:07 | posts: 369

Quote:
Originally Posted by Simplex View Post
it seems absolutely bizarre to me that nVidia is introducing and advertising a feature of their card which is not available through their drviers, but only through third party software!
I don't recall any advertising for a framerate limiter, and it was announced for all cards way before the Kepler launch. They did, however, advertise adaptive vsync, and that can be enabled without the need for third-party software.
   
  (#55)
Simplex
Member Guru
 
Videocard: MSI GTX 770 TwinFrozr
Processor: Core i7-2600K@4.6 GHz
Mainboard: ASUS P8Z68 DELUXE
Memory: 2x4GB GoodRAM 1333MHz
Soundcard: SB Zx, Z-5500 THX
PSU: Corsair RM750
Default 04-27-2012, 14:45 | posts: 58 | Location: Poland

Quote:
Originally Posted by TFL Replica View Post
I don't recall any advertising for a framerate limiter, and it was announced for all cards way before the Kepler launch.
Personally, I don't recall NVIDIA announcing a frame limiter for all cards "way before the Kepler launch", but I might have missed it.
It was available at Kepler's launch and was, for example, tested by Fudzilla:
http://www.fudzilla.com/home/item/26...ested?start=12

Some sources refer to adaptive vsync as a "framerate limiter".

Quote:
They did, however, advertise adaptive vsync, and that can be enabled without the need for third-party software.
On NVIDIA's own website, GeForce.com, they clearly advertise this feature as NVIDIA-exclusive, while at the same time noting that it can only be enabled using third-party software. This still seems bizarre to me.

Last edited by Simplex; 04-27-2012 at 14:48.
   
  (#56)
TFL Replica
Master Guru
 
 
Videocard: NVIDIA GTX 570
Processor: Intel Core i5 2500K
Mainboard: ASUS P8P67 Deluxe Rev 3
Memory: 2x2GB Corsair 1600MHz
Soundcard: X-Fi XtremeGamer Fatal1ty
PSU: PCP&C Silencer 750w
Default 04-27-2012, 14:55 | posts: 369

Quote:
Originally Posted by Simplex View Post
Some sources refer to adaptive vsync as a "framerate limiter".
You'd have to be pretty dumb to mix those two up. One includes vsync, the other doesn't.

Quote:
On NVIDIA's own website, GeForce.com, they clearly advertise this feature as NVIDIA-exclusive, while at the same time noting that it can only be enabled using third-party software. This still seems bizarre to me.
If you read your own link, you'll see that the "framerate target" activated by third-party tools and the "adaptive vsync" available in the control panel are two separate features.
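To sum up the distinction being argued here: adaptive vsync toggles vsync on or off depending on whether the GPU is keeping up with the monitor's refresh rate, while a frame rate target simply caps how many frames are produced per second, independent of vsync. The toy sketch below only puts the two decision rules side by side; the numbers and helper names are assumptions for illustration, not NVIDIA's actual driver logic.

```python
REFRESH_HZ = 60.0     # monitor refresh rate assumed for the example
FPS_TARGET = 58.0     # user-chosen cap for the frame-rate-target feature

def adaptive_vsync_enabled(current_fps):
    """Adaptive vsync: keep vsync on while the GPU can match the refresh rate,
    switch it off when performance drops below it (avoids the 60 -> 30 fps snap)."""
    return current_fps >= REFRESH_HZ

def frame_target_exceeded(current_fps):
    """Frame rate target / limiter: caps how fast frames are produced,
    independent of vsync state; the renderer waits whenever this returns True."""
    return current_fps > FPS_TARGET

if __name__ == "__main__":
    for fps in (75.0, 61.0, 59.0, 40.0):
        print(f"{fps:5.1f} fps -> vsync {'on ' if adaptive_vsync_enabled(fps) else 'off'}, "
              f"limiter {'throttles' if frame_target_exceeded(fps) else 'idle'}")
```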
   
  (#57)
Simplex
Member Guru
 
Videocard: MSI GTX 770 TwinFrozr
Processor: Core i7-2600K@4.6 GHz
Mainboard: ASUS P8Z68 DELUXE
Memory: 2x4GB GoodRAM 1333MHz
Soundcard: SB Zx, Z-5500 THX
PSU: Corsair RM750
Default 04-27-2012, 15:34 | posts: 58 | Location: Poland

Quote:
Originally Posted by TFL Replica View Post
You'd have to be pretty dumb to mix those two up. One includes vsync, the other doesn't.
I did not mix these two up; I know the difference between adaptive vsync and a frame rate limiter. This is what I was referring to:
Quote:
Originally Posted by http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/2.html
The last of the three big features is Adaptive V-Sync. The feature improves on traditional V-Sync, by dynamically adjusting the frame limiter to ensure smoother gameplay.

Quote:
Originally Posted by TFL Replica View Post
If you read your own link, you'll see that the "framerate target" activated by third-party tools and the "adaptive vsync" available in the control panel are two separate features.
I think my previous posts indicate that I am perfectly aware of this; I merely expressed my surprise that NVIDIA's own feature (frame rate target) cannot be enabled in NVIDIA's own drivers, but requires a specific third-party tool (as far as I know, EVGA Precision X is the only tool that supports that feature).
   
  (#58)
heroxoot
Master Guru
 
Videocard: 7970 lightning @1225/1700
Processor: FX 8150 @ 4.4ghz
Mainboard: MSI 990FXA-GD80
Memory: 16GB 2133mhz@1866 G.skill
Soundcard: Craptek/logitech
PSU: OCZ ZX 850w gold
Default 05-01-2012, 04:41 | posts: 218 | Location: TN

Any reason to upgrade from beta 15 if I just have a 6850? Does this version have the kernel mode support so PunkBuster doesn't rage-quit my PC?
   
  (#59)
IKnowJack
Master Guru
 
Videocard: GTX470
Processor: Q-Z80 OC@30000Ghz
Mainboard: Z80-xXx hot pink edtition
Memory: Perkie 32DD
Soundcard: hearing aid
PSU: Fukushima Nuclear Plant I
Default 05-03-2012, 12:40 | posts: 176 | Location: darkside of the moon

Bodgy release :/
Why even have a bloody beta? Half the stuff added in this has never been tested...
Wouldn't it have been wiser to make this final based on the last beta (with minor fixes only), and make a new beta with all the new stuff?

Clocks are showing half their true value in the hardware monitor, both core and shader... I have uninstalled the beta... even removed all profiles... this is on a GTX 470 + 9400 GT, both old enough to have these stupid bugs sorted.

Clocks showed correctly on the first run after removing the profiles, but they returned to half clocks on the next run.

...RivaTuner used to be good, but now there's corporate backing and the code has gone to hell. I guess that's why this crap is called a final release: the bigwigs wanted a full release to package with the newer cards while they forget about the rest of us.

Time to remove this corporate bloatware and move back to a small dev who hasn't been corrupted by $$ and higher powers.

Last edited by IKnowJack; 05-03-2012 at 12:46.
   
  (#60)
Unwinder
Moderator
 
Default 05-03-2012, 12:45 | posts: 11,192 | Location: Taganrog, Russia

Quote:
Originally Posted by IKnowJack View Post
Bodgy release :/

Clocks are showing half their true value in the hardware monitor, both core and shader... I have uninstalled the beta... even removed all profiles... this is on a GTX 470 + 9400 GT, both old enough to have these stupid bugs sorted.

Clocks showed correctly on the first run after removing the profiles, but they returned to half clocks on the next run.

...RivaTuner used to be good, but now there's corporate backing and the code has gone to hell :/

Time to remove this corporate bloatware and move back to a small dev who hasn't been corrupted by $$ and higher powers.
Do you really think that a small dev wishes to work for the community after meeting aggressive, knowledgeless users like you daily? Corrupted by $$, heh. Someone's brain is corrupted by a lack of IQ.


Alexey Nicolaychuk aka Unwinder, RivaTuner creator
   
  (#61)
Pill Monster
Ancient Guru
 
 
Videocard: 7950 Vapor-X 1175/1550
Processor: AMD FX-8320 @4.8
Mainboard: ASUS Sabertooth 990FX R2
Memory: 8GB Kingston HyperX 2400
Soundcard: Audigy 2 Platinum Ex 5.1
PSU: AcBel M8 750
Default 05-03-2012, 12:49 | posts: 23,460 | Location: NZ

lol...
   
  (#62)
IKnowJack
Master Guru
 
Videocard: GTX470
Processor: Q-Z80 OC@30000Ghz
Mainboard: Z80-xXx hot pink edtition
Memory: Perkie 32DD
Soundcard: hearing aid
PSU: Fukushima Nuclear Plant I
Default 05-03-2012, 12:55 | posts: 176 | Location: darkside of the moon

So you don't get paid by MSI at all, Unwinder? I notice you don't deny being corrupted by higher powers...

Lack of IQ? Knowledgeless? Then enlighten me...

Why are the clocks doing this?

edit: on second thought... screw it... uninstalled... I don't need to waste my time when the dev can't even take a little criticism about a major bug

Last edited by IKnowJack; 05-03-2012 at 13:00.
   
  (#63)
Unwinder
Moderator
 
Default 05-03-2012, 13:05 | posts: 11,192 | Location: Taganrog, Russia

Quote:
Originally Posted by IKnowJack View Post
So you don't get paid by MSI at all, Unwinder? I notice you don't deny being corrupted by higher powers...

Lack of IQ? Knowledgeless? Then enlighten me...

Why are the clocks doing this?
Don't tell me that you don't see a difference between being paid for a job and being corrupted by $$. So troll yourself outside this section, boy.
Also, enlighten yourself and fill the void in your head by reading the beta 15 thread and understanding the info about the new NVIDIA clock monitoring API introduced with Kepler in the 300.xx drivers, then try to feel the difference between the previously displayed target clocks and the currently displayed generated clocks. Your limited knowledge is your own problem.
You have one month to browse the forums in read-only mode and read without posting nonsense. Enjoy it.


Alexey Nicolaychuk aka Unwinder, RivaTuner creator
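To restate the technical point buried in this exchange: with the new clock monitoring API in the 300.xx drivers, the reported value is the clock the hardware is actually generating rather than the target clock that was requested, so the two numbers can legitimately differ. Below is a small sketch of reading the current clocks from Python through NVML (the nvidia-ml-py package and an NVML-capable driver are assumed); NVML is a separate interface from the NVAPI path Afterburner uses, so this is only an independent way to cross-check readings, not Afterburner's own method.

```python
# pip install nvidia-ml-py  (assumption: an NVIDIA driver exposing NVML is installed)
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):          # older pynvml versions return bytes
            name = name.decode()
        # Current (actually generated) clocks as NVML reports them right now.
        gpu_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        mem_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
        # Maximum clocks the board reports, for reference against the current values.
        gpu_max = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        print(f"{name}: core {gpu_mhz}/{gpu_max} MHz, memory {mem_mhz} MHz")
finally:
    pynvml.nvmlShutdown()
```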
   
  (#64)
pimp_gimp
Ancient Guru
 
 
Videocard: EVGA Geforce GTX 680 SLI
Processor: Core i7-3770k 4.5Ghz/H220
Mainboard: Asus P8Z77-V Premium
Memory: 8GB Corsair Vengeance
Soundcard: X-Fi Titanium Fatal1ty
PSU: Corsair AX1200
Default 05-03-2012, 19:17 | posts: 5,815 | Location: Tacoma, Washington

Quote:
Originally Posted by IKnowJack View Post

edit: on second thought... screw it... uninstalled... I don't need to waste my time when the dev can't even take a little criticism about a major bug
I know you can't read my reply, but if you had taken the time to read the changes in the beta 15 thread, you would understand that it is not a bug. Sorry, but you are an idiot for saying something is a bug when it's clearly not. As Unwinder (and NVIDIA) pointed out, the way clocks are displayed has changed in the 300.xx drivers.
   
  (#65)
kazama
Master Guru
 
 
Videocard: SLI EVGA GTX 680 SC
Processor: i7 2600k @ 4.5
Mainboard: P8P67 PRO B3 (rev 3.0)
Memory: 8GB Gskill Ripjaws X 1600
Soundcard: ROG Xonar Phoebus
PSU: OCZ ZX 1250w
Default 05-04-2012, 23:59 | posts: 264 | Location: Spain

Hi, I'm on a 6990M in my Alienware. Why can't I OC my card further than 714 MHz? With TriXX I can OC it to 800 without problems, but I much prefer AB. The core clock slider won't go past 714.

Can it be fixed please?
   
  (#66)
St. bluedrop
Member Guru
 
Videocard: MSI GTX 570 TFIII@950MHz
Processor: Intel i5 750@3.8GHz
Mainboard: MSI P55-GD65
Memory: 4GB DDR3 CL8 1600MHz
Soundcard: Monitor Audio RX6
PSU: Corsair TX 650W
Default 05-06-2012, 18:29 | posts: 99 | Location: Sweden

Hey, I'm using the GTX 570 Twin Frozr III and I wonder if Kombustor counts as FurMark?
Any GTX 5xx card gets downclocked when using such applications.
   