Guru3D.com Forums

ATI Tray Tools Generic Discussion: this forum is intended for comments, ideas and general discussion of the ATI Tray Tools utility, which is hosted here at Guru of 3D. The forum is visited by the programmer himself.
Holding Colour Calibration in Full Screen - Solution!
(#1)
Cruachan47
Newbie
 
Videocard: (PCI-E) Radeon X1950 Pro
Processor: AMD Athlon 64X2 6400+
Mainboard: ASRock 939Dual-SATA2(AM2)
Memory: 2GB CrucialBallistix DDR2
Soundcard: SB Audigy2 ZS Platinum
PSU: NeoHE 650W PSU
Holding Colour Calibration in Full Screen - Solution! - 11-27-2008, 17:52 | posts: 16

Hi,

I realize the responses to my previous posts have been somewhat sparse, but for those who have been accompanying me on this somewhat frustrating journey I have at last found a solution!

First, this probably only applies to those of you who, like me, prefer accurate monitor colour calibration and wish to preserve it while running games or simulations in full screen mode. In other words, we want to prevent the graphics driver from replacing the colour lookup table (LUT) in the graphics adapter with default colours and a modified gamma setting.

Second, the only way I have been able to achieve accurate colour calibration is by using a 3rd party utility - I use Spyder2express. Following calibration, a custom .icm monitor profile file is created with all the necessary calibration data, and this is loaded automatically into the graphics adapter's LUT when Windows loads. In the case of Spyder2express, this is done by a LUT data loader, ColorVisionStartup, which appears as an item in msconfig.
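For readers curious what a loader like ColorVisionStartup actually pushes into the card, here is a rough sketch of my own (not Spyder2express's code) of building the kind of 256-entry, 16-bit-per-channel ramp the adapter's LUT holds; a real .icm profile stores measured correction curves rather than a single gamma number.

```python
# Illustrative sketch only (not ColorVisionStartup's actual code): build
# a 256-entry, 16-bit-per-channel gamma ramp of the kind a graphics
# adapter's LUT stores. Real profiles use measured curves, not one gamma.

def gamma_ramp(gamma):
    """Return a 256-entry list of 16-bit values following a power curve."""
    return [round(((i / 255.0) ** (1.0 / gamma)) * 65535) for i in range(256)]

ramp = gamma_ramp(2.2)
# On Windows, three such ramps (R, G, B) packed as a WORD[3][256] array
# would be handed to GDI's SetDeviceGammaRamp() to load the hardware LUT.
```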


PowerStrip

This clever and powerful utility can do many things but, for the purposes of this post, we are only interested in one capability: its ability to capture data preloaded into the graphics adapter's colour LUT.

I quote from a post by Rik Wang at Entech Taiwan:

"1. Calibrate your monitor using the software of your choice
2. PowerStrip menu > Color profiles > Configure
3. Click the "camera" speedbutton, select "Capture" from the dropdown menu,
and provide a descriptive name to the captured gamma ramp.
4. Check "Apply adjustments to non-linear ramp", and select the gamma ramp you just saved from the list.

This gamma ramp is stored in the Windows registry and now replaces the graphics card's default, linear gamma ramp as the basis for all of PowerStrip's other color controls, including PowerStrip's hotkeys and color and application profiles. There are no limits on the number of captured gamma ramps you can use, or their origin.
"

After completing Rik's instructions you should then save a profile via Save As... I called mine "Spyder2express Color Profile".

Next, test that this is now working. Make sure that PowerStrip is set to load automatically with Windows and uncheck your LUT data loader (ColorVisionStartup, in the case of Spyder2express) in msconfig. Reboot your system.

Now for the neat bit.

Right-click the PowerStrip icon in the system tray and select Application Profiles > Configure....

Under Display, color and performance preferences select the colour profile you created earlier.

Under Application or shortcut browse to the shortcut of any application with which you wish to use this colour profile.

Then, under Profiles save the profile with the same name as the application.

Lastly, place a check next to Apply this profile whenever program starts.

That's it!

Now, when you start such an application using its shortcut it will automatically load the customized accurate colour profile into the adapter's LUT and PowerStrip will override any attempts by the driver to impose the defaults.

Hope this has been helpful.

I still say it would be very nice to have this capability in ATI Tray Tools

Mike

P.S.
I did encounter one problem. PowerStrip appears to conflict with ATI Tray Tools. When the two programs are running together I lose the ability to control the GPU fan speeds at different temperatures and monitor the Environment and GPU temperatures in the system tray.

When I go into ATT System Tray Hardware Monitoring applet, the "Enable hardware monitoring" is checked, but the GPU Temperature and Environment Temperature options are greyed out.

Also, in the Overclocking module, when the Fan tab is selected it displays the message: "No supported hardware found"

When PowerStrip is disabled at startup ATT regains full functionality.

I have posted this observation at Entech Taiwan in the hope that there is a solution, but if anyone reading this has cracked this one already please feel free to let us all know how it can be resolved. I wish to continue using ATI Tray Tools for all my 3d settings.

Mike

Last edited by Cruachan47; 11-27-2008 at 18:03.
   
 
(#2)
Cruachan47
11-28-2008, 14:34 | posts: 16

Quote:
Originally Posted by Cruachan47 View Post
I did encounter one problem. PowerStrip appears to conflict with ATI Tray Tools. When the two programs are running together I lose the ability to control the GPU fan speeds at different temperatures and monitor the Environment and GPU temperatures in the system tray
Mike
Rik Wang of EnTech Taiwan was kind enough to provide the solution:

1. Close PowerStrip if it's running
2. Open the pstrip.ini file in Notepad
3. Add the following switch:

[Global Options]
DisableI2C=1

4. Save the file back to disk
5. Reboot

Thank you Rik, now I have the best of both worlds!

Mike

Edit: I've since discovered that this does not work with every card. I tried it with an X800XL and Windows locked up while loading the desktop. After a reboot all was well again, and I found that the DisableI2C=1 line had been removed automatically from pstrip.ini. ATT continues to work properly, monitoring the environment and GPU temps and adjusting the fan speeds. Clearly this tweak is not needed with every card.

(PCI-E)Sapphire ATI Radeon X1950 Pro 512MB (Catalyst 8.10 WHQL)
Samsung SyncMaster 226BW 22" LCD Display Monitor (1680x1050x32)

Last edited by Cruachan47; 11-29-2008 at 00:26.
   
(#3)
saltesc
Newbie
 
Videocard: 8800GTX 1GB
Processor: Athlon 64 4400+
Mainboard:
Memory:
Soundcard:
PSU: 450W
01-16-2009, 09:16 | posts: 1

Man I can't get this to work. It's something that has driven me crazy on and off for months now.

Probably 70% of the games I play dump my profile even though I have Windows set to it, MCW forcing it and Powerstrip apparently taking care of the LUT.

I had Fallout 3 going today and for some reason everything was working with just MCW open. I got to a part where I couldn't see anything and was drowning, so I went to the in-game settings and upped the brightness... this reset the adapter to the crappy colour settings, ignoring my profile. I restarted the game many times but it seems like this was a one-off, and F3 is back to being tinted blue... like Far Cry 2... and Crysis... and GTR2... Need for Speed: Undercover works, but it's an NFS game; it's exciting for 30 mins.

Don't suppose you've found any other methods on your Google searches? As I have said I've tried for months and searched for hours...
   
(#4)
gx-x
Maha Guru
Videocard: Sapphire 280X Dual-X OC
Processor: intel i5 3570K
Mainboard: ASRock Extreme4 Gen3
Memory: patriot 4x2GB @1333 ddr3
Soundcard: Yamaha RX-V550 w/!JBL
PSU: tT SmartSE 530W
11-06-2009, 19:54 | posts: 873 | Location: Serbia

I am sorry to bump this topic up again.
Has anyone found a solution for this LUT problem yet?

I am on Windows 7. I have my hardware-calibrated .icc profile which I use, and it is fine on boot. However, whenever I run CCC it just reverts to the default LUT and I have to re-apply the profile either through colour management or with PowerStrip. When I bind CCC.exe to a PowerStrip application profile that is supposed to load my LUT, it fails to do so.
Also, most games are fine, applications too, but Crysis, for instance, also screws the LUT back to default.

Is there a way TO LOCK applications from changing it? I mean, I would need hours to create a profile in PowerStrip for each and every program or game or whatever. Can this be done globally?
   
 
A (very technical) solution...
(#5)
Orum
Newbie
Videocard: Sapphire 5850 1GB
Processor: i7-860
Mainboard:
Memory:
Soundcard:
PSU: Corsair 550W
A (very technical) solution... - 06-04-2010, 19:13 | posts: 5

First off, what exactly is the problem?

The problem is that D3D9, and likely earlier versions of D3D, have a function called SetGammaRamp; D3D10/11 have an equivalent called SetGammaControl. These functions wouldn't normally be a problem except for one annoying fact: they completely ignore any monitor calibration profile! For D3D9, this happens despite what Microsoft appears to say in their flags section about D3DSGR_CALIBRATE:
Quote:
Originally Posted by MSDN
If a gamma calibrator is installed, the ramp will be modified before being sent to the device to account for the system and monitor response curves. If a calibrator is not installed, the ramp will be passed directly to the device.
calling SetGammaRamp, even with D3DSGR_CALIBRATE, will ignore your color profile and assume standard sRGB.

Is there a workaround?

Yes. As these functions only take effect when the application is fullscreen, you can simply play in windowed mode. Some games (e.g. StarCraft 2) also offer a "windowed fullscreen" mode, which likewise avoids the color override issue.

So, how do we fix it? Well...

It seems the only way to fix this is to prevent the offending application(s) from ever calling SetGammaRamp/SetGammaControl (hereafter simply SGR/SGC). This is not exactly "easy". What I've done is create a DLL that is injected into the calling process; it stops the application from ever reaching the real SGR/SGC, redirecting the calls instead to an impostor function that does nothing.
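As a language-agnostic illustration of the detour idea (the real fix patches the D3D9 COM vtable from the injected DLL; this Python stand-in only shows the shape of it): swap the device's SetGammaRamp entry for a no-op impostor so the game's call never reaches the driver.

```python
# Toy illustration of the detour idea, NOT the real D3D9 vtable patch:
# replace SetGammaRamp with a do-nothing impostor so the application's
# call never disturbs the calibrated LUT.

class FakeDevice:
    def __init__(self):
        self.ramp = "calibrated"               # stands in for the hardware LUT
    def SetGammaRamp(self, ramp):              # the call we want to neutralize
        self.ramp = ramp

def install_detour(device):
    real = device.SetGammaRamp                 # keep the original around
    device.SetGammaRamp = lambda ramp: None    # impostor: silently drop it
    return real

dev = FakeDevice()
install_detour(dev)
dev.SetGammaRamp("bogus sRGB default")         # the game "overrides" gamma...
assert dev.ramp == "calibrated"                # ...but the LUT is untouched
```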

This approach has a few potential problems, some of which I'll list here.
  • If SGR (and probably SGC too, but I haven't tested this yet) is EVER called by the process, there's no point in redirecting the call afterwards--it's simply too late to undo the damage. Why? Because there's no way to restore the original "gamma ramp": the default gamma ramp is bogus. If you GetGammaRamp before SGR is ever called, you'll find that the default ramp is simply red[x] = green[x] = blue[x] = x, where x is the index, which doesn't make sense--the RGB values are supposed to scale between 0 and 65535. If you were to SGR this default ramp, your whole screen would basically be black (the "whitest" point would be RGB(1, 1, 1)).
  • As a consequence of the above, injection and subsequent redirection has to happen before the function calls are ever made. This is made more difficult by the fact that these functions are often called immediately after creation of the D3D device, which in turn is usually created immediately after process execution. This is easy when we can simply CreateProcess w/ CREATE_SUSPENDED, but sometimes games need to be launched from elsewhere--e.g. steam (AFAIK), or if you want to use another application that necessitates external process launching, e.g. Texmod (which I use w/ GuildWars). As such I am working on a way to still inject early into these processes. It will likely be something ugly like hooking CreateProcess, but I don't see another way around this that doesn't carry some variability of success.
  • Some antivirus software treats DLL-injection applications as viruses, as they use syscalls that are suspicious to say the least. DLL injection certainly can be used for harmful purposes, as well as for game cheating, which brings me to my next point...
  • It's quite possible that some anticheat (e.g. punkbuster, VAC) would accidentally detect such function redirection as a cheat. In the sense that you're injecting a DLL and overwriting the vtable, I can see how this would happen. Hopefully it will not be detected as a false positive, in that I'm not redirecting functions typically used for cheats, like EndScene and DrawIndexedPrimitive, but I don't know how the anticheats work and what they do and don't detect as a cheat. This has another consequence in that if an anticheat overwrites the entire vtable (including SGR/SGC), instead of just the ones common to hacks, in order to prevent hooking, it would then destroy our detour and future calls to SGR/SGC would cause incorrect color again.
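To make the first bullet above concrete, here is the arithmetic (my own illustration, not actual D3D output): the ramp GetGammaRamp reports before any SGR call runs value == index, topping out at 255, whereas a genuine linear identity ramp must span the full 16-bit range.

```python
# The bogus "default" ramp D3D9 reports (value == index, max 255) versus
# a correct linear identity ramp spanning the full 16-bit range.
bogus_default = list(range(256))                   # maxes out at 255
identity = [x * 65535 // 255 for x in range(256)]  # 65535 / 255 == 257 exactly

assert max(bogus_default) == 255    # setting this ramp -> near-black screen
assert identity[0] == 0 and identity[255] == 65535
assert identity[1] == 257           # each 8-bit step spans 257 LUT counts
```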

Where can I get it?

I haven't released anything yet, for a couple of reasons. I need to clean up the code, add the feature to support other launchers, verify that everything works (in a lot more games than I've tested with so far), and check what licenses the code I've used and modified is under. Once I've finished I'll post here again to let you guys know.

Last edited by Orum; 06-04-2010 at 19:18.
   
(#6)
gx-x
06-04-2010, 20:42 | posts: 873 | Location: Serbia

Actually, I found a solution; it's free and it's called "Monitor Calibration Wizard". It has an option to lock the LUT to a colour profile you select/load at startup, so, for instance, Crysis will reset the LUT but MCW will just load it up again. It works in Windows 7/Vista/XP.

I use it in conjunction with "QuickGamma", another free app that I use for the actual calibration; then I lock that profile with MCW.

Enjoy
   
(#7)
Orum
06-04-2010, 23:59 | posts: 5

I've tried Monitor Calibration Wizard, but it doesn't seem to work for me, at least in the few games I tried. QuickGamma I'll have to try, though, when I get back to my desktop...

Thanks!

(edit: whoops, misread that...I already have the profile from my i1 Display, but maybe I'll mess around with MCW again)

edit2: Yeah, I can't get MCW to lock the profile. I think the problem is I'm not using a profile generated by it or Quickgamma, but instead an ICC profile from my calibration device. It doesn't let you import a profile anywhere that I can see, and trying to save the current setup doesn't work.

Last edited by Orum; 06-05-2010 at 16:14.
   
(#8)
gx-x
06-06-2010, 15:27 | posts: 873 | Location: Serbia

@Orum:

No, it can't import any regular profiles :/ BUT it can save the current profile you are using under its own "standard". What you need to do is set your monitor profile, open up MCW, then just save the current profile. After you've done that, load it to confirm it's saved. If it is, select it to auto-load on startup and check that lock thingy (no need to use the force checkbox, AFAIK).

If it doesn't save the profile, leave MCW open, uncheck lock, load your ICC again and apply it (outside of MCW, while MCW is running), then save it again in MCW; that should work.
I use QuickGamma because I do not have calibration hardware and MCW calibration gives me very weird results (lighter blacks or darker whites, usually both). It probably doesn't like non-CRT displays, or at least it doesn't go well with my Dell US 2209WA IPS panel.
   
(#9)
Orum
06-07-2010, 23:33 | posts: 5

I applied my profile using the "Windows color management" panel, started up MCW, saved the profile, and (unlike with the default profile) it didn't revert to standard sRGB right away. It still doesn't prevent problems arising from games, however, even if I check the "driver level" option. I'm not sure why, but if I had to guess I'd say it can't figure out how to convert my current profile (an ICCv4 LUT) to its own internal format. Also, there is no guarantee that its internal format even contains all the information stored in the other profile.

Last edited by Orum; 06-07-2010 at 23:36.
   
(#10)
gx-x
06-08-2010, 00:30 | posts: 873 | Location: Serbia

I don't know. ATI CCC used to reset my LUT every time it (deigned to) open, and Warhead also reverts the LUT; with MCW it just reloads my desired profile. But like I said, I don't know if it works with all ICC profiles. AFAIK they are all just gamma values... it should work fine...
   
(#11)
sarelc
Newbie
Videocard: R4870 512MB
Processor: Q9550
Mainboard: EP45-DS4P
Memory: 4GB Redline
Soundcard:
PSU: HX-750
06-29-2010, 02:26 | posts: 2

Orum, I just wanted to send you some encouragement regarding your project. If you're able to release a workable solution for this, you'd better set up a PayPal donation box.
   
(#12)
dragonlore
Newbie
 
Videocard: 280/asus/1000 mb
Processor: core2duo
Mainboard:
Memory:
Soundcard:
PSU: seasonic
07-04-2010, 15:18 | posts: 10

Quote:
Originally Posted by Orum View Post
I applied my profile using the "Windows color management", started up MCW, saved the profile, and (unlike w/ the default profile) it didn't revert to standard sRGB right away. It still, however, doesn't prevent problems arising from games, even if I check the "driver level" option. I'm not sure why, but if I had to take a guess I'd say it can't figure out how to take my current profile (ICCv4 LUT) and convert it to its own internal format. Also, there are no guarantees that its own internal format would even contain all the information stored in the other profile.
I hope that you can release your solution as soon as possible, because it's a big problem.

Monitor Calibration Wizard doesn't work under Windows 7 64-bit. (It works very well under XP with all games, and partially under Vista 32-bit: with some games, like Just Cause 2, NFS Hot Pursuit 2, Crysis 1, etc., but not with others, like HAWX, Resident Evil 5, etc.) The only other option is to play in windowed mode, but few games allow this choice.

I have an iiyama 19" HM903DTB CRT monitor
   
(#13)
Orum
07-05-2010, 16:49 | posts: 5

Thanks for the encouragement! I knew I wasn't alone with this problem... Just wanted to post an update.

I've got the fix working for DX9, DX10, and DX11. I've created a SourceForge project (website to follow after releases) to host the tools once I've put the finishing touches on the injector, which will be done soon, barring any major complications. Actually releasing this will be a little complicated due to all the licenses involved (the code is based on other code I pulled from numerous sources under varying licenses).

Thanks again.

Last edited by Orum; 07-05-2010 at 16:56.
   
(#14)
Orum
07-15-2010, 01:00 | posts: 5

Sorry to double post, but I just wanted to let everyone know it's finally up. You can download v0.1 directly here. Please read the readme if you are lost (well, you should probably read parts of it regardless).
   
(#15)
janos666
Master Guru
Videocard: Sapphire R9 290X 4Gb
Processor: Intel i5-2500K @ 4.4Ghz
Mainboard: Asrock EZ68 E3G3
Memory: Corsair 2x4Gb 2133Mhz CL9
Soundcard: Denon 2809 + Focal Chorus
PSU: Corsair AX860
08-10-2010, 02:28 | posts: 475 | Location: Hungary

Thanks for the DX "fix"es. I think they will come in handy in the future.
But today I want to do something with an OpenGL program: Riddick - AotDA.
It has a SystemGammaRamp.xrg file which contains this data:
Code:
	*none "0, 1"
	*vga "0.000000, 0.118110, 0.173228, 0.196850, 0.228346, 0.244094, 0.267717, 0.291339, 0.307087, 0.330709, 0.354331, 0.377953, 0.401575, 0.425197, 0.440945, 0.464567, 0.480315, 0.503937, 0.527559, 0.551181, 0.574803, 0.598425, 0.629921, 0.653543, 0.677165, 0.700787, 0.732283, 0.763780, 0.795276, 0.826772, 0.881890, 0.952756"
	*component "0.000000, 0.236220, 0.267717, 0.283465, 0.307087, 0.322835, 0.338583, 0.354331, 0.370079, 0.393701, 0.409449, 0.425197, 0.448819, 0.464567, 0.480315, 0.503937, 0.519685, 0.535433, 0.559055, 0.574803, 0.598425, 0.622047, 0.637795, 0.661417, 0.692913, 0.708661, 0.740157, 0.763780, 0.787402, 0.826772, 0.866142, 0.929134"
	*composit "0.000000, 0.291339, 0.314961, 0.330709, 0.346457, 0.362205, 0.377953, 0.393701, 0.417323, 0.425197, 0.448819, 0.456693, 0.472441, 0.496063, 0.511811, 0.527559, 0.543307, 0.559055, 0.582677, 0.598425, 0.614173, 0.637795, 0.661417, 0.677165, 0.700787, 0.724409, 0.748031, 0.771654, 0.803150, 0.834646, 0.866142, 0.952756"
	*s-video "0.20000, 0.291339, 0.314961, 0.338583, 0.354331, 0.370079, 0.385827, 0.401575, 0.417323, 0.433071, 0.448819, 0.456693, 0.472441, 0.488189, 0.511811, 0.527559, 0.543307, 0.559055, 0.582677, 0.598425, 0.622047, 0.637795, 0.653543, 0.669291, 0.700787, 0.724409, 0.740157, 0.763780, 0.795276, 0.826772, 0.866142, 0.913386"
So, it applies some kind of gamma remap for deliberate reasons (maybe a better fit for an HDTV or an old analogue CRT, etc.). But the digital connections (DVI/HDMI/DP) are not listed here, so I think they should trigger the simple "none" curve (or maybe the VGA one, but then it's a bug), which won't hurt if we bypass it.
But how...?
These are simple curves; I can't insert my RGB LUT here. How can I avoid the erasure of the video card's LUT?
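For what it's worth, the curves above are easy to read mechanically. A hedged sketch follows; the file format here is inferred purely from the snippet (one `*name "v, v, ..."` line per output type, each a list of normalized control points), and the resampling step is an assumption about how the game expands them.

```python
# Hedged sketch of reading SystemGammaRamp.xrg curves. Format inferred
# from the snippet above: one '*name "v, v, ..."' line per output type,
# each holding normalized control points (32 for the TV outputs).

def parse_xrg_line(line):
    name, _, values = line.strip().partition(' ')
    points = [float(v) for v in values.strip().strip('"').split(',')]
    return name.lstrip('*'), points

def resample(points, n=256):
    """Linearly interpolate the control points up to an n-entry ramp,
    presumably how the game expands them before applying."""
    out = []
    for i in range(n):
        t = i * (len(points) - 1) / (n - 1)
        lo = int(t)
        hi = min(lo + 1, len(points) - 1)
        frac = t - lo
        out.append(points[lo] * (1 - frac) + points[hi] * frac)
    return out

name, pts = parse_xrg_line('*none "0, 1"')
ramp = resample(pts)   # 'none' expands to a straight 0..1 line: a no-op remap
```

If the digital outputs really do fall through to the "none" curve, the expanded ramp is the identity, consistent with the remap being harmless to bypass.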

By the way, the game is very bright with the default settings (even the black is gray and the overall gamma is too low as well), and it is still too bright after I set the gamma slider to 0% (correct black level but low gamma; my display has low gamma as well, so maybe it is only the loss of my LUT). So maybe it accidentally applies one of the settings above. But it is not good...

EDIT:
I tried renaming this file but nothing changed. The image is still too bright with the default 50% gamma setting, and the LUT is also cleared when the game goes into fullscreen mode.
There is a little gray-ramp test pattern with a few steps on that options page. The last few steps are equally white at any gamma setting. The gamma slider looks more like a brightness slider.
Any ideas?


Otherwise, every game should have a console command to render an 8-bit gray-ramp test pattern.

Last edited by janos666; 08-10-2010 at 02:50.
   
 
