[quote]Ian (the author of cclutch) can probably explain more in detail if he decides to sign up.[/quote]
Indeed, I really would be quite interested in the technical details of implementing it. Right now most games are made exactly the same way: a window is created with WinAPI with a 'flag' to set it to fullscreen. So by default it really isn't the game's fault; it's either Windows or the video drivers that disable the color profile in fullscreen apps.
If there is a way for the application to override that and explicitly ask to use the active color profile, it would be really nice to have it documented, because without it your idea sadly cannot go anywhere.
Alright, sorry it has taken me so long to get signed up here, I've been very busy these last few weeks with personal and school obligations.
Anyway, the problem is, at least on the DirectX front, not a problem of what a game doesn't do, but rather a problem of what a game does do. Full-screen Direct3D applications do not inherently screw up color calibration. The problem occurs when a specific function is called--in Direct3D9 and earlier this is SetGammaRamp(), and in later versions it was moved to DXGI and renamed SetGammaControl() (simply referred to as SGR()/SGC() from here on). These are the functions that cause the loss of color calibration. While I can't say with 100% certainty whether the problem lies in Microsoft's design of the API or in the implementation by the various graphics card vendors (nVIDIA/ATi/Intel), given that it occurs on several vendors' cards, I would assume it is behaving exactly how Microsoft designed it.
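To make that concrete, here is roughly what a game's gamma slider ends up doing on the D3D9 path. This is only an illustrative sketch (the function name and the userGamma parameter are my own, not code from any particular game):

[code]
#include <windows.h>
#include <d3d9.h>
#include <cmath>

// Hypothetical handler for an in-game gamma slider: build a 256-entry ramp and
// push it to the device, replacing whatever LUT your monitor profile loaded.
void ApplyGammaSlider(IDirect3DDevice9* device, double userGamma)
{
    D3DGAMMARAMP ramp;
    for (int i = 0; i < 256; ++i)
    {
        // Map the linear index through the chosen gamma into 16-bit ramp values.
        WORD v = static_cast<WORD>(65535.0 * std::pow(i / 255.0, 1.0 / userGamma) + 0.5);
        ramp.red[i] = ramp.green[i] = ramp.blue[i] = v;
    }

    // Swap chain 0, no calibration requested. This single call is what wipes
    // out the calibrated LUT; the DXGI-era equivalent is
    // IDXGIOutput::SetGammaControl() with a DXGI_GAMMA_CONTROL structure.
    device->SetGammaRamp(0, D3DSGR_NO_CALIBRATION, &ramp);
}
[/code]

Note that the ramp overwrites all 256 entries per channel, so even a "neutral" setting (userGamma = 1.0) still replaces the profile's corrections with a straight line.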
It's even more ambiguous when you look at the documentation. There is actually a flag that can be passed to the function, D3DSGR_CALIBRATE, for which the documentation says: "If a gamma calibrator is installed, the ramp will be modified before being sent to the device to account for the system and monitor response curves." However, upon testing, I found that this either doesn't work as expected (at least on my 5850), or works exactly as expected but the "gamma calibrator" doesn't include installed monitor profiles and is therefore seemingly useless anyway. If it doesn't include the "curves" of monitor profiles, what does it refer to then?
The reasons for even calling such a function to begin with are dubious. Originally, it seems developers used it to "fade out" or "fade in" a scene, or to provide some basic color filtering, which was not very easy to do efficiently back in the day. Judging from my experience playing games both with and without Color Clutch in use (explained later), this doesn't seem to be the practice in any game made in the last ~5 years that I've played.
More recently, games seem to use it in a different fashion: to give users who have no such monitor calibration some in-game adjustment. Ultimately, developers want the game to look the same to the end user as it looked to them, and setting gamma ramps is a quick, though incomplete and incorrect, fix for this. A better way is to let users set brightness, contrast, and gamma through a shader, which is what I believe some of the newer games are doing. For instance, I recently purchased BF:BC2, and it has brightness and contrast (no gamma) adjustments which thankfully never call SGC().
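The math such a shader applies per pixel is trivial; in C++ terms, one common formulation looks like this (the names and exact formula are just my illustration, and a real game would do this on the GPU in a post-process pixel shader rather than on the CPU):

[code]
#include <algorithm>
#include <cmath>

// Per-channel adjustment a post-process shader would apply (values in 0..1).
// Because this happens inside the rendered frame itself, the hardware LUT set
// by the monitor profile is never touched.
float AdjustChannel(float c, float brightness, float contrast, float gamma)
{
    c = (c - 0.5f) * contrast + 0.5f + brightness;  // contrast around mid-grey, then brightness offset
    c = std::pow(std::max(c, 0.0f), 1.0f / gamma);  // optional gamma bend
    return std::min(std::max(c, 0.0f), 1.0f);       // clamp back into range
}
[/code]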
The easy fix for those who want full-screen gaming while holding on to their monitor calibration is to simply not call the function. Unfortunately, most games call SGR()/SGC() no matter what, even if no adjustments have been made to the gamma in the settings. That is why I wrote Color Clutch. The theory behind it is simple: prevent the games from calling these functions, and your color calibration will survive. There aren't a whole lot of good ways to do this, so I took what I thought was the best one: injecting a DLL into the process so that whenever the game tries to call SGR()/SGC(), it instead calls my "bogus" function with the same parameters. The only difference between the real function and mine is that mine doesn't actually do anything.
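To give you an idea, the D3D9-side replacement is conceptually nothing more than the following. This is a simplified sketch rather than the literal code from Color Clutch, and it assumes the device's SetGammaRamp entry has been redirected to this function:

[code]
#include <windows.h>
#include <d3d9.h>

// Stand-in for IDirect3DDevice9::SetGammaRamp. It takes the same parameters
// (including the implicit 'this' pointer, since the real thing is a COM
// method) and simply ignores them, so the calibrated LUT is never overwritten.
void WINAPI BogusSetGammaRamp(IDirect3DDevice9* /*device*/, UINT /*iSwapChain*/,
                              DWORD /*flags*/, const D3DGAMMARAMP* /*ramp*/)
{
    // Intentionally empty. The real SetGammaRamp returns void, so there is
    // nothing we even need to fake for the game's benefit.
}
[/code]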
As for current plans regarding Color Clutch, I'm working on support for some older versions of DirectX (specifically 6, which I think is the first version to include an SGR() function, and 7). This is made difficult by the complete lack of documentation pertaining to these old APIs, but I should eventually be able to get something out that works. OpenGL, though, has no apparent analog to SGR()/SGC(), so I don't believe I can do anything there, and I'm not entirely sure how, why, or even whether some users are losing calibration in OGL games.
Sorry about the long post, but I thought it was better to be thorough than lacking. :)