Nvidia said: Thank you for the update, and I apologize for the confusion.
After doing further research, I found that NVIDIA does not control, and has no function to control, the color settings while playing games in full-screen mode. Since these features are application specific, they are controlled by the applications themselves. Therefore, please use the game's settings if applicable.
"Write directly to palette DAC" is an EnTech expression that predates NVidia, and essentially refers to bypassing the OS and the display driver and writing directly to the hardware. NVidia does not approve of this method, and starting with the GeForce 8 no longer shares hardware register data with EnTech.
Anyway, the problem is, at least on the DirectX front, not a matter of what a game doesn't do, but rather of what a game does do. Full-screen Direct3D applications do not inherently screw up color calibration. The problem occurs when a specific function is called: in Direct3D 9 and earlier, this is SetGammaRamp(), and in later versions it was moved to DXGI and renamed SetGammaControl() (referred to simply as SGR()/SGC() from here on). These are the functions that cause the loss of color calibration. While I can't say with 100% certainty whether the problem lies in Microsoft's design of the API or in the implementations by the various graphics card vendors (nVIDIA/ATi/Intel), given that it occurs on several vendors' cards I would assume it is behaving exactly as Microsoft designed it.
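To make the newer variant concrete, here is a minimal sketch of the DXGI-era call, assuming you already hold an IDXGIOutput* for a full-screen swap chain (the helper name SetIdentityGamma and the identity curve are my illustration, not any particular game's code):

```cpp
#include <dxgi.h>

// Builds an identity gamma curve and hands it to DXGI. Even this "neutral"
// call replaces whatever calibrated LUT the OS/driver had loaded.
HRESULT SetIdentityGamma(IDXGIOutput* output)
{
    DXGI_GAMMA_CONTROL gc = {};
    gc.Scale  = { 1.0f, 1.0f, 1.0f };   // per-channel scale
    gc.Offset = { 0.0f, 0.0f, 0.0f };   // per-channel offset
    for (int i = 0; i < 1025; ++i) {
        float v = i / 1024.0f;          // linear (identity) curve
        gc.GammaCurve[i] = { v, v, v };
    }
    return output->SetGammaControl(&gc);
}
```

Note that SetGammaControl() only takes effect in full-screen mode, which is part of why windowed games tend to leave calibration alone.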
It's even more ambiguous when you look at the documentation. There is actually a flag that can be passed to the function, D3DSGR_CALIBRATE, for which the docs say: "If a gamma calibrator is installed, the ramp will be modified before being sent to the device to account for the system and monitor response curves." However, upon testing, I found that this either doesn't work as expected (at least on my 5850), or works exactly as documented but the "gamma calibrator" doesn't include installed monitor profiles, which would make it seemingly useless anyway. If it doesn't include the curves of monitor profiles, what does it refer to?
The reasons for even calling such a function in the first place are dubious. Originally, it seems developers used it to fade a scene out or in, or to provide some basic color filtering, which was not easy to do efficiently back in the day. Judging from my experience playing games both with and without Color Clutch in use (explained later), this doesn't seem to be the practice in anything made in the last ~5 years that I've played; a sketch of the old trick follows below.
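For reference, the old fade trick boils down to something like this minimal sketch (assuming a D3D9 device; not taken from any actual game):

```cpp
#include <d3d9.h>

// Hypothetical sketch of the legacy "fade" trick: scale an identity ramp by
// a fade factor and push it into the hardware LUT. This single call is
// enough to wipe out whatever calibration was loaded at the OS/driver level.
void FadeScreen(IDirect3DDevice9* device, float fade) // 0.0 = black, 1.0 = normal
{
    D3DGAMMARAMP ramp;
    for (int i = 0; i < 256; ++i) {
        // i * 257 maps 0..255 onto the full 0..65535 WORD range.
        WORD v = static_cast<WORD>(i * 257 * fade);
        ramp.red[i] = ramp.green[i] = ramp.blue[i] = v;
    }
    device->SetGammaRamp(0, D3DSGR_NO_CALIBRATION, &ramp);
}
```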
More recently, games seem to use it in a different fashion: to give users who have no monitor calibration some in-game adjustment. Ultimately, developers want the game to look the same to the end user as it looked to them, and setting gamma ramps is a quick, though incomplete and incorrect, fix for this. A better way is to let users set brightness, contrast, and gamma through a shader, which I believe some newer games are doing. For instance, I recently purchased BF:BC2, and it has brightness and contrast adjustments (no gamma) which thankfully never call SGC().
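The shader approach amounts to per-pixel math on the game's own output rather than on the hardware LUT. A hypothetical sketch of that math, written here as plain C++ for clarity (a real game would do the same thing in a pixel shader):

```cpp
#include <algorithm>
#include <cmath>

// Illustrative only: the per-channel adjustment a post-process shader can
// apply. Because it touches only the rendered image, the OS/driver
// calibration underneath is never disturbed.
float AdjustChannel(float c, float brightness, float contrast, float gamma)
{
    float v = (c - 0.5f) * contrast + 0.5f + brightness; // pivot contrast at mid-gray
    v = std::clamp(v, 0.0f, 1.0f);                       // keep it a valid color
    return std::pow(v, 1.0f / gamma);                    // gamma applied last
}
```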
The easy fix for those who want full-screen gaming while holding on to their monitor calibration is to simply not call the function. Unfortunately, most games call SGR()/SGC() no matter what, even if no gamma adjustments have been made in the settings. That is why I wrote Color Clutch. The theory behind it is simple: prevent the games from calling these functions, and your color calibration will survive. There aren't a whole lot of good ways to do this, so I took what I thought was the best one: injecting a DLL into the process so that whenever the game tries to call SGR()/SGC(), it instead calls my "bogus" function with the same parameters. The only difference between the real function and mine is that mine doesn't actually do anything.
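For the curious, the mechanism looks roughly like this. The sketch below uses the open-source MinHook library and a throwaway device to reach the d3d9 vtable; the names and structure are my illustration of the approach, not Color Clutch's actual source:

```cpp
#include <windows.h>
#include <d3d9.h>
#include "MinHook.h" // https://github.com/TsudaKageyu/minhook

typedef void (WINAPI *SetGammaRamp_t)(IDirect3DDevice9*, UINT, DWORD,
                                      const D3DGAMMARAMP*);
static SetGammaRamp_t Real_SetGammaRamp = nullptr;

// The "bogus" replacement: same parameters as the real call, but it never
// touches the LUT, so the loaded calibration survives.
static void WINAPI Hooked_SetGammaRamp(IDirect3DDevice9*, UINT, DWORD,
                                       const D3DGAMMARAMP*)
{
    // Intentionally empty.
}

// Create a throwaway device just to read the vtable, then redirect slot 21
// (SetGammaRamp's position in the d3d9.h method order; verify against your
// headers) to the stub above.
static bool InstallHook(HWND hwnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return false;

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow = hwnd;

    IDirect3DDevice9* dev = nullptr;
    if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                                 D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                 &pp, &dev))) {
        d3d->Release();
        return false;
    }

    void** vtbl = *reinterpret_cast<void***>(dev);
    bool ok = MH_Initialize() == MH_OK
           && MH_CreateHook(vtbl[21],
                            reinterpret_cast<LPVOID>(&Hooked_SetGammaRamp),
                            reinterpret_cast<LPVOID*>(&Real_SetGammaRamp)) == MH_OK
           && MH_EnableHook(MH_ALL_HOOKS) == MH_OK;

    dev->Release();
    d3d->Release();
    return ok;
}
```

Because MinHook patches the function itself rather than the vtable entry, the hook also covers the device the game creates later, not just the dummy one.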
As for current plans regarding Color Clutch, I'm working on support for some older versions of DirectX (specifically 6, which I think is the first version to include an SGR() function, and 7). This is made difficult by the near-complete lack of documentation for these old APIs, but I should eventually be able to get something out that works. OpenGL, though, has no apparent analog to SGR()/SGC(), so I don't believe I can do anything there, and I'm not entirely sure how, why, or even whether some users are losing calibration in OpenGL games.
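In case it helps anyone digging through the same scarce docs: as far as I can tell, the DirectDraw-era analog is IDirectDrawGammaControl, reached by QueryInterface on the primary surface. A hedged sketch, with the interface and flag names as I understand them from the DX7 headers:

```cpp
#include <ddraw.h> // link against dxguid.lib for the IID

HRESULT SetDDrawRamp(IDirectDrawSurface7* primary, DDGAMMARAMP* ramp)
{
    IDirectDrawGammaControl* gamma = nullptr;
    HRESULT hr = primary->QueryInterface(IID_IDirectDrawGammaControl,
                                         reinterpret_cast<void**>(&gamma));
    if (SUCCEEDED(hr)) {
        hr = gamma->SetGammaRamp(0, ramp); // 0, or DDSGR_CALIBRATE
        gamma->Release();
    }
    return hr;
}
```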
Sorry about the long post, but I thought it was better to be thorough than lacking.
MCW uses some Windows API (application programming interface) calls to adjust the colors, basically a couple of functions that set the gamma ramp lookup table. It works well within Windows and in a number of games (specifically OpenGL-based games). However, support within Direct3D games depends on the video drivers "caring" about this lookup table and carrying the settings over into what is rendered through Direct3D. Support in that regard has typically been kept, but it does get broken from time to time by the video card driver developers.
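Presumably those couple of functions are GetDeviceGammaRamp()/SetDeviceGammaRamp() from gdi32. A minimal sketch of the latter, assuming a simple power-law curve (the helper name ApplyGamma is mine):

```cpp
#include <windows.h>
#include <cmath>

bool ApplyGamma(double gamma)
{
    // Three 256-entry tables of 16-bit values, one per channel.
    WORD ramp[3][256];
    for (int i = 0; i < 256; ++i) {
        WORD v = static_cast<WORD>(std::pow(i / 255.0, 1.0 / gamma) * 65535.0);
        ramp[0][i] = ramp[1][i] = ramp[2][i] = v;
    }
    HDC screen = GetDC(nullptr);
    BOOL ok = SetDeviceGammaRamp(screen, ramp); // drivers may reject extreme ramps
    ReleaseDC(nullptr, screen);
    return ok == TRUE;
}
```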
The other issue is that the video card drivers themselves have their own color correction support built in, and that correction often takes precedence over the Windows gamma ramp tables. I have an option in MCW, "apply fix to override driver level color correction", which attempts to work around some video card driver implementations. It does this by setting the gamma ramp table twice, with two different profiles. Oddly, that has worked decently as a workaround for the video card driver implementations, but it may do nothing to help in your case.
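Illustrating the described workaround under the same assumptions (MCW's actual mechanism isn't public, so this only mirrors the behavior as stated: set the table twice, with two different profiles):

```cpp
#include <windows.h>

// Set a trivially different ramp first, then the real one, so the driver's
// own correction gets knocked aside before the target profile lands.
bool ApplyRampTwice(WORD target[3][256])
{
    WORD nudged[3][256];
    for (int c = 0; c < 3; ++c)
        for (int i = 0; i < 256; ++i)
            nudged[c][i] = static_cast<WORD>(target[c][i] ^ 1);

    HDC screen = GetDC(nullptr);
    BOOL ok = SetDeviceGammaRamp(screen, nudged)
           && SetDeviceGammaRamp(screen, target);
    ReleaseDC(nullptr, screen);
    return ok == TRUE;
}
```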
In the long run, methods other than the gamma ramp table may need to be used to support games. I haven't dug into it too much, but it may be possible to modify the Direct3D drawing surface used by a game (unless Windows security prevents it). Otherwise, I would look at utilizing the APIs available in each video card driver. I know ATI does a good job of providing a programming interface to their drivers, so I wouldn't expect it to be difficult to get things working well on an ATI-based video card. I haven't looked at nVidia yet, but I imagine they have something similar available. Of course, going that route also means building against a number of video card manufacturers' programming interfaces (APIs), and supporting multiple versions of and changes to those APIs, which gets to be a pain. This is why I think it might be better to bring pressure on the video card developers to keep proper support for the Windows gamma ramp API, so existing software doesn't break.
Back when Vista came out, Microsoft broke support for the gamma ramp API by requiring applications to ask for permission before it could be used. This basically broke all the calibration software out there that wasn't written for Vista, and virtually all OpenGL-based games (as far as color correction built into the games goes). Later, Microsoft backed down on the security restriction, so everything worked as it did before. Some of the video card manufacturers also built workarounds into their drivers because of how slow Microsoft was in dealing with it.
There's no universal fix, but there are generally workarounds. I am a gamer myself, so I would certainly attempt to get things working within games if I can.
"This is your code. These are also your bugs. Really. Yes, the API runtime and the
driver have bugs, but this is not one of them. Now go fix it already." -fgiesen
Beast Mode said:
I have to reactivate it every time I start up my computer.
i7 @ 4GHz
Radeon HD7950 @ 1180/1500
A-DATA 4x2GB @1600MHz CL7
Intel X25-V 40GB + few HDD >4.5TB total
Vista Home Premium + rarely used XP&Win7
Xonar Essence STX + Technics RP-F30
Sony GDM-FW900 24" Trinitron CRT (no AG)
LG Flatron W2420R RGB-LED A-TW H-IPS
We want custom LUTs, as there are a lot of good monitors with flaws that can only be corrected through calibration...
NVIDIA used to support loading ICC profiles directly in their control panel. They removed it a long time ago...