Would it be possible to force a screen-space shader to do a 3D lookup on the output RGB values? (Like how the new "Morphological Anti-Aliasing" post-processing filter works on every frame's output?)
I imagine it as the ultimate color correction method for PC displays.
VGA cards already have a 1D LUT for display calibration (white balance and gamma correction), but we need a 3D lookup for gamut correction, and right now that has to be done at the software level.
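To illustrate why the existing 1D LUTs aren't enough: gamut errors involve crosstalk between channels, which per-channel curves cannot express. A minimal sketch (the matrix values below are made up purely for illustration, not measured from any real panel):

```python
import numpy as np

# Hypothetical matrix mapping sRGB primaries onto a wide-gamut panel's
# native primaries; the off-diagonal terms are channel crosstalk.
M = np.array([[0.80, 0.15, 0.05],
              [0.10, 0.85, 0.05],
              [0.03, 0.07, 0.90]])

pure_red = np.array([1.0, 0.0, 0.0])
print(M @ pure_red)  # pure red also excites the G and B channels
```

Any three 1D curves applied to [1, 0, 0] can only change the R value and leave G and B at zero, so no set of per-channel curves can reproduce (or undo) this crosstalk; that takes at least a matrix, and in the general non-linear case a 3D lookup.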
The problem is that every single application needs to implement a color management module on its own. Some professional photo editing software and even some end-user video players do this (more or less), but no PC video game ever does. (And it's usually limited to the ICC standards. -> More about that later...)
I think this is an increasing problem today. It's getting harder or even impossible to find quality displays (modern IPS or PVA panels with high resolution, or even plasma TVs for that matter) with a standard sRGB gamut. Wide-gamut displays are taking over from the standard gamut, which is good for some professionals (who are the main targets of quality displays, or for some "blind end-users" who like to see red skin tones and neon-green vegetables in movies, etc.) but very bad for other end-users (and it can also hurt some professionals in some cases).
*And this is another personal gripe, but I am sick of the ICC color management standards. This separated calibration+profiling workflow (with many but mostly useless rendering intents and other limitations inherited from traditional assumptions/practices) is an outdated approach with today's hardware.
I think it would be much better to do every possible correction in one single place, through a carefully created 3DLUT. (By everything I mean: measure many color patches to construct the 3D device gamut, and provide an output which corrects the white point, white balance, tonal response, saturation; everything in 3D at once.)
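As a rough sketch of what the lookup step itself amounts to, here is the trilinear 3DLUT sampling a GPU's texture unit performs for free, written out in plain Python/NumPy (the 17-per-axis LUT size and the identity contents are placeholders; a real LUT would hold the measured corrections):

```python
import numpy as np

def apply_3dlut(rgb, lut):
    """Trilinear lookup of one RGB triple (components in [0, 1]) in a
    LUT of shape (n, n, n, 3), the way a GPU's 3D texture sampler would."""
    n = lut.shape[0]
    pos = np.clip(rgb, 0.0, 1.0) * (n - 1)  # position in LUT index space
    i0 = np.floor(pos).astype(int)          # lower corner of the cell
    i1 = np.minimum(i0 + 1, n - 1)          # upper corner (clamped at edge)
    f = pos - i0                            # fractional position in the cell
    out = np.zeros(3)
    # Blend the 8 corners of the enclosing cell.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                corner = (i1[0] if dr else i0[0],
                          i1[1] if dg else i0[1],
                          i1[2] if db else i0[2])
                out += w * lut[corner]
    return out

# An identity LUT (17^3 entries, a common profiling grid size) maps
# every color to itself.
n = 17
axis = np.linspace(0.0, 1.0, n)
identity = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)
print(apply_3dlut(np.array([0.25, 0.5, 0.8]), identity))
```

In a real shader this whole function collapses into a single 3D texture fetch with hardware trilinear filtering, which is exactly why doing it as a post-processing pass would be so cheap.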
This is not only my idea. Some high-end displays do it already: they store these 3DLUTs in hardware memory and do the 3D lookup internally. But those are rare and very expensive, and the same could be done with shaders.
I think this idea is getting more reasonable with high-precision shader calculations and high bit-depth display formats -> deep-color compatible displays.
I don't even ask for a complete solution (though I would have some suggestions on how to do it relatively easily and at low cost; one single talented person could do it in a few days). I only want the possibility, in the drivers, to run custom shaders as the very last processing step on the output values.
It could be useful for many people, as it could be used for any kind of custom post-processing. But to stick with the current idea: everybody would be free to create their own custom 3DLUT and their own shader to perform the lookup.
It would require some flexibility to serve every possible need, but a fixed 8-bit RGB -> 16-bit RGB lookup would be more than good. (At least for a first test. Yes, I know, it would require DX11 cards, but we are already at the second generation of DX11 hardware.)
But an 8-bit RGB -> 10-bit RGB lookup would be good as well, as 10-bit display output is the best you can practically get with current hardware.
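A quick illustration of why the output side wants more bits than the 8-bit input (the gamma exponents below are arbitrary placeholders, not a real display's response):

```python
import numpy as np

# All 256 possible 8-bit input codes, normalized to [0, 1].
codes = np.arange(256) / 255.0

# A hypothetical mild tone-response correction.
corrected = codes ** (2.2 / 2.4)

out8 = np.round(corrected * 255).astype(int)    # quantized back to 8 bits
out10 = np.round(corrected * 1023).astype(int)  # quantized to 10 bits

# 8-bit output merges some formerly distinct input codes (banding),
# while 10-bit output keeps all 256 input levels distinct.
print(len(np.unique(out8)), len(np.unique(out10)))
```

Even this tiny correction collapses some neighboring 8-bit levels onto the same 8-bit output code, which is visible banding; with 10 bits of headroom every input level survives the correction.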
Nothing more than a post-processing filter, like MorphAA, which does a lookup between the originally intended 8-bit output values and their corrected 10+ bit counterparts. You don't need to generate the 3DLUTs themselves; there is a solution for that already (and if this feature became available, I think some people would write their own software for this too, by modifying their current CMS software...). It would be a format question only, and I will offer some possible solutions if an AMD developer says there is at least a little chance this idea could materialize.
Let me know your opinions about this idea (both developers and users; feel free to suggest even better solutions to the problem, or to show why it's impossible, etc.).