Hi there,
I have written a very simple program in DirectX 9.0 using Visual C++ 6.0 that displays RGB colors. The test program draws a simple gradient from black to white (with all the shades in between). It is written using the default X8R8G8B8 back-buffer format. I am now testing the 10 bit per component (bpc) color mode. I have performed enumeration, and the video card reports support for 10 bpc.
The parameter I change is:
D3DParams.BackBufferFormat = D3DFMT_A2R10G10B10;
It doesn't seem to make any difference to the gradient. Does anybody know how to set up this mode of operation? It seems to be rarely used, and posters on other forums suggest it is simply a matter of changing this parameter.
Why is this important? I am testing high bit depth. I have a device that can capture the outgoing raw bits, and I can see that the low 2 bits of each component are never used.
Any help would be appreciated! 😃
Hi avk,
Thanks for the suggestion. I do have one application written for DOS that is able to set the bit depth to 10 bpc, and the resulting gradient is much smoother. That's the only comparison I have between 8 bit and 10 bit. It seems that in Windows XP I am unable to perform such an operation =(. Could it be a driver issue that restricts it? It would seem odd in this day and age that drivers don't support this bit depth.
Good morning,
I was reading the posts here and I would like to say that using 10 bit is essential for professionals.
When photographers analyze their work on monitors, they are working with images from digital cameras with 12 or 14 bits per component.
Seeing gradients and other color information rendered smoothly on the monitor is very helpful.
I am looking for a way to determine which graphics cards can handle 10 bit color or more, so that I can advise these professionals.
Another area where I believe this is very important is medicine, for obvious reasons.