
Enabling 10-bit color depth on ATI graphics cards

Discussion created by stephen2008 on Apr 28, 2008
Latest reply on Oct 2, 2012 by marcelocopetti
Going from X8R8G8B8 to A2R10G10B10

Hi there,

I have written a very simple program in DirectX 9.0 using Visual C++ 6.0 that displays RGB colors. The test program draws a simple gradient from black to white (with all the shades in between). It was written with the default X8R8G8B8 color format, and I am now testing the 10-bit-per-component (bpc) color mode. I have performed the enumeration, and the video card reports that it supports 10 bpc.
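For reference, the enumeration check I do looks roughly like this (a simplified sketch; pD3D stands in for my actual IDirect3D9 interface):

#include <d3d9.h>

// Returns TRUE if the default adapter can do a 10-bit full-screen back buffer.
// pD3D is the interface obtained from Direct3DCreate9(D3D_SDK_VERSION).
BOOL Supports10BitBackBuffer(IDirect3D9* pD3D)
{
    HRESULT hr = pD3D->CheckDeviceType(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_A2R10G10B10,   // display (adapter) format
        D3DFMT_A2R10G10B10,   // back buffer format
        FALSE);               // FALSE = full-screen (exclusive) mode
    return SUCCEEDED(hr);
}

This returns success on my card, which is why I believe the hardware can do it.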

The parameter I change is:

D3DParams.BackBufferFormat = D3DFMT_A2R10G10B10
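For context, here is roughly how I create the device. As far as I can tell, D3DFMT_A2R10G10B10 may only be valid for exclusive full-screen swap chains, so this sketch sets Windowed = FALSE; hWnd, width, and height are placeholders for my actual window and display mode:

#include <d3d9.h>

// Creates a full-screen device with a 10-bit back buffer (sketch).
IDirect3DDevice9* Create10BitDevice(IDirect3D9* pD3D, HWND hWnd,
                                    UINT width, UINT height)
{
    D3DPRESENT_PARAMETERS D3DParams;
    ZeroMemory(&D3DParams, sizeof(D3DParams));
    D3DParams.BackBufferWidth      = width;
    D3DParams.BackBufferHeight     = height;
    D3DParams.BackBufferFormat     = D3DFMT_A2R10G10B10;
    D3DParams.BackBufferCount      = 1;
    D3DParams.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    D3DParams.hDeviceWindow        = hWnd;
    D3DParams.Windowed             = FALSE;   // exclusive full-screen
    D3DParams.PresentationInterval = D3DPRESENT_INTERVAL_ONE;

    IDirect3DDevice9* pDevice = NULL;
    if (FAILED(pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                                  D3DCREATE_HARDWARE_VERTEXPROCESSING,
                                  &D3DParams, &pDevice)))
    {
        return NULL;
    }
    return pDevice;
}

If the full-screen requirement is right, I assume the width and height also need to match one of the modes returned by EnumAdapterModes for the 10-bit format.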

The output doesn't seem to show any difference in the gradient. Does anybody know how to get this mode of operation working? It seems to be rarely used, and posters on other forums suggest it is simply a matter of changing this parameter.

Why is this important? I am testing high bit depth. I have a device that can capture the outgoing raw bits, and I can see that the extra 2 bits of each color component are never utilized.
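In case it helps, this is how I inspect the values when I read the surface back. It is a sketch that assumes the packed D3DFMT_A2R10G10B10 memory layout (2 alpha bits on top, then 10 bits each for red, green, and blue); the bit ordering on the capture device's side may differ:

#include <stdio.h>

// Unpacks one D3DFMT_A2R10G10B10 pixel as stored in the surface:
// A = bits 30-31, R = bits 20-29, G = bits 10-19, B = bits 0-9.
void InspectPixel(unsigned long pixel)
{
    unsigned int a = (unsigned int)((pixel >> 30) & 0x3);
    unsigned int r = (unsigned int)((pixel >> 20) & 0x3FF);
    unsigned int g = (unsigned int)((pixel >> 10) & 0x3FF);
    unsigned int b = (unsigned int)( pixel        & 0x3FF);

    // If r & 0x3, g & 0x3, and b & 0x3 are always zero, only 8 of the
    // 10 bits per component are actually being exercised.
    printf("A=%u R=%u G=%u B=%u  (low 2 bits: %u %u %u)\n",
           a, r, g, b, r & 0x3, g & 0x3, b & 0x3);
}

The low 2 bits come out as zero every time, which matches what the capture device shows.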

Any help would be appreciated! =)
