5 Replies Latest reply on Oct 2, 2012 6:39 AM by marcelocopetti

    Enabling 10Bit Color depth on ATI graphic cards

      Going from X8R8G8B8 to A2R10G10B10

      Hi there,

      I have written a very simple program in DirectX 9.0 using Visual C++ 6.0 that displays RGB colors. The test program draws a simple gradient from black to white (with all the shades in between). It uses the default X8R8G8B8 color format. I am now testing 10 bit per component (bpc) color mode. I have performed format enumeration, and the video card reports support for 10 bpc.

      The parameter I change is:

      D3DParams.BackBufferFormat = D3DFMT_A2R10G10B10;

      The gradient shows no visible difference. Does anybody know how to get this mode working? It seems to be rarely used, and posters on other forums suggest it should be as simple as changing this parameter.

      Why is this important? I am testing high bit depths. I have a device that can capture the outgoing raw bits, and I can see that the low 2 bits of each component are never used.
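      To illustrate what the capture device should see, here is a minimal sketch of the A2R10G10B10 bit layout (the function names PackA2R10G10B10 and Widen8To10 are mine, for illustration). If the source values are only 8 bits wide, the extra 2 bits per component have to be filled somehow, typically by replicating the top bits:

```cpp
#include <cassert>
#include <cstdint>

// Pack four components into a 32-bit A2R10G10B10 pixel:
// bits 31-30 = alpha (2 bits), 29-20 = red (10 bits),
// bits 19-10 = green (10 bits), 9-0 = blue (10 bits).
uint32_t PackA2R10G10B10(uint32_t a, uint32_t r, uint32_t g, uint32_t b)
{
    return ((a & 0x3u) << 30) | ((r & 0x3FFu) << 20) |
           ((g & 0x3FFu) << 10) | (b & 0x3FFu);
}

// Widen an 8-bit component to 10 bits by replicating the top two bits
// into the bottom, so full white maps to full white (0xFF -> 0x3FF).
uint32_t Widen8To10(uint32_t c8)
{
    return (c8 << 2) | (c8 >> 6);
}
```

      If the low 2 bits of each component come out as constant zeros on the capture device, the pipeline is most likely truncating to 8 bpc somewhere before scan-out rather than rendering true 10-bit values.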

      Any help would be appreciated! =)

        • Enabling 10Bit Color depth on ATI graphic cards
          I must admit that I'm not a graphics programmer, but if I were you, I'd try saving the gradient image to a disk file and analyzing it. I suspect not every human eye can tell the difference between 24-bit (~16.8 million colors) and 30-bit (~1 billion colors) color depth. Did you try that?
          • Enabling 10Bit Color depth on ATI graphic cards

            Hi avk,

               Thanks for the suggestion.  I do have one application, written for the DOS environment, that is able to set the bit depth to 10 bpc, and the resulting gradient is much smoother.  That's the only evidence I have of the difference between 8 bit and 10 bit.  It seems that in Windows XP I am unable to perform such an operation =(.  Could a driver restriction be the cause?  It would seem odd in this day and age that drivers don't support this bit depth.

            • Enabling 10Bit Color depth on ATI graphic cards
              Alas, I can't help you any further with this. You'll definitely need help from a graphics specialist, perhaps on a different forum, if nobody helps you here.
              • Enabling 10Bit Color depth on ATI graphic cards

                First of all, make sure you are running in fullscreen mode. In windowed mode the contents of your A2R10G10B10 back buffer will be converted to whatever color depth the GDI desktop uses (probably 8888). You'll also need to make sure you've initialized fullscreen exclusive mode properly; otherwise you may think you're in fullscreen mode when in fact you are still in "maximized windowed" mode.
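                Along those lines, here is a sketch of how device creation might look with a 10-bpc back buffer in fullscreen exclusive mode. This is only an illustration: the helper name Create10BitDevice and the 1920x1080 resolution are placeholders, and I haven't verified it against your card's driver.

```cpp
#include <d3d9.h>   // requires the DirectX 9 SDK

// Sketch only: create a fullscreen-exclusive device with a 10-bpc back
// buffer. The resolution is a placeholder; it must match a display mode
// your adapter actually enumerates for D3DFMT_A2R10G10B10.
IDirect3DDevice9* Create10BitDevice(HWND hWnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return NULL;

    // Confirm the adapter can scan out A2R10G10B10 in fullscreen mode.
    if (FAILED(d3d->CheckDeviceType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                    D3DFMT_A2R10G10B10,
                                    D3DFMT_A2R10G10B10, FALSE)))
        return NULL;   // driver exposes no 10-bit display mode

    D3DPRESENT_PARAMETERS pp;
    ZeroMemory(&pp, sizeof(pp));
    pp.Windowed         = FALSE;               // exclusive fullscreen is required
    pp.BackBufferWidth  = 1920;                // placeholder resolution
    pp.BackBufferHeight = 1080;
    pp.BackBufferFormat = D3DFMT_A2R10G10B10;  // 10 bpc back buffer
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow    = hWnd;
    pp.FullScreen_RefreshRateInHz = D3DPRESENT_RATE_DEFAULT;

    IDirect3DDevice9* device = NULL;
    if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                                 D3DCREATE_HARDWARE_VERTEXPROCESSING,
                                 &pp, &device)))
        return NULL;
    return device;
}
```

                If the CheckDeviceType call fails, the driver simply doesn't expose a 10-bit display mode, which would point to the driver restriction you suspected.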

                Also, this may sound stupid, but do you actually have a display that supports 10 bits per channel? Such displays do exist, but they're usually more expensive than common LCDs. If your display cannot handle 10 bits per channel, then you obviously won't see any improvement in your gradient. I know you mentioned that you could see all 10 bits under a DOS program, but maybe the smoother gradient there was due to something else (e.g. dithering?).
                  • Re: Enabling 10Bit Color depth on ATI graphic cards

                    Good morning


                    I was reading the posts here, and I would like to say that 10-bit support is essential for professionals.

                    When photographers analyze their work on monitors, they are working with images from digital cameras that have 12 or 14 bits per channel.

                    Seeing gradients and other color information rendered more smoothly on the monitor is therefore very helpful.

                    I am looking for a way to determine which graphics cards can handle 10-bit color or more, so that I can advise these professionals.

                    Another area where I believe this is very important is medicine, for obvious reasons.