
    How does the "Color Depth" setting work?

    tenderchkn

      I did some research on this issue, and this post is the most relevant information that I have found. However, it does not completely answer my question.

       

      I understand that the GPU's internal LUT and the output signal can have different bit depths. When I change the "Color Depth" setting in CCC, am I changing the bit depth of the internal LUT, or is it changing something else? I am asking because I am seeing a banding issue on my Dell UP3216Q monitor. By default, the "Color Depth" setting is 10bpc for my monitor (which is a true 10-bit display). However, very curiously, changing the setting to 6bpc greatly reduces the banding.

       

      I have posted additional details on the Dell support forums: http://en.community.dell.com/support-forums/peripherals/f/3529/t/19666103

       

      Here is a summary of what I am experiencing:

       

      • (Prior to calibration) No banding or clipping with the monitor at default settings, at 6bpc, 8bpc, or 10bpc
      • (Post-calibration) Significant banding regardless of ICC profile with "Color Depth" at either 8bpc or 10bpc (around 50 levels of gray missing out of 256)
      • (Post-calibration) Minor banding or clipping with the custom ICC profile created by the calibration software at 6bpc (around 10 levels of gray missing out of 256)
      • (Post-calibration) No banding, but minor clipping at black and white, with the default sRGB profile at 6bpc (around 10 levels of gray clipped)

       

      Some amount of banding would be expected from a software calibration of the GPU. However, this was a hardware calibration of the monitor, which has a 14-bit internal LUT, so calibration should not cause any loss of levels to banding or clipping at all.
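      To make this reasoning concrete, here is a quick sanity check (just a sketch of my mental model; the gamma tweak below is an arbitrary stand-in for a real correction curve, not my actual calibration data): mapping all 256 8-bit input levels through a curve quantized to 14 bits keeps every level distinct, because there are 2^14 / 2^8 = 64 LUT steps per input step.

```python
import numpy as np

# Sanity check of my assumption: an 8-bit input run through a smooth
# correction curve stored in a 14-bit LUT keeps all 256 levels
# distinct, since there are 2**14 / 2**8 = 64 LUT steps per input step.
levels = np.arange(256) / 255.0
corrected = levels ** (2.2 / 2.4)                 # arbitrary example curve
lut14 = np.round(corrected * 16383).astype(int)   # quantize to 14 bits

print(len(np.unique(lut14)))  # prints 256 -> no two input levels collapse
```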

       

      Can someone explain the mechanics of how "Color Depth" works? Clearly, it is not setting the output color depth to 6bpc, because that would mean only 64 levels of gray. I can see 256 levels of gray with the monitor at default settings, and around 245 levels post-calibration with "Color Depth" set to 6bpc. Also, why would a setting of 8bpc or 10bpc cause additional banding? Finally, with the depth set to 6bpc, am I losing colors elsewhere, or is the output signal still 8bpc?

        • Re: How does the "Color Depth" setting work?
          waltc

          Sounds like a condition specific to your monitor--check the Dell forums to see if anyone else sees the same thing.  You seem to be saying that "prior to calibration" you don't see any banding--but that you see it only after calibration.  If so, that's simple--the monitor correctly auto-adjusts to your GPU's output prior to calibration and does what it is supposed to do; when you attempt to calibrate via software, you are displacing the correct calibration with an incorrect one--hence you see banding where there should be none.

           

          Wanted to add that you should also make sure the software calibration isn't being done for printing purposes, etc.

            • Re: How does the "Color Depth" setting work?
              tenderchkn

              Yes, I am aware that this is a condition specific to my monitor. As I detailed in the thread on the Dell forums, this behavior happens when there is no AMD driver installed and when I take out my GPU and connect my monitor to the integrated graphics. Therefore, I am not saying that the AMD drivers are causing the banding issues. Instead, setting the "Color Depth" to 6bpc is the only thing that is helping the banding. Since this is counterintuitive, I want to know how this setting works, and why it is making the banding better.

               

              Also, as I mentioned in my OP, I want to make it clear that I am calibrating the internal LUT in my monitor. I am not doing a software calibration on the GPU, as you seem to think. The monitor is not "auto adjusting" the GPU output in either the default or calibrated modes of my monitor. There is only one calibration in the display chain, and that is entirely in the monitor itself. Since the banding is exactly the same when there are no drivers, when I use the integrated graphics, and also when I connect other devices (set top boxes, media players, Blu Ray players), I can safely rule out CCC settings as a contributor to the banding.

                • Re: How does the "Color Depth" setting work?
                  waltc

                  So you're saying that you don't have a problem unless you disconnect the AMD GPU and hook up your Intel IGP?  I have to confess this isn't terribly clear...;)

                    • Re: How does the "Color Depth" setting work?
                      tenderchkn

                      Then I guess I should spell it out, so that everything is crystal clear.

                       

                      For the following cases:

                      • 390X, with 15.12 drivers, default settings = banding
                      • 390X, no drivers = banding
                      • Integrated graphics (Intel HD 3000) = banding
                      • Other devices (laptop, blu-ray player) = banding
                      • 390X, with 15.12 drivers, Color Depth at 6bpc = no banding (or so I initially thought)

                       

                      Nowhere in this thread have I asked for advice regarding calibration, nor suggested that this is a GPU or driver problem. My only question, which has not been answered, is about how the "Color Depth" setting works.

                       

                      I did some further testing: I measured the output luminance of my display with my i1 Display Pro colorimeter and created the following graph in Excel.

                       

                      [Graph (0009.gif): measured luminance for each gray level, factory sRGB mode vs. calibrated mode, with "Color Depth" at 10bpc and 6bpc]

                      The change from 10bpc to 6bpc does not affect the factory sRGB mode at all, but it seems to have a dithering or smoothing effect (eliminating the stair-stepping) on the calibrated mode. It reduces the visible banding but, contrary to what I initially thought, does not eliminate it entirely. The banding is "technically" gone, because each gray level now has a distinct luminance value, but the output curve is still lumpy, so banding is still visible when viewing a smooth gradient. The output curve should be smooth like the factory sRGB mode's.
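                      Here is a toy simulation of my best guess at what is happening (purely an assumption on my part about what the driver does at 6bpc; the "lumpy" curve below is made up, not my measurement data). Straight truncation to 6 bits really would leave only 64 levels, and a lumpy curve quantized to 8 bits collapses neighboring levels, but dithering to 6 bits and averaging over frames gives every input level a distinct average luminance while leaving the lumps in the curve itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up monotonic but "lumpy" calibration curve (stand-in for CAL1).
x = np.arange(256) / 255.0
lumpy = x + 0.008 * np.sin(30 * np.pi * x)

# Straight quantization: 6 bits gives only 64 codes, and even 8 bits
# merges neighboring levels wherever the curve is locally flat.
print("plain 6-bit levels:", len(np.unique(np.round(lumpy * 63))))    # 64
print("plain 8-bit levels:", len(np.unique(np.round(lumpy * 255))))   # < 256

# Dithering: add noise before truncating to 6 bits, then average many
# "frames". Every input level now has a distinct time-averaged value,
# but the averaged curve still follows the lumps.
frames = [np.floor(lumpy * 63 + rng.random(256)) for _ in range(5000)]
avg = np.mean(frames, axis=0) / 63
print("dithered levels:", len(np.unique(np.round(avg * 10000))))      # ~256
```

                      If that guess is right, the dithering trades spatial/temporal noise for amplitude resolution, which would explain why each level becomes distinguishable while the gradient as a whole stays lumpy.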

                       

                      My full testing methodology, along with additional photos, is available in the Dell support thread that I linked in my OP.

                • Re: How does the "Color Depth" setting work?
                  noelc

                  I'd love to know the answer to this too, but frankly I don't think anyone around here has the expertise to answer it.  I've never seen the subject discussed lucidly here.

                   

                  I have a Dell U3014 monitor capable of 30-bit color, hooked up the right way (MiniDP), but I've never seen anything better than 8 bits per color from my Radeon HD 7850...  With a pure medium/dark grayscale gradient I can just make out the adjacent levels of 256 shades of gray.  I think the hardware may be capable, but I'm not sure about the firmware on the card, or the drivers, even though there's a 10 bpc setting in the Radeon software and I have Photoshop set to 30-bit color.
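                  In case anyone wants to try the same check, this is roughly how such a test gradient can be generated (a minimal sketch; the 16-pixels-per-level width and the file name are arbitrary choices):

```python
import numpy as np
from PIL import Image

# Banding test image: a horizontal grayscale ramp where each of the
# 256 levels is 16 pixels wide. On an 8-bit pipeline the boundaries
# between adjacent bands are just barely visible in the mid/dark range.
ramp = np.repeat(np.arange(256, dtype=np.uint8), 16)  # 4096 px wide
img = np.tile(ramp, (256, 1))                         # 256 px tall
Image.fromarray(img, mode="L").save("gray_ramp.png")
```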

                   

                  I don't have the issue you have, however, where 2 or more level changes occur in adjacent bands.  For me - on the 8 or 10 bit settings - just 256 levels of gray are visible, and all of them are there.  I find it acceptable.

                   

                  I really think that if something isn't going to deliver 10 bpc / 30 bit color then it shouldn't allow the setting - or at least put up a warning.

                   

                  -Noel

                    • Re: How does the "Color Depth" setting work?
                      tenderchkn

                      If you look at the post that I linked in the first paragraph of my OP, you will get a pretty good explanation of why setting "Color Depth" to 10bpc is not the same as 10-bit output. Both Nvidia and AMD cripple their GeForce and Radeon cards with regard to 10-bit output (except through D3D). They force you to buy Quadro and FirePro cards to use those features, even though the hardware between the consumer and professional cards is probably 99%+ the same.

                       

                      Additionally, 10-bit output in Photoshop has been broken since CS6, so even if you did have a professional card, you would not be able to utilize 10-bit anyway. Adobe acknowledges this here:

                       

                      https://helpx.adobe.com/photoshop/kb/photoshop-cc-gpu-card-faq.html

                       

                      "Note: 30-bit display is not functioning correctly with current drivers. We are working to address this issue as soon as possible."

                       

                      Have you calibrated the CAL1 or CAL2 modes on your U3014? Or are you using one of the other presets? When I use anything except CAL1 or CAL2, I can see all 256 levels of gray as well. When I use CAL1 or CAL2 after hardware calibration, I only see about 200 levels of gray. This is the case when I have the AMD drivers at default settings, when I uninstall the AMD drivers, when I pull out my GPU and use the internal graphics, and when I am using other devices (set top box, media player, Blu Ray player).

                       

                      However, when I set "Color Depth" to 6bpc, the banding is reduced, and I can see about 245 levels of gray. So that setting is clearly doing something. But I just don't know how it works and why it would be helping.
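                      For clarity, "levels of gray" here means distinct measured luminances: if two inputs produce the same reading, I count them as one level. A sketch of that counting (the readings below are faked with a rounded gamma curve; my real numbers come from the i1 Display Pro):

```python
import numpy as np

# How I count "visible" gray levels: inputs that measure the same
# luminance count as one level. `measured` is a stand-in for real
# colorimeter readings (one cd/m^2 value per 8-bit gray level).
gray = np.arange(256) / 255.0
measured = np.round(gray ** 2.2 * 200, 1)   # fake readings, 0.1 cd/m^2 steps

print(len(np.unique(measured)), "distinct levels out of 256")
```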

                        • Re: How does the "Color Depth" setting work?
                          noelc

                          I use the sRGB monitor preset, then the color settings provided by CCC to fine-tune and get all 3 monitors (2 x old Dell 2001FP turned sideways and 1 x U3014) very close to sRGB output with the bog-standard Windows sRGB profile.  For my needs as a graphics software developer, having the 3 monitors closely match sRGB and one another is very helpful.  Now, with the Crimson drivers, I have to run a CLI.exe command in a startup batch file to auto-load my preset.

                           

                          I've never been able to get a solid explanation from someone who knows the insides about why the non-workstation cards are limited to 8 bpc.  I thought that maybe AMD was lifting that restriction with the 2015 drivers that expose the 10 bpc setting, but alas it has not been so.

                           

                          -Noel

                          • Re: How does the "Color Depth" setting work?
                            qwixt

                            In my experience, AMD does a much better job of 10-bpc than NVidia. The difference was big enough for me that I thought AMD did not cripple their cards.