5 Replies Latest reply on Mar 21, 2018 5:13 PM by pokester

    How do you get the WX 5100 to use 10-bit?

    dealio

      The WX 5100 supports 10-bit-per-channel color (30-bit color), and I have 2 x BenQ SW320 monitors that support 10-bit color depth as well. The 10-bit depth is one reason I got the WX 5100, as I do photography work.

      When I installed the v17.12.2 driver, I right-clicked on the desktop, opened "AMD Radeon Pro and AMD FirePro Settings", clicked the Display icon, and saw a Color Depth setting, which was set to 8 bpc. There were 3 options: 6, 8 and 10 bpc. I selected 10 bpc; the monitors flickered a bit then stabilized (as usual), but the setting went immediately back to 8 bpc. I don't know why.

      I then selected "AMD Radeon Pro and AMD FirePro Advanced Settings" from the desktop context menu, which brought up an old-style Catalyst dialog box. One panel has a checkbox at the bottom called "Enable 10-bit pixel format support". I checked it, and it said I needed to reboot, so I did. After the reboot I went back into both settings panels to check: the first panel still showed Color Depth at 8 bpc, while the Advanced settings still showed the checkbox checked. I then tried to set the Color Depth to 10 bpc again, thinking that maybe I had to enable 10-bit in the Advanced panel before changing the Color Depth. The change would not stick -- Color Depth always reverts to 8 bpc. In summary, my current settings are:

       

      AMD Radeon Pro and AMD FirePro Settings:

        AMD FreeSync = Not Supported

        Virtual Super Resolution = Off

        GPU Scaling = Off

        Scaling Mode = Preserve aspect ratio

        Color Depth = 8 bpc

        Pixel Format = RGB 4:4:4 Pixel Format PC Standard (Full RGB)

      (Note: these were the default settings)

       

      AMD Radeon Pro and AMD FirePro Advanced Settings:

        10-bit pixel format support = Enabled

        (all other settings at default values)
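      (Editor's aside, not part of the original post: for readers wondering why chasing 10 bpc is worth the trouble for photo work, the sketch below is plain arithmetic, not anything AMD-specific. It counts the distinct code values an 8 bpc versus a 10 bpc pipeline can assign across a smooth gradient -- the 8-bit case's coarser steps are what show up as banding in skies and studio backdrops.)

```python
# Illustrative sketch: how many distinct steps an 8 bpc vs. 10 bpc
# pipeline can represent across a smooth ramp. Fewer steps means
# coarser quantization, which is visible as banding in gradients.

def quantize(value, bits):
    """Map a 0.0-1.0 intensity to the nearest integer code at the given depth."""
    levels = (1 << bits) - 1          # 255 for 8-bit, 1023 for 10-bit
    return round(value * levels)

# Sample a smooth horizontal gradient at 4096 points (roughly one per 4K column).
samples = [i / 4095 for i in range(4096)]

codes_8  = {quantize(v, 8)  for v in samples}
codes_10 = {quantize(v, 10) for v in samples}

print(len(codes_8))   # 256 distinct levels across the ramp
print(len(codes_10))  # 1024 distinct levels, 4x finer steps
```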

       

      I am confused. Has anyone run into this, and do you know which of the two panels actually reflects what the card is doing? How do I get the card to output 10-bit?

       

      TIA,

      David

       

      PS: My OS is a fresh install of Windows 10 Pro x64 with the Fall Creators Update.

       

      PPS: I have a ticket open with AMD, and they recommended using the third-party DDU (Display Driver Uninstaller) utility to uninstall the current driver and then install the v18 driver (the latest release). This seemed questionable to me -- I'm not sure I would trust a third-party uninstall utility over the manufacturer's integrated uninstall process.

        • Re: How do you get the WX 5100 to use 10-bit?
          pokester

          I don't have that card, but my monitor only does 10-bit over DisplayPort. Could it be the port you are using?

            • Re: How do you get the WX 5100 to use 10-bit?
              dealio

              Thanks pokester. I appreciate your response and suggestion!

               

              I have the WX 5100, which only has DisplayPort connections, supporting version 1.4 of the standard. The cables I have are also DisplayPort, but I guess that's obvious :-). And I'm running Windows 10 Pro x64, which supports 10 bpc (30-bit) workflows.
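              (Editor's aside, not part of the original post: a back-of-the-envelope check suggests the link itself shouldn't be the bottleneck here. The sketch below assumes a DP 1.4 HBR3 link with 8b/10b encoding, as in the DisplayPort spec, and ignores blanking overhead, so the "needed" figure is a lower bound.)

```python
# Rough check (assumptions: DP 1.4 HBR3 link, 8b/10b encoding,
# blanking overhead ignored): does one 4K @ 60 Hz @ 10 bpc RGB
# stream fit in the available DisplayPort bandwidth?

HBR3_RAW_GBPS = 32.4                  # 4 lanes x 8.1 Gbit/s each
EFFECTIVE_GBPS = HBR3_RAW_GBPS * 0.8  # 8b/10b encoding leaves 25.92 Gbit/s

width, height, refresh_hz, bpc = 3840, 2160, 60, 10
bits_per_pixel = bpc * 3              # RGB 4:4:4, no chroma subsampling

needed_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"needed ~{needed_gbps:.2f} Gbit/s of {EFFECTIVE_GBPS:.2f} available")
```

So a single 4K 10-bit 60 Hz monitor needs roughly 15 Gbit/s of the ~26 Gbit/s an HBR3 link provides, with headroom to spare.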

               

              Are you using 10 bpc mode? Which graphics card do you have?

               

              Would you mind sharing what your settings are in the driver? From what I can see, there are two interfaces to the driver. When I right-click on my desktop, two options appear at the top of the context menu: 1) AMD Radeon Pro and AMD FirePro Settings and 2) AMD Radeon Pro and AMD FirePro Advanced Settings. I'm curious what your Settings values are for Color Depth, Pixel Format, Virtual Super Resolution and AMD FreeSync, and whether you have the Enable 10-bit pixel format support option checked in the Advanced Settings.

               

              TIA,

              David

                • Re: How do you get the WX 5100 to use 10-bit?
                  dealio

                  I would also wonder what resolution you are using. I am using 4K resolution (3840x2160) on two monitors in extended desktop mode.

                   

                  Thanks again!

                  David

                    • Re: How do you get the WX 5100 to use 10-bit?
                      pokester

                      1440p. I wonder if it is the dual monitors throwing things off. Maybe disconnect one to test whether things change by themselves. Make sure to reboot too, so the change takes. I don't know if this will change anything; it's just the best troubleshooting idea I have to at least figure out if it even works.

                       

                      If you have quality cables, they should not be an issue. Are they the ones that came with your monitors? If not, you might pick up another one retail that you can test with, if the other troubleshooting you do doesn't pan out.

                       

                      Is each monitor connected to the card with its own DisplayPort connection, or is it one connection to one monitor with the next monitor daisy-chained off that?
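                      (Editor's aside, not part of the original post: the daisy-chain question matters for bandwidth. Under the same rough assumptions as before -- DP 1.4 HBR3 link, 8b/10b encoding, blanking ignored, no Display Stream Compression -- two 4K 60 Hz streams at 10 bpc would oversubscribe a single MST daisy-chain link, while 8 bpc just fits. That could plausibly explain a driver silently falling back to 8 bpc on a daisy chain; separate cables per monitor sidestep it.)

```python
# Rough MST daisy-chain check (assumptions: DP 1.4 HBR3 link, 8b/10b
# encoding, blanking ignored, no Display Stream Compression): two 4K
# 60 Hz streams must share one ~25.92 Gbit/s link when daisy-chained.

LINK_GBPS = 32.4 * 0.8                 # HBR3 after 8b/10b encoding

def stream_gbps(w, h, hz, bpc):
    """Uncompressed RGB 4:4:4 stream bandwidth in Gbit/s, blanking ignored."""
    return w * h * hz * bpc * 3 / 1e9

two_at_10bpc = 2 * stream_gbps(3840, 2160, 60, 10)
two_at_8bpc  = 2 * stream_gbps(3840, 2160, 60, 8)

print(f"10 bpc x2: {two_at_10bpc:.2f} Gbit/s -> fits: {two_at_10bpc < LINK_GBPS}")
print(f" 8 bpc x2: {two_at_8bpc:.2f} Gbit/s -> fits: {two_at_8bpc < LINK_GBPS}")
```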

                    • Re: How do you get the WX 5100 to use 10-bit?
                      pokester

                      I am not on a Pro card; I am on an RX 580. It does support 10 bpc. I am running DisplayPort to a 1440p IPS monitor. When I get wide-gamut color images I can tell the difference, so it is working. I primarily do Photoshop / Illustrator work with mine.