
General Discussions

dealio
Adept III

How do you get the WX 5100 to use 10-bit?


The WX 5100 supports 10-bit per channel color (30-bit color), and I have 2 x BenQ SW320 monitors that support 10-bit color depth as well. 10-bit depth is one reason I got the WX 5100, as I do photography work.

When I installed the v17.12.2 driver, I right-clicked on the desktop, opened "AMD Radeon Pro and AMD FirePro Settings", clicked on the Display icon, and saw a Color Depth setting set to 8 bpc, with 3 options: 6, 8 and 10 bpc. I selected 10 bpc; the monitors flickered a bit then stabilized (as usual), but the setting immediately reverted to 8 bpc. I don't know why.

I then selected "AMD Radeon Pro and AMD FirePro Advanced Settings" from the desktop context menu, which brought up an old-style Catalyst dialog box. One panel has a checkbox at the bottom called "Enable 10-bit pixel format support". I checked it, it said I needed to reboot, and I did. After the reboot I went back into both settings panels: the first panel still showed Color Depth at 8 bpc, while the Advanced settings still showed the checkbox checked. I then tried again to set Color Depth to 10 bpc, thinking that maybe I had to enable 10-bit in the Advanced panel before changing the Color Depth. The change still would not stick -- Color Depth always reverts to 8 bpc. In summary, the current settings are:

AMD Radeon Pro and AMD FirePro Settings:

  AMD Freesync = Not Supported

  Virtual Super Resolution = Off

  GPU Scaling = Off

  Scaling Mode = Preserve aspect ratio

  Color Depth = 8 bpc

  Pixel Format = RGB 4:4:4 Pixel Format PC Standard (Full RGB)

(Note: these were the default settings)

AMD Radeon Pro and AMD FirePro Advanced Settings:

  10-bit pixel format support = Enabled

  (all other settings at default values)

I am confused. Has anyone had experience with this, and does anyone know which of the two settings is authoritative? How do I actually use the 10-bit capability of the card?
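In case it helps to compare notes, here is a minimal sketch (my own quick check, nothing official from AMD; the file name is just illustrative) that asks DXGI what Windows currently reports as bits per color channel for each display. It needs the Windows 10 SDK and builds with MSVC as: cl /EHsc check_bpc.cpp dxgi.lib

// Queries DXGI for each active output and prints the bits-per-color
// value Windows reports for it (8 vs. 10).
#include <dxgi1_6.h>   // IDXGIOutput6 / DXGI_OUTPUT_DESC1 (Windows 10 SDK)
#include <cstdio>

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT a = 0; factory->EnumAdapters1(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
        IDXGIOutput* output = nullptr;
        for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o) {
            IDXGIOutput6* output6 = nullptr;
            if (SUCCEEDED(output->QueryInterface(__uuidof(IDXGIOutput6), (void**)&output6))) {
                DXGI_OUTPUT_DESC1 desc;
                if (SUCCEEDED(output6->GetDesc1(&desc)))
                    wprintf(L"%s: %u bits per color channel\n",
                            desc.DeviceName, desc.BitsPerColor);
                output6->Release();
            }
            output->Release();
        }
        adapter->Release();
    }
    factory->Release();
    return 0;
}

If both monitors print 8 here even after changing the driver setting, the OS never saw the change take effect.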

TIA,

David

PS: My OS is a fresh install of Windows 10 Pro x64 with the Fall Creators Update.

PPS: I have a ticket open with AMD, and they recommended using a third-party DDU utility to uninstall the current driver and then install the v18 driver (the latest release). This seemed questionable to me -- I'm not sure I would trust a third-party uninstall utility over the manufacturer's integrated uninstall process.

0 Likes
1 Solution

Accepted Solutions
dealio
Adept III

Re: How do you get the WX 5100 to use 10-bit?


I believe this issue may have been resolved! As posted by fsadough in this post, the 19.Q2 driver released on May 8 is working just as described, with 10 bpc enabled at 4K@60Hz in RGB 4:4:4 Pixel Format mode! Fantastic! I am so thrilled! Thanks to fsadough and the development team at AMD for bringing this fix to us!

I noticed only one oddity, in the driver settings panel: some entries are underlined in red. But it appears to have no functional impact so far, which is great. If I recall correctly, I've seen the same artifact in previous Radeon Pro drivers when the 10-bit Pixel Format advanced setting is enabled, though I thought it had been fixed. (The reason I've enabled that setting is that Photoshop requires it in order to use the 10 bpc functionality.)

(screenshot attached: the driver settings panel showing the red-underlined entries)
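For anyone who wants to double-check that the Advanced checkbox actually took effect: my understanding is that it works by exposing 10-bit OpenGL pixel formats, which is what Photoshop picks up. A minimal sketch like this (my own test code, the file name is illustrative; build with cl /EHsc gl10bit.cpp gdi32.lib user32.lib) lists any 10-bit-per-channel formats the driver exposes:

// Walks the GDI pixel-format list on the desktop DC and reports any
// OpenGL-capable format with 10 bits in the red channel.
#include <windows.h>
#include <cstdio>

int main() {
    HDC hdc = GetDC(nullptr);  // desktop device context
    PIXELFORMATDESCRIPTOR pfd = {};
    // DescribePixelFormat returns the highest pixel-format index.
    int count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);
    int found = 0;
    for (int i = 1; i <= count; ++i) {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
        if ((pfd.dwFlags & PFD_SUPPORT_OPENGL) && pfd.cRedBits >= 10) {
            printf("format %d: R%dG%dB%d\n", i,
                   (int)pfd.cRedBits, (int)pfd.cGreenBits, (int)pfd.cBlueBits);
            ++found;
        }
    }
    if (!found)
        printf("No 10-bit OpenGL pixel formats exposed.\n");
    ReleaseDC(nullptr, hdc);
    return 0;
}

If nothing is listed after a reboot with the checkbox enabled, the setting didn't take.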

Now the last step can be completed -- FINALLY calibrating my monitors with this new driver! Calibration worked on older drivers (I believe 18.Q1) but failed on 19.Q1.1, though I'm not sure why. BenQ's calibration software seems a bit sensitive, especially with dual-monitor setups. Here's hoping!

Thanks fsadough!

David

View solution in original post

123 Replies
pokester
MVP

Re: How do you get the WX 5100 to use 10-bit?


I don't have that card, but my monitor only does 10-bit through DisplayPort. Could it be the port you are using?

0 Likes
dealio
Adept III

Re: How do you get the WX 5100 to use 10-bit?


Thanks pokester. I appreciate your response and suggestion!

I have the WX 5100, which only has DisplayPort connections, supporting version 1.4 of the standard. The cables I have are also DisplayPort, but I guess that's obvious :-). And I'm running Windows 10 Pro x64, which supports 10 bpc (30-bit) workflows.

Are you using 10 bpc mode? Which graphics card do you have?

Would you mind sharing what your settings are in the driver? From what I can see, there are two interfaces to the driver. When I right-click on my desktop, 2 options appear at the top of the context menu: 1) AMD Radeon Pro and AMD FirePro Settings, and 2) AMD Radeon Pro and AMD FirePro Advanced Settings. I'm curious what your Settings values are for Color Depth, Pixel Format, Virtual Super Resolution and AMD FreeSync, and whether you have the Enable 10-bit pixel format support option checked in the Advanced Settings.

TIA,

David

0 Likes
dealio
Adept III

Re: How do you get the WX 5100 to use 10-bit?


I also wonder what resolution you are using. I am using 4K (3840x2160) on two monitors in extended desktop mode.

Thanks again!

David

0 Likes
pokester
MVP

Re: How do you get the WX 5100 to use 10-bit?


I am not on a pro card; I am on an RX 580. It does support 10 bpc. I am running DisplayPort to a 1440p IPS monitor. When I get wide-gamut color images I can tell the difference, so it is working. I primarily do Photoshop / Illustrator work with mine.

0 Likes
pokester
MVP

Re: How do you get the WX 5100 to use 10-bit?


I wonder if it is the dual monitors throwing things off. Maybe disconnect one to test whether things change by themselves. Make sure to reboot too, so the change takes. I don't know if this will change anything; it's just the best troubleshooting idea I have to at least figure out whether it even works.

If you have quality cables they should not be an issue. Are they the ones that came with your monitor? If not, you might pick up another one at retail to test with, if the other troubleshooting doesn't pan out.

Is each monitor connected to the card with its own DisplayPort connection, or is it one connection to one monitor with the next monitor daisy-chained off that?

0 Likes
yifei-xue
Journeyman III

Re: How do you get the WX 5100 to use 10-bit?


I have the same issue now. Did you fix it?

0 Likes
dealio
Adept III

Re: How do you get the WX 5100 to use 10-bit?


I did fix it some time ago. I found that somehow, during the installation of the driver, the refresh rate for both monitors got set to 59Hz. Once I changed that setting to 60Hz, I was able to enable 10-bit color in the driver (as long as the 10-bit setting in the Advanced Settings dialog box was enabled as well).
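If anyone else wants to double-check their refresh rates without digging through the panels, a minimal sketch like this (again just my own quick check; build with cl /EHsc refresh.cpp user32.lib) prints what Windows reports for each active display, so a stray 59Hz stands out immediately:

// Prints the current mode (resolution and refresh rate) of every
// display attached to the desktop.
#include <windows.h>
#include <cstdio>

int main() {
    DISPLAY_DEVICEW dd;
    for (DWORD i = 0; ; ++i) {
        ZeroMemory(&dd, sizeof(dd));
        dd.cb = sizeof(dd);
        if (!EnumDisplayDevicesW(nullptr, i, &dd, 0)) break;
        if (!(dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)) continue;
        DEVMODEW dm = {};
        dm.dmSize = sizeof(dm);
        if (EnumDisplaySettingsW(dd.DeviceName, ENUM_CURRENT_SETTINGS, &dm))
            wprintf(L"%s: %ux%u @ %u Hz\n", dd.DeviceName,
                    dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);
    }
    return 0;
}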

Then I noticed a crippling performance issue with CorelDraw 2017 (for example, extremely slow creation of new documents, copy and paste, file save, file print, etc.). But when the Windows 10 April update installed automatically, it removed my AMD driver (without telling me) and replaced it with a much older one (2016 vintage); the CorelDraw issue went away, but so did my ability to use and control 10-bit color.

After reading about much finger-pointing between AMD, Microsoft and Corel, I decided to wait for one of them to fix the issue. AMD has now produced an Adrenalin driver that fixes the performance issue with CorelDraw, but it doesn't allow me to enable 10-bit color again! Arggg. I checked my monitor refresh rates and they are at 60Hz, so the 59Hz-to-60Hz change may not have been the real solution I thought it was.

I've had it with this WX5100 video card and the poor-quality drivers. In January I bought two BenQ 10-bit capable monitors on the promise of the WX5100 being able to produce 10-bit per channel color, and it has been 7 months and the drivers still don't provide that capability. And AMD support hasn't helped. Their knowledge of their product is lacking, and their approach is trial and error: "try this driver version, try that driver version"... and, by the way, don't use the built-in AMD uninstall capability to uninstall the AMD driver; instead, download a third-party DDU utility to uninstall it. Apparently not even AMD trusts their software.

Nvidia seems to have figured things out, and everywhere I read they have a better reputation for graphics card features that work seamlessly with the OS and applications. I think I'll look there going forward.

0 Likes
dealio
Adept III

Re: How do you get the WX 5100 to use 10-bit?


One other fact: at one point when working with AMD support, I asked if there was a manual for the Radeon Pro WX5100 that explained the driver settings, so that I could read it and be sure I didn't have any incompatible settings. There are a lot of settings in the driver, and there's an Advanced Settings panel as well. I never got a manual, so apparently there is no manual that explains the settings and how they work together.

Some settings are obvious and need no manual, but others are not. For example, there are 2 places in the drivers where 10-bit color is referenced: in the Settings panel and in the Advanced Settings panel. However, in the Advanced Settings panel, the 10-bit setting has a caption suggesting it is for medical devices that need higher color resolution, making it seem as if it is not necessarily related to the 10-bit setting in the Settings panel. Is there a dependency or not? The drivers don't say, and there is no manual to check.

0 Likes
m0reilly
Adept II

Re: How do you get the WX 5100 to use 10-bit?


This is not acceptable. My cheap WasabiMango panel shows up as 10-bit, but as soon as the SW320 is attached I also get 8-bit. I've tried this with an HD-series card and a Vega 64, and -- before finding this thread -- made what will most likely be a very costly mistake by ordering a Vega Frontier to test a pro-series card. I have contacted BenQ and am waiting for any info regarding this issue.

Edit: using the DP port on both the HD and Vega cards, with a certified DP 1.4 Club3D cable.

0 Likes