
Drivers & Software

dealio
Adept III

Re: My monitor supports 10bit but cannot be enabled in Radeon setting

That’s a great question. I have several apparent duplicates that show up in the drop-down box as well. I don’t know why they are there.

thatnerdgui
Adept I

Re: My monitor supports 10bit but cannot be enabled in Radeon setting

I don't know how similar my issue is, but I have the same problem connecting to my 4K TV over HDMI on driver version 19.3.1. It's a 10-bit panel, and the only way I get a stable video signal is at 8-bit and 4:2:0 instead of the supported 4:2:2 or 4:4:4. The default settings the Radeon software assigned when I first plugged it in don't work at all. This is very frustrating because it worked just fine with the Nvidia GPU I just replaced.
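For what it's worth, a back-of-the-envelope check I did suggests raw HDMI 2.0 bandwidth rules some of those modes out anyway. A minimal Python sketch, assuming the standard CTA-861 4K60 timing (4400x2250 total pixels, 594 MHz pixel clock) and HDMI 2.0's 18 Gbit/s TMDS rate with 8b/10b encoding:

# Back-of-the-envelope HDMI 2.0 bandwidth check for 4K60.
# Assumes CTA-861 timing: 4400 x 2250 total pixels at 60 Hz = 594 MHz.
PIXEL_CLOCK_HZ = 4400 * 2250 * 60            # 594 MHz
HDMI2_PAYLOAD_GBPS = 18.0 * 8 / 10           # 14.4 Gbit/s after 8b/10b

# Effective samples per pixel for each pixel format (4:2:2 and 4:2:0
# carry chroma at half / quarter rate; real HDMI packing differs a bit,
# so treat this as a rough comparison only).
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

for bpc in (8, 10):
    for fmt, samples in SAMPLES_PER_PIXEL.items():
        need = PIXEL_CLOCK_HZ * samples * bpc / 1e9
        verdict = "fits" if need <= HDMI2_PAYLOAD_GBPS else "exceeds"
        print(f"{bpc} bpc {fmt}: {need:5.2f} Gbit/s -> {verdict} 14.4")

By that arithmetic, 10-bit 4:4:4 needs about 17.8 Gbit/s and can never fit, but 10-bit 4:2:2 (about 11.9 Gbit/s) should, so the fact that only 8-bit 4:2:0 is stable for me points more at the cable or the driver than at bandwidth.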

fsadough
Moderator

Re: My monitor supports 10bit but cannot be enabled in Radeon setting

This doesn't help. You need to provide detailed information. Which GPU? Which TV? Which HDMI cable version? 

buffalo2102
Journeyman III

Re: My monitor supports 10bit but cannot be enabled in Radeon setting

I just registered to say that I have the same issue and the same monitor as the OP.

CPU: i7-4790x

MB: Asus Maximus VII Ranger

GPU: RX Vega 64 (also RX 580 Nitro)

OS: Windows 10

Monitor: BenQ EW3270U

At 3840x2160 @ 60 Hz over DP 1.2, I can set 10-bit at 4:2:0, but it always drops back to 8-bit at 4:4:4.

My old RX 580 was the same, although I could set 10-bit successfully if I rolled back to the Crimson ReLive 17.4.3 drivers. That's not an option for the Vega, though.

The fact that I use the same monitor as the OP suggests the monitor is the problem, but the fact that the 17.4.3 Crimson drivers worked also points to a driver/compatibility issue. Any help would be appreciated.
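For what it's worth, the same back-of-the-envelope arithmetic as in thatnerdgui's post above suggests bandwidth shouldn't be the blocker on DP 1.2: HBR2 across 4 lanes carries about 17.28 Gbit/s after 8b/10b encoding, while 3840x2160 @ 60 Hz at 10 bpc RGB needs roughly 533 MHz x 30 bits, or about 16 Gbit/s, assuming CVT reduced-blanking timings. A link that fits on paper but still falls back to 8-bit is another hint at a driver issue.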

My EDID (very slightly different from the OP's):

00 FF FF FF FF FF FF 00 09 D1 50 79 45 54 00 00
26 1C 01 04 B5 46 27 78 3F 59 95 AF 4F 42 AF 26
0F 50 54 25 4B 00 D1 C0 B3 00 A9 C0 81 80 81 00
81 C0 01 01 01 01 4D D0 00 A0 F0 70 3E 80 30 20
35 00 C0 1C 32 00 00 1A 00 00 00 FD 00 28 3C 87
87 3C 01 0A 20 20 20 20 20 20 00 00 00 FC 00 42
65 6E 51 20 45 57 33 32 37 30 55 0A 00 00 00 10
00 00 00 00 00 00 00 00 00 00 00 00 00 00 01 B8
02 03 3F 70 51 5D 5E 5F 60 61 10 1F 22 21 20 05
14 04 13 12 03 01 23 09 07 07 83 01 00 00 E2 00
C0 6D 03 0C 00 10 00 38 78 20 00 60 01 02 03 E3
05 E3 01 E4 0F 18 00 00 E6 06 07 01 53 4C 2C A3
66 00 A0 F0 70 1F 80 30 20 35 00 C0 1C 32 00 00
1A 56 5E 00 A0 A0 A0 29 50 2F 20 35 00 80 68 21
00 00 1A BF 65 00 50 A0 40 2E 60 08 20 08 08 80
90 21 00 00 1C 00 00 00 00 00 00 00 00 00 00 20
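
As a sanity check, byte 20 of an EDID 1.4 base block (the B5 just after the 01 04 version bytes above) encodes the bit depth the panel advertises. A minimal Python 3.7+ sketch of the decode I used, assuming EDID 1.4, which the 01 04 confirms:

# Decode the Video Input Definition byte (offset 20) of an EDID 1.4
# base block: bit 7 = digital, bits 6-4 = bit depth, bits 3-0 = interface.
DEPTHS = {0b001: "6 bpc", 0b010: "8 bpc", 0b011: "10 bpc",
          0b100: "12 bpc", 0b101: "14 bpc", 0b110: "16 bpc"}
INTERFACES = {0b0001: "DVI", 0b0010: "HDMI-a", 0b0011: "HDMI-b",
              0b0101: "DisplayPort"}

def decode_video_input(edid_hex: str) -> str:
    edid = bytes.fromhex(edid_hex)  # whitespace in the dump is ignored
    assert edid[:8] == bytes.fromhex("00FFFFFFFFFFFF00"), "bad EDID header"
    vid = edid[20]
    if not vid & 0x80:
        return "analog input"
    depth = DEPTHS.get((vid >> 4) & 0b111, "undefined depth")
    iface = INTERFACES.get(vid & 0x0F, "other interface")
    return f"digital input, {depth}, {iface}"

Fed the dump above, this returns "digital input, 10 bpc, DisplayPort", so the EW3270U itself is advertising 10-bit correctly.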

Thanks

fsadough
Moderator

Re: My monitor supports 10bit but cannot be enabled in Radeon setting

This is under investigation by AMD Display Team

insfi
Journeyman III

Re: My monitor supports 10bit but cannot be enabled in Radeon setting

fsadough

I have an AMD HD 7950. Does it support 10-bit color depth?

Driver: 19.3.2
Connection: DisplayPort 1.4
Monitor: BenQ SW320

Video Card: AMD Radeon HD 7950
CPU: i7-3770K
OS: Windows 10

While switching settings around, I was somehow able to select 10 bpc, but then it automatically switched back.

When I set the refresh rate to 30 Hz, it can indeed do 10 bpc.

At 60 Hz I have only 6 and 8 bpc available.

00 FF FF FF FF FF FF 00 09 D1 54 7F 45 54 00 00
1D 1C 01 04 B5 46 28 78 26 DF 50 A3 54 35 B5 26
0F 50 54 25 4B 00 D1 C0 81 C0 81 00 81 80 A9 C0
B3 00 01 01 01 01 4D D0 00 A0 F0 70 3E 80 30 20
35 00 C0 1C 32 00 00 1A 00 00 00 FC 00 42 65 6E
51 20 53 57 33 32 30 0A 20 20 00 00 00 10 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 10
00 00 00 00 00 00 00 00 00 00 00 00 00 00 01 40
02 03 2B 40 56 61 60 5D 5E 5F 10 05 04 03 02 07
06 0F 1F 20 21 22 14 13 12 16 01 23 09 7F 07 83
01 00 00 E3 06 07 01 E3 05 C0 00 02 3A 80 18 71
38 2D 40 58 2C 45 00 E0 0E 11 00 00 1E 56 5E 00
A0 A0 A0 29 50 30 20 35 00 80 68 21 00 00 1A 04
74 00 30 F2 70 5A 80 B0 58 8A 00 C0 1C 32 00 00
1A 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 61
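
(Byte 20 of this EDID is the same B5 as in buffalo2102's dump above, so the byte-20 decode sketch from that post reads 10 bpc over DisplayPort here too: the SW320 itself advertises 10-bit.)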

Thanks in advance!

fsadough
Moderator

Re: My monitor supports 10bit but cannot be enabled in Radeon setting

Sorry, but I think you need to wait for the fix in May 2019.

dealio
Adept III

Re: My monitor supports 10bit but cannot be enabled in Radeon setting

insfi, I have 2 BenQ SW320s and the Radeon Pro WX 7100, and I have been battling this issue for over a year. I have learned a few things along the way, and my experience may help you in the interim. I have found 3 ways to achieve 10 bpc for my hardware/software combination -- YMMV, but it can't hurt to try. Perhaps one of these will be satisfactory for you until a permanent fix is available:

  1. set the refresh rate on your monitor (Advanced Display settings in Windows Settings) to 30 Hz -- this one you already found
  2. set the Pixel Format in the GPU driver to YCbCr 4:2:2. This reduces the amount of color information displayed (some chroma detail is thrown away, unlike full RGB 4:4:4), which is most noticeable in line art, vector drawings, CAD, small text, Excel cells -- anything with very thin lines (see the rough simulation after this list). If you do color-critical work, this may not be acceptable. When you make this change, you may find that the Color Depth automatically changes to 10 bpc; if not, change it manually. This worked for me at the full resolution of the BenQ SW320 (3840x2160), i.e. 4K @ 60 Hz
  3. use the Enterprise driver from Feb 2018 (18.Q1) with the monitor refresh rate at 60 Hz, Pixel Format at RGB 4:4:4, and Color Depth at 10 bpc. Although this driver works at the full RGB 4:4:4 pixel format, it has other problems that may affect you. One is a performance issue in CorelDraw (glacial performance opening files, copying and pasting, printing), which was fixed in the 18.Q3 driver from AMD and in a CorelDraw update from Corel (for the newest versions since v2017). I don't know if other applications suffer from this issue.
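
On point 2, here is the rough simulation I mentioned of why 4:2:2 hurts thin colored lines: convert to YCbCr, halve the horizontal chroma resolution, and convert back. This uses BT.709 full-range math and is purely illustrative:

# Rough simulation of YCbCr 4:2:2: convert RGB to YCbCr, average each
# horizontal pair of chroma samples, and convert back (BT.709 full range).
import numpy as np

def rgb_to_ycbcr(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return y, (b - y) / 1.8556, (r - y) / 1.5748

def simulate_422(rgb):
    y, cb, cr = rgb_to_ycbcr(rgb.astype(np.float64))
    # Halve horizontal chroma resolution, then repeat the samples:
    cb = np.repeat((cb[:, 0::2] + cb[:, 1::2]) / 2, 2, axis=1)
    cr = np.repeat((cr[:, 0::2] + cr[:, 1::2]) / 2, 2, axis=1)
    r = y + 1.5748 * cr
    b = y + 1.8556 * cb
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return np.clip(np.stack([r, g, b], axis=-1), 0, 1)

# One-pixel-wide red lines on white -- the worst case mentioned above:
img = np.ones((4, 8, 3))
img[:, ::2] = [1.0, 0.0, 0.0]
print(np.abs(simulate_422(img) - img).max())  # nonzero -> color bleed

Pure neutral-gray content round-trips losslessly (its chroma is zero), which is why photos mostly look fine in 4:2:2, while one-pixel colored lines smear and small text rendered with subpixel anti-aliasing picks up color fringes.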

In addition to these changes, you may need to enable the "10-bit pixel format" setting in the Advanced Settings of the Radeon Pro driver (the old Catalyst-style dialog box). To verify that the whole pipeline (from OS settings, VESA-certified DP cables, and the BenQ monitor driver through the AMD GPU driver to the application) was delivering 10 bpc, I used a 10-bit ramp PSD file (attached) in Photoshop. It's the only quick visual verification test I know of. When 8 bpc is active, there is obvious banding in the gradient the file contains; when 10 bpc is active, it's smooth gray from one end to the other with no hint of banding. Good luck.
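
If anyone can't locate that PSD, a comparable ramp can be generated. A minimal sketch assuming numpy and Pillow are installed (the file name is arbitrary); Photoshop still needs its 30-bit display option on for the test to mean anything:

# Build a 16-bit grayscale horizontal ramp. On a true 10 bpc pipeline the
# gradient looks smooth; on 8 bpc you see distinct vertical bands.
import numpy as np
from PIL import Image

W, H = 3840, 400
row = np.linspace(0, 65535, W).astype(np.uint16)          # one ramp line
Image.fromarray(np.tile(row, (H, 1))).save("ramp16.png")  # 16-bit PNG

View it at 100% zoom so scaling doesn't dither the bands away.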

fsadough
Moderator
Moderator

Re: My monitor supports 10bit but cannot be enabled in Radeon setting

dealio, 10-bit pixel format and 10 bpc are two different things.

dealio
Adept III

Re: My monitor supports 10bit but cannot be enabled in Radeon setting

Thanks, fsadough. I am aware that they are different, and I hope my post was not confusing to anyone -- I was careful to use the same terms AMD uses, Color Depth and Pixel Format, to distinguish them. But I think the unfortunate inconsistency in their names across software vendors adds to the confusion: AMD uses "Color Depth" and "Pixel Format", Windows 10 uses "Bit depth" and "Color format", and Adobe just uses "30 bit display". Even the term 10 bpc is not used consistently: 10 bits per color vs. 10 bits per channel vs. 10 bits per component.

Maybe it's the last paragraph where I seem to conflate the two. To explain: in my experience, when the Radeon Settings Color Depth is set to 10 bpc and the Radeon Pro Advanced Setting labeled "10-bit pixel format" is not enabled, the file named "10-bit test ramp.psd" (found elsewhere on this forum) shows banding in its gradient in Photoshop CS6 and Photoshop CC. Banding in this test file is a visual indication that Photoshop is not using a color depth of 10 bpc. Why Photoshop is not using 10 bpc when both the OS and the Radeon Settings panel report that it is enabled is a separate and open question that I don't understand. However, when I enable the "10-bit pixel format" setting and load the same test ramp into Photoshop, I see no banding in the gradient (dark gray to light gray), which is the expected outcome at 10 bpc. So it seems that in Photoshop the two -- 10 bpc color depth and 10-bit pixel format -- are somehow tied to each other. This is a relevant test for me because I use Photoshop frequently.

If this visual test is somehow inaccurate or invalid, please explain. In addition, if you know a better visual test that can be used to verify that 10 bpc color depth is enabled, please advise. It's not that I don't trust the AMD driver console, but I tend to live by "trust but verify" when it comes to technology.