Adept I

10bit color correction on consumer GPUs

As far as I could find out, AMD consumer GPUs do not support 10-bit color per channel for applications (e.g. Photoshop).

My question is whether ICC color profiles used with Windows 10 (measured with a device like a Spyder) can use 10-bit color, provided a 10-bit monitor is used.

9 Replies
Esteemed Contributor III

Re: 10bit color correction on consumer GPUs

Consumer AMD GPUs support 10 bpc on 8+2 (8-bit + FRC) and native 10 bpc monitors and in applications, but some applications, such as Adobe Photoshop, are coded by the developer to require professional-level FirePro cards before 10-bit support is enabled in the program.

Esteemed Contributor III

Re: 10bit color correction on consumer GPUs

I had no issue enabling it on an RX 580. 

Adept II

Re: 10bit color correction on consumer GPUs

You require the 10 bpp feature for true 10-bit color support, NOT 10 bpc.

* 10 bpp = 10 bits per pixel precision

* 10 bpc = 10 bits per color channel

These two are not the same thing, but many people do not know this.

Adept I

Re: 10bit color correction on consumer GPUs

Thank you for this clarification! I was not aware that there is a difference between bpp and bpc.

Adept I

Re: 10bit color correction on consumer GPUs

I just tried to do some research on the difference between 10 bpp and 10 bpc, but it is still not really clear to me.

I think I understand that 10 bpp means the application is able to use the whole 10-bit range for each color when drawing a pixel. So Photoshop can use 1024 shades of red/green/blue for each pixel, and a middle gray would be sent to the GPU driver as 512/512/512, with this information passed on to the graphics card.

But what about 10 bpc? Does it mean Photoshop can only use 8 bits to communicate with the GPU driver, and the driver itself sends 10-bit information to the display? So for middle gray, Photoshop would send 128/128/128 and the driver would pass on 512/512/512 to the screen?

If so, can the graphics driver on a consumer GPU use 10 bits for color correction (which should reduce banding)?
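
The 8-bit to 10-bit promotion described above can be sketched in a few lines. Note there are two common conventions, and this sketch is an illustration rather than how any particular driver does it: a plain bit shift gives 128 → 512, while full-range rescaling gives 128 → 514:

```python
def expand_shift(v8: int) -> int:
    """Promote an 8-bit value to 10 bits by shifting left two bits (zero-padding)."""
    return v8 << 2

def expand_rescale(v8: int) -> int:
    """Promote an 8-bit value to 10 bits by rescaling the full range 0..255 -> 0..1023."""
    return round(v8 * 1023 / 255)

print(expand_shift(128))    # 512
print(expand_rescale(128))  # 514
print(expand_rescale(255))  # 1023 (rescaling maps full white to full white)
```

The shift is cheap but never reaches 1023 (255 << 2 = 1020); rescaling preserves the endpoints, which is why it is the usual choice when range matters.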

Adept II

Re: 10bit color correction on consumer GPUs

10 bpc has to do with the bandwidth of your display connection. Most professional LCDs use DisplayPort because it can handle more bandwidth than HDMI. Think of it like GPU memory bus width: some GPUs have a 256-bit-wide bus, some have a 1024-bit-wide bus, and some even have a 4096-bit-wide memory bus.

So, 10 bpc increases the width of each color channel (RED, GREEN and BLUE) so you can send a single large piece of data in one cycle rather than waiting on multiple cycles to send it.

And as for a consumer GPU using 10 bpp (10-bit pixel precision): that is kind of iffy. They can, but you may run into some stability issues. If you have a consumer RADEON GPU you can try the RADEON PRO or Adrenalin PRO drivers and see how that goes.
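
As a rough illustration of the bandwidth point, here is the raw pixel-data rate for 4K at 60 Hz (this ignores blanking intervals and link encoding overhead, so real cable requirements are somewhat higher):

```python
def raw_video_gbps(width: int, height: int, refresh_hz: int, bpp: int) -> float:
    """Uncompressed pixel data rate in Gbit/s (no blanking or encoding overhead)."""
    return width * height * refresh_hz * bpp / 1e9

print(raw_video_gbps(3840, 2160, 60, 24))  # 4K60 at 8 bpc  -> ~11.9 Gbit/s
print(raw_video_gbps(3840, 2160, 60, 30))  # 4K60 at 10 bpc -> ~14.9 Gbit/s
```

Going from 8 bpc to 10 bpc is a 25% increase in raw data, which is why a link that barely fits 4K60 at 8 bpc may not carry 10 bpc without subsampling.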

Big Boss

Re: 10bit color correction on consumer GPUs

My LG 27UL500 is HDR10 compatible, 100% sRGB etc., and I have to use DisplayPort to get it working to its full potential.

I use uppity monitors all the time, but I do have a cheap one for the service desk with VGA, DVI and HDMI.

Adept I

Re: 10bit color correction on consumer GPUs

I am a bit confused about the terminology used to describe the way color is represented. I could not find any sources validating the concept of 10 bpc describing the memory bandwidth of the signal.

According to everything I have found so far, "bits per channel" is a measure of the number of colors for each channel. So with 10 bpc you can address 1024 shades of red, green and blue.

Bits per pixel, on the other hand, are a measure of the amount of information available for each pixel. Without any (color) compression you need 30 bpp to display 10 bpc. But using compression, the amount of information can be reduced: with 4:2:2 chroma subsampling, 10 bpc only uses 20 bpp of information (details can be found here).

I am not sure, though, whether this is consistent with the terminology in AMD's driver dialogs.

Also, this does not help me with my initial question (which I still do not have an answer to, unfortunately). Basically, I wanted to know whether current AMD consumer GPUs have a 10-bit-capable LUT which can be used to load the ICC profile for color correction.
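
The 4:2:2 arithmetic above can be checked with a small sketch. In J:a:b subsampling notation, a conceptual reference block is 2 rows of J pixels carrying 2·J luma samples plus (a + b) samples for each of the two chroma channels:

```python
def avg_bpp(bpc: int, j: int, a: int, b: int) -> float:
    """Average bits per pixel for chroma subsampling J:a:b at a given bit depth.

    The reference block (2 rows of j pixels) carries 2*j luma samples plus
    (a + b) samples for each of the two chroma channels (Cb and Cr).
    """
    samples = 2 * j + 2 * (a + b)
    return bpc * samples / (2 * j)

print(avg_bpp(10, 4, 4, 4))  # 4:4:4 -> 30.0 bpp (no subsampling)
print(avg_bpp(10, 4, 2, 2))  # 4:2:2 -> 20.0 bpp
print(avg_bpp(10, 4, 2, 0))  # 4:2:0 -> 15.0 bpp
```

So 4:2:2 at 10 bpc indeed averages 20 bpp, matching the figure quoted above.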

Adept II

Re: 10bit color correction on consumer GPUs

Yes, both of these terminologies are confusing, and they get mixed up often. It seems you didn't get what I was trying to say.

Okay: ''"bits per channel" is a measure of the number of colors for each channel. So with 10 bpc you can address 1024 shades of red, green and blue.''

It is a measure of the availability of color per channel: 10 bpc can address 1024 shades of RGB as long as your GPU and driver allow it. Enabling 10 bpc does not mean you are sending 10-bit RGB information to your LCD; it means the bandwidth is there, available but not utilized. Your LCD is still receiving 8-bit RGB information.

''Bits per pixel, on the other hand, are a measure of the amount of information available for each pixel. Without any (color) compression you need 30 bpp to display 10 bpc. But using compression, the amount of information can be reduced: with 4:2:2 chroma subsampling, 10 bpc only uses 20 bpp of information.''

That is correct; 10 bpp deals with the actual color information.

''Also, this does not help me with my initial question (which I still do not have an answer to, unfortunately). Basically, I wanted to know whether current AMD consumer GPUs have a 10-bit-capable LUT which can be used to load the ICC profile for color correction.''

Sorry, I thought you were asking whether a consumer RADEON can do 10 bpp. You can brute-force 10 bpp (pixel precision) on a normal RADEON using some registry tweaks for the driver in Windows 10.

And I do not know whether consumer RADEON cards have a 10-bit-capable LUT, or maybe they have a 14-bit internal LUT; you have to ask AMD about that.
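
To illustrate why LUT precision matters for the original banding question, here is a small sketch. It uses a hypothetical 1D gamma-correction curve, not AMD's actual calibration pipeline: pushing all 256 8-bit input levels through an 8-bit-output LUT merges some adjacent levels (visible as banding), while a 10-bit-output LUT keeps every input level distinct.

```python
# Apply a gamma-style correction curve to all 256 8-bit input levels
# through LUTs of different output precision, then count surviving levels.
GAMMA = 1 / 2.2  # illustrative correction exponent, not a measured profile

lut8  = [round(255  * (i / 255) ** GAMMA) for i in range(256)]  # 8-bit output LUT
lut10 = [round(1023 * (i / 255) ** GAMMA) for i in range(256)]  # 10-bit output LUT

print(len(set(lut8)))   # fewer than 256: some input levels collapse -> banding
print(len(set(lut10)))  # 256: every input level maps to a distinct output
```

This is why a higher-precision LUT (10-bit or, as mentioned above, possibly 14-bit internal) can apply an ICC correction curve without introducing extra banding, even on an 8-bit signal path.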
