
Drivers & Software

ShadowcoreX
Adept I

AMD graphics settings: 10-bit colour when the display is set to YCbCr 4:2:0 12-bit

Hi,

I have an RX 5600 XT Pulse (21.11.1 driver). The card can output 12-bit YCbCr 4:2:0 to my LG C1 at 4K 60 Hz, but there is also an option in the graphics settings to enable 10-bit colour for games. So my question is: does that setting downgrade my YCbCr 4:2:0 12-bit output, or does it actually turn on 10-bit in games that would otherwise run in standard 8-bit? The FreeSync info on the TV shows Metro Exodus running in YCbCr 4:2:0 12-bit, so I honestly don't know.

The RX 5600 XT (HDMI 2.0) is connected to the LG C1 (HDMI 2.1) via an HDMI 2.1 cable.

OS: Windows 11

Please help.

 

1 Solution

I have finally tested the whole thing in many games and have come to some conclusions.

[All of these settings are for the RX 5600 XT, an HDMI 2.0 graphics card. Any newer card with HDMI 2.1 will do better, but this is the one I have, so I had to test its limits on the LG C1's HDMI 2.1 ports.]

Windows itself runs in 8-bit, and the 8-bit setting in Adrenalin is really just for the OS. For the desktop, the best result is 60 Hz RGB 4:4:4 8-bit. If we go above 60 Hz (to 120), we lose the better quality at lower resolutions and in HDR (more on that later). Windows doesn't go higher than 8-bit; I've seen comments about forcing 10-bit on Nvidia cards and the OS having colour problems. For HDR I use HGiG tone mapping, the newer approach backed by Microsoft and Sony that handles HDR properly without additional tone mapping from the TV itself.

Now, about games. This part is really tricky, not going to lie. If the game engine is advanced enough, it will change the pixel format automatically at specific resolutions, but only when GPU scaling is off in the Adrenalin settings. To check the actual format, press the green button on the TV remote a few times; it opens the VRR/FreeSync info panel. For example, Assassin's Creed Odyssey switches the pixel format to RGB 4:4:4 12-bit at 1080p. Above 1440p the game switches to 8-bit (all at 60 Hz, HDR on). 1080p 12-bit looks much better than 1440p 8-bit.

Metro Exodus is trickier. At 1080p in the game settings it runs 4K upscaling (from full HD) in 8-bit. When you switch the in-game setting to 1440p, it pushes native full HD at 12-bit. I don't know why it works like that, but it does; I guess the TV always tries to upscale. If you run a really high resolution, YCbCr 12-bit still looks better, with more depth. It's just this game. HGiG HDR.

The Witcher 3 runs RGB 4:4:4 12-bit at 1080p 60 Hz; that is the only resolution where it stays native. Above that, the TV always upscales to 4K with RGB 8-bit, so YCbCr 12-bit is personal preference. You can also use the TV's HDR tone mapping for this game instead of HGiG, because there is no HDR setting in the game.

Disco Elysium is even trickier. It basically runs at Windows' native 4K, but with artificial resolution scaling inside, so we get RGB 4:4:4 8-bit (the Windows format) and you adjust by switching the "resolution" in game. In reality it is just resolution scaling; the engine is less advanced, so it can't switch the pixel format by itself.

Overall, at lower resolutions RGB 4:4:4 12-bit in games is glorious if the game engine supports it. Going above that, 1440p RGB 4:4:4 8-bit for example looks slightly worse in colour but still works great; HDR still works, but it isn't as good as it should be. F1 2020 can run 1440p HDR (HGiG) RGB 4:4:4 8-bit at 120 Hz, as the engine allows it; above 1440p, HDR works only at 60 Hz. If you want real HDR at native 4K, you have to go with YCbCr 12-bit at 60 Hz. Some games look better in SDR after adjustments on the TV, some better in HDR; it depends on the implementation, but the HGiG standard definitely does a great job.

(If you run games like Tekken 7 or any "console" port, use YCbCr 4:4:4 for the low-FPS cutscenes.)

 

 


4 Replies
Kostas1983
Adept I

RGB is typically 8 bits per color channel, which gives about 16.7 million colors.

YCbCr carries the same information as RGB, but instead of three color channels you have Y = brightness (luma), Cb = blue-difference chroma and Cr = red-difference chroma (the color information); a formula is used to recover red, green and blue from these three.

4:4:4 means that for every pixel the monitor receives full red/green/blue (or Y/Cb/Cr) information: 3 x 8 bits, 3 x 10 bits or 3 x 12 bits.

4:2:2 and 4:2:0 mean that the video card groups pixels (pairs for 4:2:2, blocks of four for 4:2:0), sends the brightness information for each pixel untouched, but averages the color information across the group and sends only one or two values for the whole group... so instead of 4 Cb and 4 Cr values, only 1-2 of each are sent.

Human eyes are much more sensitive to brightness than to variations in color, so in most situations you wouldn't notice that averaging.
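
To make the subsampling concrete, here is a rough sketch in Python of what happens to a 2x2 block of pixels. It assumes full-range BT.709 coefficients purely for illustration; real hardware uses limited-range math and smarter chroma filtering than a plain average, so treat it as a teaching toy, not the exact GPU pipeline.

# Sketch of RGB -> YCbCr conversion and 4:2:0 chroma averaging
# (illustrative full-range BT.709 coefficients).

def rgb_to_ycbcr(r, g, b):
    """Convert one 8-bit RGB pixel to (Y, Cb, Cr)."""
    y  =  0.2126 * r + 0.7152 * g + 0.0722 * b          # luma: brightness
    cb = -0.1146 * r - 0.3854 * g + 0.5    * b + 128    # blue-difference chroma
    cr =  0.5    * r - 0.4542 * g - 0.0458 * b + 128    # red-difference chroma
    return y, cb, cr

def subsample_420(block):
    """block = 2x2 list of (Y, Cb, Cr). 4:2:0 keeps all four Y values
    but averages Cb and Cr down to a single pair for the whole block."""
    ys = [p[0] for row in block for p in row]            # 4 luma samples, untouched
    cb = sum(p[1] for row in block for p in row) / 4     # 1 shared chroma-blue value
    cr = sum(p[2] for row in block for p in row) / 4     # 1 shared chroma-red value
    return ys, cb, cr

# Example: four slightly different reds collapse to one shared color value
block = [[rgb_to_ycbcr(255, 0, 0), rgb_to_ycbcr(250, 10, 10)],
         [rgb_to_ycbcr(200, 0, 0), rgb_to_ycbcr(180, 20, 20)]]
print(subsample_420(block))   # brightness per pixel survives, color is averaged

That averaged Cb/Cr pair is exactly why fine colored detail (tiny red text, for instance) suffers under 4:2:0 while overall brightness detail does not.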

Blu-ray movies, YouTube content, etc. are compressed using YCbCr 4:2:0.

Some professional recording equipment records in YCbCr 4:2:2 

So, for bit depth: 12-bit > 10-bit > 8-bit. For chroma: RGB = YCbCr 4:4:4 > YCbCr 4:2:2 > YCbCr 4:2:0.

YCbCr 4:2:0 is acceptable for watching movies and sometimes OK for playing games (there are issues with tiny text, and some colors such as intense reds are affected by the conversion from 4:4:4 to 4:2:0).

If you can do 10-bit, I'd say try it out. I wouldn't bother with 12-bit... few games are even aware of it or support it, and your TV is probably not calibrated well enough for 12-bit to make a difference.

YCbCr 4:2:0 is the worst option, and it's not worth using just for 12-bit, because virtually all video content is 8-bit or 10-bit... you'd get no quality increase from forcing 12-bit.

I'd suggest going with YCbCr 4:2:2 10-bit, or YCbCr 4:4:4 / RGB 8-bit.

I have 4 pixel formats available in the Adrenalin software:

RGB 4:4:4, YCbCr 4:4:4, RGB 4:4:4 Studio (Limited), YCbCr 4:2:0

At 60 Hz all of them work in 8-bit, with the one exception of YCbCr 4:2:0 (which goes up to 12-bit). To get 10-bit on RGB or any other format I would have to drop to 30 Hz, which is not an option for me. The higher bit-depth options are visible in Adrenalin but don't apply at higher refresh rates. 120 Hz works only with YCbCr 4:2:0 8-bit. I have a 5 m HDMI 2.1 optical cable.
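
For what it's worth, those limits line up with a rough HDMI 2.0 bandwidth calculation. A quick back-of-the-envelope sketch, assuming the standard CTA-861 4K transmitted timing of 4400x2250 total pixels and HDMI 2.0's roughly 14.4 Gbit/s of usable video data (18 Gbit/s raw minus 8b/10b encoding); it ignores audio and ancillary data, so treat it as an estimate:

# Back-of-the-envelope HDMI 2.0 bandwidth check.
# HDMI 2.0: 18 Gbit/s raw TMDS, ~14.4 Gbit/s usable after 8b/10b encoding.
# 4K is transmitted as a 4400 x 2250 total frame (3840x2160 active plus blanking).

USABLE_GBPS = 18.0 * 8 / 10          # ~14.4 Gbit/s
H_TOTAL, V_TOTAL = 4400, 2250

def needs_gbps(refresh_hz, bits_per_component, fmt):
    # Bits per transmitted pixel: 4:4:4/RGB carries 3 components per pixel,
    # 4:2:2 effectively 2, 4:2:0 effectively 1.5 (chroma shared across 4 pixels).
    components = {"444": 3.0, "422": 2.0, "420": 1.5}[fmt]
    bpp = bits_per_component * components
    return H_TOTAL * V_TOTAL * refresh_hz * bpp / 1e9

modes = {
    "4K60 RGB 8-bit":     (60, 8, "444"),
    "4K60 RGB 10-bit":    (60, 10, "444"),
    "4K60 4:2:0 12-bit":  (60, 12, "420"),
    "4K120 4:2:0 8-bit":  (120, 8, "420"),
    "4K120 4:2:0 10-bit": (120, 10, "420"),
}
for label, args in modes.items():
    g = needs_gbps(*args)
    verdict = "fits" if g <= USABLE_GBPS else "too much for HDMI 2.0"
    print(f"{label}: {g:.1f} Gbit/s -> {verdict}")

That matches what you're seeing: 4K60 RGB/4:4:4 tops out at 8-bit (about 14.3 of 14.4 Gbit/s), 10-bit RGB at 60 Hz doesn't fit, only 4:2:0 has the headroom for 12-bit, and 120 Hz only fits as 4:2:0 8-bit.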

I'm using the TV as a wall panel 3.5 m away from me, not as a PC monitor on a desk. This LG C1 model also has the Game Optimizer and the HDMI input set to PC, so the clarity is still really good even with YCbCr 4:2:0. From my point of view games look much more like movies, not just PC graphics; more real in 4:2:0 12-bit. If you sit really close to the TV you can see the difference in the pixels, but gaming from a distance it looks better. I guess it's personal preference. For me RGB 8-bit looks worse than YCbCr 4:2:0 12-bit: definitely less color, less depth.

I know people use custom resolutions with custom profiles. Can I somehow force, say, YCbCr 4:4:4 at 10-bit on this card, or RGB 10-bit? Also, what is my real pixel output in a game? Is Metro Exodus running in 8-bit when my display shows 12-bit? Is there any software to check it? If I'm running 12-bit, do I need to enable the 10-bit graphics option in Adrenalin? If I'm running RGB 8-bit the TV still switches into HDR mode but doesn't show an accurate picture; the TV applies HDR, but the graphics card doesn't provide the right signal? F1 2020 seems to struggle with this: washed-out colors and so on. About calibration: the TV is well calibrated, colors matter to me.

You definitely don't want to play games under RGB 4:4:4 Studio (Limited) or YCbCr 4:2:0.

RGB 4:4:4 Studio (Limited) stands for limited RGB. Limited RGB has a range of 16-235: its absolute black is 16 levels brighter (less dark) than full RGB's, and by the same token its maximum white is 20 levels lower (less bright) than full RGB's. RGB 4:4:4 stands for full RGB, which means the ability to show 0-255, the full range. That's what PC monitors have been using for years.
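
To put numbers on it, here is a tiny sketch of the standard 8-bit mapping between the two ranges (16-235 for limited, 0-255 for full), just to show where the conversion happens:

# Standard 8-bit full <-> limited ("studio") range mapping.
# Limited range squeezes 0-255 into 16-235, a span of 219 levels.

def full_to_limited(v):
    return round(16 + v * 219 / 255)      # 0 -> 16, 255 -> 235

def limited_to_full(v):
    v = max(16, min(235, v))              # anything outside 16-235 gets clipped
    return round((v - 16) * 255 / 219)    # 16 -> 0, 235 -> 255

print(full_to_limited(0), full_to_limited(255))    # 16 235
print(limited_to_full(16), limited_to_full(235))   # 0 255

A range mismatch simply means one of these conversions is applied when it shouldn't be, or skipped when it should be, which is exactly where the crushed or washed-out blacks described below come from.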

Something similar happens with YCbCr 4:2:0. The 4:2:0 chroma subsampling model compresses the color information. Even though 4:2:0 video can still yield good image quality, you may run into problems when chroma-keying or post-editing because of the low resolution of the chroma information. Compared to 4:4:4 material, it is more difficult and time-consuming to get a clean chroma-key result from 4:2:0 video. This is why professional video producers still prefer working with 4:4:4 or 4:2:2, which contain more chroma information and make post-editing easier; only the final video is compressed to 4:2:0 to save file size.

So why are those two profiles, RGB 4:4:4 Studio (Limited) and YCbCr 4:2:0, suitable for movies? When displays transitioned into an all-digital phase, content creators such as cinematographers and directors noticed that the default full RGB range caused issues for movies and TV shows. Full RGB has a wider darkness range, so details in dark areas show more clearly. For content makers that's a problem because it makes "hiding" things harder: horror movies love hiding things in dark visuals, action shows use wires to make people fly but need to conceal those wires in post-production, and science-fiction movies and series have lots of effects and CGI. In a full, vivid dynamic range a lot of these elements look less realistic and overexposed. After much experimentation, the 16-235 range was adopted by pretty much all cinematic and creative applications, and streaming services and Blu-rays carry content that is nearly always mastered in limited RGB. So why is this a problem?

Well, PC monitors run the full RGB range by default. If you then view limited RGB sources and nothing converts the range, the black levels come out wrong: depending on where the mismatch happens you get either crushed shadows or raised, greyish blacks. So, to enjoy movies and TV shows on a monitor you would theoretically switch it to limited RGB. Fortunately, built-in and downloaded apps like Netflix have become pretty good at adjusting automatically to the HDMI or DisplayPort connection that carries them. Blu-rays are hard-coded to limited RGB and simply look wrong on a full-RGB display if the range isn't converted.

The reverse is true as well. Force your full-range monitor to always run limited RGB, including in the OS, and you'll get a washed-out image: blacks become dark grey and contrast drops. Similarly, if you use a TV as your monitor, chances are it expects limited RGB; forcing full RGB into it will crush the blacks, as mentioned above.

After all of the above, for gaming you probably want to choose RGB 4:4:4 or YCbCr 4:4:4 in the Adrenalin software, because they carry full chroma information. The TV will probably adapt to those settings automatically if you use PC mode, and I'd guess you want one of those two in Game mode as well. I don't know whether Metro Exodus is mastered in 8-bit, and I don't know of any software out there to check it.

Also consider that HDR isn't always a win, whether for movies or for games. In many cases all an HDR mode does is raise the gamma of the midtones, which is why you so often see washed-out colors. Most TVs on the market are merely "HDR-ready" products: great as SDR references but not for HDR, and for now HDR still lacks a single consistent industry standard.
