
timk
Adept II

The right Color Temperature Setting in Driver: Automatic or 6500k?

I'm using my monitor's own colour profile in Windows 10 and want to know which AMD driver setting I should use in order to display colours accurately. Within the driver there are two options for colour temperature: Automatic and Manual (6500K).

Switching between Automatic and 6500K, I see a difference in blue and red colours.

My question is: which of those settings is the "correct choice" for accurate colour display in my case?

I have set the pixel format to Full RGB 4:4:4 PC Standard.

Thanks.


Automatic.

timk
Adept II

Are you sure? Because the default driver setting is 6500K, so when you click Reset it goes to Manual.

Can you explain why Automatic is the right choice? Thx


Because it doesn't override any values set by Windows or your monitor.

timk
Adept II

Thx for the input. Just to get a better understanding, do you by chance know what the manual setting is for then? Why would someone use 6500k over automatic?


If they had no other way to adjust their calibration, such as on a laptop. But it's really a holdover from the pre-Windows 7 days, when display calibration was very limited. Also, if you wanted to temporarily adjust your settings to cheat at games, you could use it.

cronus
Staff

I recommend 6500K.
HDMI/DP use standard colour spaces (sRGB, BT2020, etc.) that typically use 6500K, and well-behaved monitors will convert to their own colour temp/gamut. Some do not; they simply reinterpret the data in their internal colour space, which is wrong.
In those cases Automatic is better - it means the driver will convert to the monitor's gamut. But if the monitor is well-behaved, then Automatic is counterproductive, since both the driver and the monitor are doing that conversion.
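Rough sketch of what that "convert to monitor gamut" step amounts to (my own example using numpy and standard colorimetry, not AMD's actual code; the Display P3-like panel primaries are just an assumption, a real driver would take them from the monitor's EDID):

import numpy as np

def rgb_to_xyz_matrix(rx, ry, gx, gy, bx, by, wx, wy):
    # Build the 3x3 RGB->XYZ matrix from primary and white-point chromaticities.
    prim = np.array([[rx / ry,              gx / gy,              bx / by],
                     [1.0,                  1.0,                  1.0],
                     [(1 - rx - ry) / ry,   (1 - gx - gy) / gy,   (1 - bx - by) / by]])
    white = np.array([wx / wy, 1.0, (1 - wx - wy) / wy])
    scale = np.linalg.solve(prim, white)   # per-channel scaling so white lands on the white point
    return prim * scale

# sRGB primaries with D65 white (x=0.3127, y=0.3290, i.e. ~6500K)
srgb_to_xyz = rgb_to_xyz_matrix(0.640, 0.330, 0.300, 0.600, 0.150, 0.060, 0.3127, 0.3290)
# Example wide-gamut panel: DCI-P3 primaries with the same D65 white
p3_to_xyz   = rgb_to_xyz_matrix(0.680, 0.320, 0.265, 0.690, 0.150, 0.060, 0.3127, 0.3290)

# Gamut conversion matrix: sRGB (what the HDMI/DP metadata says) -> panel-native RGB
srgb_to_panel = np.linalg.inv(p3_to_xyz) @ srgb_to_xyz

red = np.array([1.0, 0.0, 0.0])            # fully saturated sRGB red, linear light
print(srgb_to_panel @ red)                 # ~[0.82, 0.03, 0.02]: same colour, panel coordinates

The point is that the same sRGB red gets different numbers once it's expressed in the panel's own coordinates; doing that conversion in the driver only makes sense if the monitor isn't also doing it.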


At 6500K the colors on my monitor are a bit stronger compared to "auto", but I can't tell if 6500K is oversaturated or natural, because the difference is marginal - but still noticeable. On my Samsung 4K TV the setting is called "native" and "auto", and I can tell that "native" oversaturates the picture there, so I go with auto, which seems more natural. But on my monitor it's hard to tell which setting is the correct one, since both look ok, just that 6500K is slightly more vibrant.

PS: The new AMD driver 19.12.2 changed the menus and the setting is now called "Custom color", and I noticed that when the setting is "off", it equals 6500K.

This must be a bug, because setting "Custom color" to "on" and turning off "color temperature control" equals the old "auto" setting. I think they made a mistake there.


It's not a bug, though I admit the naming is confusing.

Longer explanation:
HDMI and DP carry video metadata, so the GPU can tell the monitor which color space it's using. The options are limited to a few standard color spaces: sRGB, BT601, BT709, BT2020, P3, etc.
Fact 1: It just so happens that all of these standard color spaces, as used on the PC side, have a D65 white point (i.e. a 6500K color temperature).
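Just as a sanity check on Fact 1 (my own quick calculation, nothing from the driver): plugging D65's chromaticity coordinates into McCamy's CCT approximation lands right around 6500K.

# Correlated colour temperature of the D65 white point, via McCamy's approximation
x, y = 0.3127, 0.3290                  # CIE 1931 xy chromaticity of D65
n = (x - 0.3320) / (0.1858 - y)
cct = 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33
print(round(cct))                      # ~6505 K, i.e. the "6500K" everyone rounds to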

In 19.12.2, "Custom Color" disabled means the color temperature matches the color temperature of the color space indicated in the video metadata. This is really how it's supposed to work - HDMI/DP use standard color spaces, the GPU tells the monitor which one it's using, and the monitor converts to its native color space.
When you enable "Custom Color", the default setting is Color Temperature Control enabled at 6500K. Due to Fact 1, this is the same as if you had disabled "Custom Color" altogether. This is why it's not a bug; using 6500K is the right default.

With Custom Color enabled and Color Temperature Control disabled, the GPU does the conversion to the panel's native color space. That's really not obvious from the name; it used to be called differently.
The problem is that the monitor doesn't know this, and if it follows the HDMI/DP spec and honors the video metadata (this is what I meant by well-behaved in my earlier reply), it will convert from the color space in the video metadata to its native color space, without being aware that the GPU already did that. Basically, this setting will only work if the monitor ignores the HDMI/DP spec regarding color space info. Or maybe over DVI, where there's no metadata.
This is why, in most cases, "Color Temperature Control disabled" isn't right.
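A toy example of why that double conversion goes wrong (again my own sketch, plain colorimetry with numpy, nothing driver-specific): if both the GPU and a well-behaved monitor apply the same sRGB-to-native-gamut matrix, colours get pushed twice and end up desaturated.

import numpy as np

# Linear-light sRGB -> Display P3 matrix (both D65). Only ONE of the GPU or the
# monitor is supposed to apply it.
srgb_to_p3 = np.array([
    [0.8225, 0.1774, 0.0000],
    [0.0332, 0.9669, 0.0000],
    [0.0171, 0.0724, 0.9105],
])

red = np.array([1.0, 0.0, 0.0])      # saturated sRGB red, linear light

once  = srgb_to_p3 @ red             # GPU converts, monitor just displays: correct
twice = srgb_to_p3 @ once            # GPU converts AND the monitor converts again: wrong

print(once)    # ~[0.82, 0.03, 0.02] -> the same red, expressed in panel coordinates
print(twice)   # ~[0.68, 0.06, 0.03] -> a duller, shifted red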


On the off chance that cronus gets a notification for this reply, I just want to bring to attention a bug that I've logged relating to to this feature: https://community.amd.com/t5/drivers-software/radeon-software-20-12-1-quot-custom-color-quot-resets-...

jlmcr87
Adept II

I have checked it with my colorimeter. Here you have the correct settings:

Custom color disabled = accurate color

Custom color enabled + color temperature control enabled = accurate color

Custom color enabled + color temperature control disabled = inaccurate color
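For anyone curious what "accurate" boils down to in a colorimeter report: it is essentially a ΔE comparison of measured patches against their targets. A minimal sketch with made-up measurement numbers (only the Lab value of the sRGB red target is real):

import math

def delta_e_76(lab1, lab2):
    # CIE76 colour difference: straight Euclidean distance in Lab space
    return math.dist(lab1, lab2)

# Target Lab of an sRGB red patch (approx.); the two "measurements" below are
# hypothetical, just to show how the settings would separate.
target        = (53.2, 80.1, 67.2)
custom_off    = (53.5, 79.6, 66.8)   # Custom color disabled
temp_ctrl_off = (54.8, 88.3, 74.1)   # Custom color on, temperature control off

print(delta_e_76(target, custom_off))     # ~0.7  (below ~2 is barely visible)
print(delta_e_76(target, temp_ctrl_off))  # ~10.8 (an obvious error)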


cronus, jlmcr87

Ok, that means the drivers prior to Adrenalin 2020 must have switched the naming, because it's the exact opposite:

Adrenalin 2019 "Automatic" = Custom color enabled + color temperature control disabled in the new driver

Adrenalin 2019 "Manual 6500k" = Custom color disabled, or Custom color enabled + color temperature control enabled

In other words, the "accurate color" setting, as you say, is the manual setting in the old driver. Shouldn't the "automatic" setting be the "accurate color" one, as opposed to the manual setting, whose name leads you to believe it's overriding the monitor's own color temp setting?
