From the latest Doom patch notes
Removed AMD HDR warning message on startup (AMD has since fixed the issue with their current release drivers)
Eh... HDR is a difficult subject to delve into, because the term is so often used incorrectly.
In regards to Modern HDR in Games, specifically it's referring to HDR10 (Rec. 2020 colour with the SMPTE ST 2084 "PQ" transfer function) that, yes, requires 10bit Colour and Fullscreen ("Exclusive") to be enabled.
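The HDR10 transfer function (SMPTE ST 2084, usually called "PQ") is also why the 10bit requirement exists. A minimal sketch of the PQ encode, using the constants from the ST 2084 spec, shows that ordinary SDR white already lands around the middle of the code range, so 8 bits would leave too few codes for the dark end:

```python
# SMPTE ST 2084 (PQ) encode: absolute luminance in cd/m^2 -> 0..1 signal.
# Constants are taken straight from the ST 2084 specification.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Map luminance (0..10000 cd/m^2) to a PQ code value in [0, 1]."""
    y = max(nits, 0.0) / 10000.0      # normalise to the 10,000-nit PQ ceiling
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1 + C3 * y_m1)) ** M2

# ~100 nits (typical SDR reference white) already sits near half the range:
print(round(pq_encode(100), 3))    # ~0.508
print(round(pq_encode(10000), 3))  # 1.0
```

With 100 nits at roughly code 0.508, a 10bit signal leaves ~520 steps below SDR white; an 8bit signal would leave only ~130, which is where visible banding comes from.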
Still, I'll go a little further with this, because I'm sure some will have noticed a dramatic improvement in image quality switching from an "SDR" Display to a "HDR" Display, even when running it in 8bit with Dithering.
This is where things get a little frustrating to explain...
From most people's perspectives, this will appear to simply provide a much more Vibrant output than their old SDR Display.
Well... that's because most SDR Displays were actually LDR, so while the LEDs were capable of the full 16.7 Million Colour Range, because of how Backlighting was typically handled you'd actually get a reduced Contrast Range, producing the more washed-out LDR Output.
As such, now that you're seeing SDR output the way it was actually supposed to look, it feels like a big upgrade, despite the fact that you're not even using the actual HDR that your Display supports.
The problem with HDR Support however, and this is why I'd actually recommend a Smart TV with HDR over a Monitor, is that Windows 10 has TERRIBLE HDR support.
If you have an Xbox One S/X, then I'd strongly recommend you try your HDR Display on that... Gears Tactics, Sea of Thieves and Forza Horizon 3 all have HDR Support on both Windows 10 and Xbox One; plus they're all on Game Pass.
You'll see instantly what I mean by "Windows 10 has Terrible HDR support", because you'll see what your Display is ACTUALLY capable of on that Console. Here's the kicker... AMD are the ones who designed and developed the HDR Support there, whereas on Windows 10 they have to go through Microsoft's Implementation. It's like night and day.
The same is true in regards to the Smart TV Apps for Netflix, Amazon Prime, YouTube, etc.; you'll see the Native HDR Formats on there and it looks great, whereas on the Desktop you often have to roll the Dice to see if it'll even work; and that has very little to do with the Drivers.
NVIDIA does convert most HDR into the Windows 10 ST 2084 Format, but even then they also switch it to SDR (because that's all you can really do in Real-Time), thus it isn't really HDR. I dunno, it's all very frustrating when it comes to Windows 10. Microsoft really needs to get their act together and let the Graphics Manufacturers have control over the Display Output again... as they KNOW what they're doing; Microsoft doesn't.
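To be clear, the actual conversion NVIDIA performs isn't public here; but any real-time HDR-to-SDR step has to compress the wide luminance range into the 0..1 SDR range somehow. A generic illustration of that kind of compression (an extended Reinhard operator, chosen for simplicity, with an assumed white point — not the vendor's actual pipeline):

```python
# A generic real-time-style HDR -> SDR luminance compression (extended
# Reinhard). This is an illustrative sketch, NOT NVIDIA's actual method.

def reinhard_extended(l_hdr: float, white: float = 4.0) -> float:
    """Compress scene luminance so `white` (and above) clips to SDR 1.0.

    l_hdr: scene-referred luminance, where 1.0 is SDR reference white.
    white: the HDR level that should map to full SDR white (assumed value).
    """
    return l_hdr * (1.0 + l_hdr / (white * white)) / (1.0 + l_hdr)

print(reinhard_extended(0.0))  # 0.0 -- black stays black
print(reinhard_extended(4.0))  # 1.0 -- the chosen white point hits SDR max
```

The point of the sketch: everything above the white point gets squashed together, which is exactly why the output "isn't really HDR" anymore — the highlight detail the display could have shown is thrown away before it ever reaches the panel.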
Hey thanks for the info. How all this translates into what is being seen on the TV is beyond me.
I am connected to a Samsung TV with HDR. Albeit it is a low end model. It does however say it supports both 8 bit with dithering and 10 bit.
I had my RX 580 hooked up to it only for a short while, but I could choose 8 or 10 bit.
My Nvidia card however only had an option for 8 bit. I knew based on the old card that it could do 10. And I did see a difference between 8 and 10, not so much in color but in smoothness.
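That "smoothness" difference is easy to put numbers on: 10 bit gives four times as many grey levels as 8 bit, so gradients show much finer steps (less banding) even though the colour gamut itself hasn't changed. A quick sketch:

```python
# Quantise a smooth 0..1 grey ramp at 8 and 10 bits and count the distinct
# output levels. Illustration only; real displays also apply dithering,
# which trades banding for noise.

def quantize(value: float, bits: int) -> float:
    """Snap a 0..1 value to the nearest representable code at this depth."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

ramp = [i / 9999 for i in range(10000)]  # a smooth gradient

levels_8 = len({quantize(v, 8) for v in ramp})    # 256 steps
levels_10 = len({quantize(v, 10) for v in ramp})  # 1024 steps
print(levels_8, levels_10)  # 256 1024
```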
So, after a lot of research (and luckily someone said this in a forum, or I would likely never have figured it out, even though it is in the Studio driver release notes):
Only the Nvidia Studio Drivers support 10 bit HDR. The Game Ready drivers only do 8 bit. I have no idea why Nvidia puts this limitation on it.
Now, having used both settings, I have actually found that some games require the 8 bit setting and others the 10 bit setting to get HDR to work.