I have a Radeon 7850 connected to a 10-bit capable display via DisplayPort. Until recently I was using an Eizo display with internal calibration, so I had not needed the card's LUT for calibration until now.
Everything I've read on the internet regarding display calibration suggests that AMD cards (consumer and professional) expose high-precision video LUTs to software and then dither the result down to the output link depth to avoid banding. 10-bit LUT precision is often mentioned, I believe with 1024 entries.
My attempts at calibration (using i1 Profiler, basICColor, and ArgyllCMS dispcal) have all resulted in quite noticeable banding, despite only small gamma corrections being required. Argyll's dispcal has a function to estimate video LUT precision, and it indeed suggests the LUT is effectively 8-bit.
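The arithmetic behind that 8-bit behaviour can be sketched: quantising even a mild correction curve to 8 bits maps some neighbouring input levels to the same output value, and each collision shows up as a band edge in a smooth ramp. A minimal sketch (the 2.2 → 2.3 gamma tweak is a made-up stand-in for a typical small calibration curve, not my actual correction):

```python
# Sketch: why an 8-bit video LUT bands even under a small gamma tweak.
# A hypothetical correction curve (gamma 2.2 -> 2.3) is quantised to
# 8-bit and 10-bit LUT entries; collisions (two inputs mapping to the
# same output) are what produce visible steps in a grey ramp.

def lut(bits, power=2.3 / 2.2):
    scale = (1 << bits) - 1
    # 256 input levels, as fed by an 8-bit application ramp
    return [round(((i / 255) ** power) * scale) for i in range(256)]

lut8 = lut(8)
lut10 = lut(10)

# Each duplicated entry is a pair of input levels collapsed together.
dups8 = 256 - len(set(lut8))
dups10 = 256 - len(set(lut10))
print(f"8-bit LUT entries:  {dups8} of 256 input levels collide")
print(f"10-bit LUT entries: {dups10} of 256 input levels collide")
```

With 10-bit entries the collisions disappear for this curve, and dithering an internally high-precision result down an 8-bit link hides the remaining quantisation, which is exactly the benefit the sources attribute to AMD's pipeline.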
I am using the 18.5.1 drivers. I have tried setting the display bit-depth to 10-, 8-, and even 6-bit in the control panel with no apparent effect. I'm on an up-to-date Windows 10 Pro. The display is an NEC EA275UHD. Gradients look very smooth with linear LUTs, which suggests the monitor is doing its job. The banding with a curve applied is what I'd expect with only 8 bits.
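For anyone who wants to reproduce the gradient check, here is a minimal sketch of a ramp generator (plain binary PPM so no libraries are needed; the filename ramp.ppm is arbitrary). View it full-screen first with a linear LUT loaded, then with the calibration curve, and compare:

```python
# Sketch: write a horizontal 0-255 grey ramp to a PPM file for
# eyeballing banding. Open it in any image viewer at 100% zoom.
width, height = 1024, 128
with open("ramp.ppm", "wb") as f:
    f.write(b"P6\n%d %d\n255\n" % (width, height))  # PPM header
    # One row: each column's grey level repeated for R, G and B
    row = bytes(v for x in range(width)
                for v in (x * 255 // (width - 1),) * 3)
    f.write(row * height)
```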
To be clear, I am not expecting full 10-bit output from Windows applications, as I understand that requires a FirePro card. I am simply wondering if there is anything I can do to improve the calibration precision, or to enable dithering, which many sources suggest shouldn't be a problem and is indeed a benefit of having an AMD card over an Nvidia one.
Do I need a different card?