    Untouched (bitperfect) output from AMD driver

    aproc

      Is there any way to get untouched (bitperfect) output from the AMD GPU driver? My understanding is that it is impossible with AMD cards, so I would like to get confirmation of this from someone on the AMD driver development team.

       

      The leading calibration software companies, viz. LightIllusion (which provides the LightSpace CMS software) and Portrait Displays (formerly SpectraCAL, which provides the CalMAN software), specifically recommend against using AMD cards for display profiling and calibration. Portrait Displays has also run experiments with Dolby Vision test patterns (which require bitperfect output) and found that only Intel and NVIDIA graphics can produce bitperfect output. Below are some quotes and links from these companies:

       

      ‘AMD/ATI graphics cards have proven to not be transparent, and are therefore not recommended for direct patch generation and profiling.’

      ‘The reality is that modern Nvidia graphics cards and Intel HD Graphics chip-sets have very accurate output signals, and can be used for extremely accurate patch generation and hence display profiling and calibration’

      Link: https://www.lightillusion.com/direct_profiling.html

       

      ‘We know from doing our dolby vision testing that you can get bit perfect output for RGB from intel and nvidia if configured correctly. We also know that getting an AMD to be bit perfect is impossible.’

      Link: http://www.spectracal.com/forum/viewtopic.php?f=5&t=5934&p=39972&hilit=dolby+vision+amd#p39972

       

      ‘We have found that laptops with Intel or Nvidia GPUs work properly, but we have found that AMD/ATI GPUs can not be configured to pass the Dolby Vision test patterns to the display with the Dolby Vision embedded metadata intact.’

      Link: https://www.dropbox.com/s/nwsmfssbw7b35dy/Dolby%20Vision%20HDMI%20Calibration%20Setup%20Guide.pdf?dl=0

       

      All the LightSpace and CalMAN users I know who were using a PC/laptop with an AMD card for test pattern generation have switched to Intel/NVIDIA in the past two years, so I am not able to get any help from them in setting up the AMD driver for bitperfect output (if that is even possible).

       

      Can AMD please look into this? Untouched/bitperfect color output should be the default setting of any GPU driver (assuming, of course, that the desktop resolution and color bit-depth are set to match the display's native resolution and bit-depth, and that RGB 4:4:4 Full Range output is used). Any additional processing, color enhancement or unnecessary dithering should be optional and defeatable.
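
      For anyone who wants to test their own setup, here is a minimal sketch of how such a check could be done. It assumes you display a source test pattern fullscreen at the display's native resolution and bit-depth and record the GPU's actual output with a lossless HDMI capture device set to RGB 4:4:4 Full Range; the file names below are placeholders of my own, not part of any vendor tool. Any dithering or LUT processing in the driver then shows up as non-zero differences between the two images.

# Bitperfect check: compare a source test pattern against a lossless
# capture of the GPU's actual output (e.g. from an HDMI capture card
# configured for RGB 4:4:4 Full Range). File names are placeholders.
import numpy as np
from PIL import Image

source = np.asarray(Image.open("pattern_source.png").convert("RGB"), dtype=np.int16)
capture = np.asarray(Image.open("pattern_capture.png").convert("RGB"), dtype=np.int16)

if source.shape != capture.shape:
    raise SystemExit("Resolution mismatch -- capture at the pattern's native size.")

diff = np.abs(source - capture)                 # per-channel deviation
altered = np.count_nonzero(diff.max(axis=2))    # pixels changed in any channel
total = source.shape[0] * source.shape[1]

print(f"Altered pixels: {altered} of {total} ({100.0 * altered / total:.3f}%)")
print(f"Max per-channel deviation: {diff.max()}")
print("Bitperfect" if altered == 0 else "NOT bitperfect (dithering or processing present)")

      If the driver were truly transparent, the altered-pixel count would be zero for an 8-bit pattern sent over an 8-bit RGB Full Range link; even a one-code-value deviation on a handful of pixels indicates dithering or LUT rounding somewhere in the output path.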