I am new here, but I believe I have a genuine question that I could not find an answer to on these forums using search.
I am writing a C++ application using ADL SDK 13 that can tweak GPU frequency and voltage, similar to the tuning feature in the 2020 Radeon Software. I based my code on the display-library/Overdrive8.cpp at master · GPUOpen-LibrariesAndSDKs/display-library · GitHub example, and it works: I am able to set individual values for any of the frequency or voltage settings. What I am missing is auto-adjustment, as in Radeon Software, where moving the frequency slider automatically tunes the voltage. I tried percentage scaling, but immediately ran into the problem that the frequency and voltage limits do not correspond: the minimum voltage setting does not imply the minimum frequency, because the GPU can sustain a better-than-minimum frequency at the minimum voltage. The same applies at the upper limit: the maximum frequency can be reached with less than the maximum voltage. In the attached image you can see that on my card the maximum possible frequency is achieved at just 60% of the voltage range, and I imagine those proportions are different on every other GPU.
This leads to my question: how can I learn or calculate those proportions so I can implement a similar auto-adjust for any given GPU?
I have just noticed that version 14 of ADL has been released, but from a brief look at the changes I did not see anything related that could help me. I will update ASAP anyway.