In any game, Radeon won't follow my custom voltage curve. This is true whether I'm running the global tuning profile or a game-specific profile. It applies all my other settings, just not voltage. I cap my voltage at 1050mV, and it had been running that way for almost a year. Now, when I'm in a game, it ignores my settings and runs the factory voltage, which hovers between 1100mV and ~1170mV most of the time.
This issue started when I updated to 19.12.2 the day it was released. Something similar happened once a couple of months ago, but I fixed it by resetting all of my Wattman profiles, if I remember correctly. I updated to 19.12.3 today and still have the same issue, though I did a little more testing.
Confirmed in one way or another:
-Memory clock applies
-Memory timing applies
-Fan curve applies
-Core clock applies
Games I've tested it in, all unsuccessfully: Witcher 3, Skyrim, Elite: Dangerous, and Unigine Heaven 4.0
Things I've tried:
-Using game profiles (with specific clocks so I knew which one was being applied)
-Deleting all tuning profiles except the global profile.
-Changing display settings (Enhanced Sync, image sharpening...).
Interestingly, the voltage curve I'm requesting DOES apply when running Realbench, where the GPU caps at 1050mV as it should. So at least under a compute workload it honors the 1050mV limit, for some reason.
My specs:
-Ryzen 2600, PBO2
-Asus ROG B450-I
-2x G.Skill 8GB 3200MHz RAM
-Sapphire RX 580 Nitro+
-Corsair SF600 Gold
-Windows 10 1909
I use HWiNFO to monitor everything. The voltage reported (factory curve) is corroborated by a significant increase in heat and power draw compared to my undervolt. The games still run fine, but I have to run the fans at a significantly higher RPM to keep temps around 70C.
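For a rough sense of why the extra voltage shows up so clearly in heat and power draw: dynamic power scales roughly with the square of voltage at a given clock, so the jump from 1050mV to ~1170mV is not trivial. A quick back-of-envelope sketch (the V² relation is an approximation; it ignores leakage current and any clock differences):

```python
# Rough estimate of extra dynamic power on the factory voltage
# curve versus a 1050 mV undervolt. Dynamic power ~ V^2 at a
# fixed clock (approximation; ignores leakage and clock changes).

undervolt_mv = 1050
factory_mv = 1170  # upper end of what HWiNFO reports in games

ratio = (factory_mv / undervolt_mv) ** 2
print(f"~{(ratio - 1) * 100:.0f}% more dynamic power at {factory_mv} mV")
# → ~24% more dynamic power at 1170 mV
```

Roughly a quarter more dynamic power, which lines up with needing noticeably higher fan RPM to hold 70C.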
Any help or insight would be greatly appreciated. I may end up just rolling back to an older driver if I can't get it sorted. Thanks for reading.