I am developing under Ubuntu, using a Radeon RX 5500 XT with the amdgpu-pro driver (version 20.30-1109583).
I was puzzled by the lower-than-expected performance, so I decided to investigate.
I must say, a lot of the AMD tools are deprecated, no longer supported, or lack OpenGL support.
Anyways... I did find ROCm SMI.
Thanks for sharing that!
When I disable vsync in my game, in order to load the GPU as much as possible, I get this report back from ROCm SMI:
========================ROCm System Management Interface========================
GPU  Temp   AvgPwr  SCLK     MCLK    Fan    Perf          PwrCap  VRAM%  GPU%
0    68.0c  38.0W   1185Mhz  875Mhz  20.0%  profile_peak  120.0W  26%    99%
==============================End of ROCm SMI Log===============================
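To put those numbers in perspective, here is a rough sketch that pulls the AvgPwr and PwrCap fields out of that rocm-smi row and expresses the draw as a fraction of the cap (the parsing is specific to the column layout above, and `power_percent` is just a name I made up; other rocm-smi versions format their output differently):

```python
def power_percent(row: str) -> float:
    """Return AvgPwr as a percentage of PwrCap for one rocm-smi row."""
    fields = row.split()
    avg_pwr = float(fields[2].rstrip("W"))   # AvgPwr column, e.g. "38.0W"
    pwr_cap = float(fields[7].rstrip("W"))   # PwrCap column, e.g. "120.0W"
    return 100.0 * avg_pwr / pwr_cap

# The row from the report above:
row = "0 68.0c 38.0W 1185Mhz 875Mhz 20.0% profile_peak 120.0W 26% 99%"
print(f"{power_percent(row):.1f}% of the power cap")  # about 31.7%
```

So the card is drawing roughly a third of its power budget while reporting 99% load.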
Now, onto my question: what puzzles me is that the GPU is reportedly at a load of 99%. Good.
But how come it is drawing only 38 W of the 120 W power cap?
Do I have an inefficiency in my OpenGL code that keeps the GPU 99% busy but, somehow, doesn't make it work at full capacity? (I render at 2560x1600 with MSAA and no vsync, so I would expect a lot of power draw.)
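One thing I notice in the SMI report: SCLK sits at 1185 MHz, well below the card's advertised boost clock (about 1845 MHz for the RX 5500 XT, if I have the spec sheet right). A quick back-of-the-envelope:

```python
# Back-of-the-envelope: how far below the advertised boost clock is the
# observed shader clock? 1845 MHz is AMD's published boost figure for
# the RX 5500 XT (my assumption; double-check against the spec sheet).
observed_sclk_mhz = 1185.0   # SCLK from the rocm-smi report above
boost_sclk_mhz = 1845.0      # advertised boost clock (assumed)

ratio = observed_sclk_mhz / boost_sclk_mhz
print(f"running at {ratio:.0%} of boost clock")  # roughly 64%
```

If that boost figure is right, the card is running at roughly two-thirds of its peak clock even while reporting 99% load, which would explain low power draw if the "load" metric only means some engine had work queued, not that every ALU was saturated.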
Relevant glxinfo output:
OpenGL vendor string: Advanced Micro Devices, Inc.
OpenGL renderer string: Radeon RX 5500 XT
OpenGL core profile version string: 4.6.14752 Core Profile Context 20.30