In overclocking/undervolting videos for Radeon cards (specifically for my RX 6800 XT), I keep seeing instructions to raise the GPU's Minimum Frequency from its default 500 MHz to roughly 100 MHz below whatever Maximum Frequency they're setting. So if they set the Maximum Frequency to 2500 MHz, they'd set the Minimum Frequency to 2400 MHz. Why?
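For context, these are the same two knobs that the amdgpu driver exposes through sysfs on Linux. This is just a sketch of what the videos' advice (min = max − 100 MHz) would look like there; it assumes the card shows up as card0 on a single-GPU system, and it is not a recommendation:

```shell
# Sketch: setting min/max GPU core clocks via the amdgpu sysfs interface.
# Assumes the RX 6800 XT is card0; requires root.

# Switch the driver to manual performance-level control
echo manual > /sys/class/drm/card0/device/power_dpm_force_performance_level

# In the OD_SCLK table, index 0 is the minimum clock and index 1 the
# maximum (both in MHz) -- mirroring the videos' min = max - 100 advice
echo "s 0 2400" > /sys/class/drm/card0/device/pp_od_clk_voltage
echo "s 1 2500" > /sys/class/drm/card0/device/pp_od_clk_voltage

# Commit the new values
echo "c" > /sys/class/drm/card0/device/pp_od_clk_voltage
```

On Windows, the equivalent controls are the Min/Max Frequency sliders on the Tuning tab of Radeon Software, which is what these videos are adjusting.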
On the last (similar) video I saw for that:
https://youtu.be/mH9NGfZKU9I
I actually asked about it:
"9:30 I don't understand why I'd want to increase my Minimum Frequency. Why wouldn't I leave it at its default 500 MHz? Why would I want to have my card chugging away at 2300-2400 MHz (minimum) if the game doesn't use enough of a processing load to cause the card to ramp up its frequency on its own?"
The response I got was that it's meant to help with CPU-bound games. To which I responded:
"I'm sorry, but I still don't understand. If the game is CPU bound, what difference would the GPU frequency make? The game will still be performing badly regardless of the GPU. Also, wouldn't something like setting the minimum framerate under Radeon Chill on the Graphics tab work better at keeping the GPU properly active?"
Can anyone explain this better? YouTube comment sections aren't exactly a good venue for technical discussion.