May I ask, what gave you that assumption?
GPU fan speed, as far as I know, is directly related to GPU temperature. Granted, the higher the MHz, the more power the GPU uses and the hotter it gets, so the fan speeds up to keep it cool.
I can't see why fan speed would depend directly on MHz, which can fluctuate rapidly depending on what the card is doing. Temperature, by contrast, doesn't fluctuate as fast and may take a few seconds to rise or fall.
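To make that concrete, a plain temperature-based fan curve looks something like this. This is a rough sketch with made-up curve points, not any card's actual defaults:

```python
# Hypothetical fan curve: (temperature in C, fan speed in %).
# Points are placeholders for illustration only.
FAN_CURVE = [(40, 0), (60, 40), (80, 100)]

def fan_percent(temp_c: float) -> float:
    """Linearly interpolate fan speed from temperature alone -- no MHz involved."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if temp_c <= t1:
            # Interpolate between the two surrounding curve points.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return FAN_CURVE[-1][1]
```

Notice the clock speed never appears: a spike in MHz only matters once it actually shows up as heat.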
Wattman is going to look different for different families of cards. The frequency component of the fan tuning was a noise-related function that interacted with the zero-RPM feature on Polaris cards, and possibly others. It basically cut the fans on when the GPU hit or passed a frequency threshold while it was also above a certain temperature.
I think 50 C was a default for that trigger temperature in cool-and-quiet mode out of the box on some cards.
The fan curve had, or has, a temperature target which also affects the fan cut-on. When used together with the frequency/noise threshold, it can push that cut-on trigger temperature down. It was confusing then, and if it's still done like that, it still is.
Lowering the frequency slider (the noise-related slider below the fan speed spread) and lowering the target temperature in the temperature spread will cut the fans on sooner and at lower temperatures.
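Putting the pieces together, here's a rough sketch of that cut-on interaction as I understood it. The threshold values are assumptions for illustration, not AMD's actual defaults, and the real driver logic may differ:

```python
def fans_should_spin(clock_mhz: float, temp_c: float,
                     freq_threshold_mhz: float = 1200.0,  # "noise" slider (assumed value)
                     trigger_temp_c: float = 50.0,        # cut-on trigger temperature
                     target_temp_c: float = 75.0) -> bool:
    """Return True when zero-RPM mode should end and the fans cut on."""
    # Fans cut on once the GPU passes the frequency threshold
    # *and* is above the trigger temperature...
    if clock_mhz >= freq_threshold_mhz and temp_c >= trigger_temp_c:
        return True
    # ...and the fan curve's temperature target forces them on
    # regardless of clock speed.
    if temp_c >= target_temp_c:
        return True
    return False
```

In this model, dragging either the frequency slider or the target temperature down makes the first condition trip earlier, which matches the behavior described above.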
These are aspects of Wattman as I remember them from when I had a 480, about a year of driver releases ago. I don't know what it looks like now for those cards.