I don't need FPS higher than 30.
I use Shadow of the Tomb Raider to test it. I start a "New game", then watch the opening cut scenes and exit the game at the same point each time.
No Radeon Chill or FRTC enabled: Max temp is 67 degrees, max watts is 107.
Radeon Chill (Min=30, Max=30): Max temp is 55 degrees, max watts is 73.
FRTC (30 FPS): Max temp is 65 degrees, max watts is 102.
I tested both Radeon Chill and FRTC, but strangely FRTC barely chills the GPU or lowers power usage at all. I find that really weird, especially since Radeon Chill does more work by trying to determine whether I am active in the game. FRTC is the simpler feature: the GPU no longer produces crazy FPS, yet it heats up and eats watts almost as if there were no frame limiter at all.
Maybe there is something wrong with FRTC in the latest driver? Please check it.
P.S. I use Radeon RX 570 by the way.
I think Chill lowers the number of frames sent from the CPU to the GPU and synchronises the frame buffer output as well.
In other words, the entire CPU → GPU pipeline slows down.
Therefore power is saved.
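To make that mental model concrete, here is a toy sketch of how an input-aware limiter like Chill might pace the whole loop. All names and numbers here are my own illustration, not AMD's actual driver code:

```python
import time

def chill_frame_limit(input_active, fps_active=60, fps_idle=30):
    # Pick the frame-time target from recent input, the way Chill is
    # described: busy input -> higher FPS cap, idle -> lower FPS cap.
    target_fps = fps_active if input_active else fps_idle
    return 1.0 / target_fps

def run_frames(activity, fps_active=60, fps_idle=30):
    # Sleep the whole CPU -> GPU loop up to the target frame time, so
    # simulation and rendering both slow down and power drops with them.
    frame_times = []
    for active in activity:
        target = chill_frame_limit(active, fps_active, fps_idle)
        start = time.perf_counter()
        # ... game update + render submission would happen here ...
        elapsed = time.perf_counter() - start
        if elapsed < target:
            time.sleep(target - elapsed)  # the entire pipeline idles here
        frame_times.append(time.perf_counter() - start)
    return frame_times
```

Because the wait happens before the next frame is even started, nothing upstream of the GPU keeps spinning, which would explain the large power drop in the Chill test.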
RE: I find that really weird (especially since Radeon Chill does more work by trying to determine whether I am active in the game)
You are correct in that assumption - Chill polls certain keys and mouse input.
Turning Chill on without limiting the real in-game FPS can actually increase GPU power in some games I have tested in the past.
I think FRTC just throttles the "back end" of the GPU pipeline down to the frame buffer output rate, though.
I think that is why it saves less power.
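A toy sketch of that difference: a back-end-only cap paces just the present, while the front end keeps running flat out. Again, this is purely my own illustration under that assumption, not AMD's code:

```python
import time

def frtc_present_loop(n_frames, cap_fps=30, sim_work=lambda: None):
    # Back-end-only cap: simulation / render submission runs as fast as
    # it can; only the present (frame buffer output) is paced to cap_fps.
    interval = 1.0 / cap_fps
    next_present = time.perf_counter()
    sims = 0       # how many times the front end ran
    presented = 0  # how many frames actually reached the screen
    while presented < n_frames:
        sim_work()  # the front end keeps spinning -> little power saved
        sims += 1
        if time.perf_counter() >= next_present:
            presented += 1              # only the output rate is limited
            next_present += interval
    return sims, presented
```

If the driver works anything like this, the GPU and CPU stay busy between presents, which would fit your numbers: FPS capped at 30, but temperature and power barely lower than with no limiter.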
It might be interesting to look at your CPU utilization, with the CPU running at a fixed frequency.
Or perhaps you are correct and FRTC is indeed "broken" from a power perspective.
We can only make certain guesses as we did not write the code for FRTC or Chill, and can only test it.
I am just glad Global FRTC is back in the drivers, because I need it to limit mouse FPS and power when I use Chill and run games with in-game Vsync off.