Radeon Chill is designed to adjust the game’s frame rate to correspond to what’s happening on-screen. Imagine a situation where you’re playing a game and either alt-tab out to do something else, or simply have to get up and leave the keyboard. Alternatively, you might be crafting, playing a mini-game, or sorting through your inventory. Either way, there’s not much going on at this particular point. Ordinarily, your GPU will simply render the highest frame rate it can, regardless of whether those frames are actually being put to any kind of use.
Radeon Chill is, at least in theory, a way to get back some of the power you’re otherwise wasting on rendering 100+ frames per second of what amounts to a mostly still life. It’s fully compatible with AMD’s FreeSync technology, and AMD claims it can improve frame-rate responsiveness, not just reduce power consumption.
There are some caveats right now. Radeon Chill is currently only DX9 / DX11 compatible, and supported games are manually whitelisted for inclusion (AMD doesn’t enable this feature by default, in other words). That’s probably for the best given how new it is, but the list of titles it supports is at least fairly inclusive, with a solid number of top-tier titles from the past few years. Chill targets 40 FPS when there’s not much going on in-game and 60 FPS otherwise, but how does performance shake out, and can Radeon Chill actually improve GPU responsiveness?
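The behavior described above, throttling toward 40 FPS when the scene is quiet and 60 FPS when it isn’t, can be sketched as a simple activity-based frame limiter. This is purely a conceptual illustration, not AMD’s driver-level implementation; the FPS constants and function names are assumptions for the sake of the example.

```python
import time

# Assumed targets, mirroring the 40/60 FPS split described in the article.
IDLE_FPS = 40
ACTIVE_FPS = 60

def target_frame_time(input_active: bool) -> float:
    """Pick a per-frame time budget based on recent input activity."""
    fps = ACTIVE_FPS if input_active else IDLE_FPS
    return 1.0 / fps

def pace_frame(frame_start: float, input_active: bool) -> None:
    """Sleep off whatever remains of this frame's time budget."""
    budget = target_frame_time(input_active)
    elapsed = time.monotonic() - frame_start
    if elapsed < budget:
        time.sleep(budget - elapsed)
```

A game loop would call `pace_frame()` once per frame, passing a flag derived from mouse/keyboard activity; the real feature does something analogous inside the driver, where it can also react to on-screen motion rather than raw input alone.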