Graphics Cards

Adept II

(7900 XTX) High idle power and high VRAM speed at idle

In the picture below you can see that with a single 1440p 32" monitor, the card draws 100 watts at idle, occasionally jumping to 107 watts. I have a lot of experience, but I've never seen anything like this. Is there a temporary workaround? I've tried everything you can find on YouTube, Reddit, or from talking to people firsthand, and nothing has fixed it. VRAM is also stuck at 2487 MHz at idle. No GPU tasks or third-party applications are open in the background either, just the Windows wallpaper and some desktop icons. The monitor is 170Hz, but turning it down to 60Hz made literally zero difference.

I also tried it on a 7900 XT and it was still at 100 watts. When I unplug the monitor, the power draw drops drastically.


Monitor model: VG32AQL1A

EDIT: Also having the same problem on another monitor, Alienware AW2723DF.

Screenshot 2023-08-20 173152.png

62 Replies

Do you notice any performance issues when you lower the power limit in Afterburner? I am going to have to try this! I haven't seen this recommendation before.


Yeah, that doesn't work either.

Like others are echoing, if I have to jump through all those hoops in order to use a $1000 graphics card, the solution is going to be "buy a different brand next time". This needs to be dealt with in a plug-and-play way.

Adept I

I have a PowerColor 7900 XTX Hellhound hooked up to a TCL 4K TV. When I set the refresh rate to 144Hz it uses a lot of power at idle - always over 50W, usually 75-90W. But setting it to 120Hz drops idle power down to sometimes below 20W.

While this is an acceptable fix for me, since I tend to like games with very complex graphics that can't run over 120Hz anyway, it's not going to be for everyone.

I would think that, given this knowledge, AMD would be able to figure out what differs between the two refresh rates and do something to drop consumption at 144Hz.
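One piece of that difference is easy to put numbers on (my own back-of-envelope reasoning, not anything AMD has confirmed): the pixel data the card must scan out per second grows linearly with refresh rate, and the jump from 120Hz to 144Hz at 4K is substantial. A quick sketch, ignoring blanking intervals and link-encoding overhead:

```python
# Back-of-envelope pixel data rate for a 4K desktop at different
# refresh rates. Ignores blanking and link encoding overhead, so
# these are illustrative numbers only, not real cable bit rates.

def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel data rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

for hz in (60, 120, 144):
    rate = pixel_rate_gbps(3840, 2160, hz)
    print(f"4K @ {hz:>3} Hz: {rate:5.1f} Gbit/s")
```

This only shows that the display engine works meaningfully harder at 144Hz; it doesn't explain why the VRAM clock behaves so differently between the two rates.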

Adept I

It might do some good to uninstall the AMD driver and all monitors with DDU and then reinstall the driver. I had the same issue on a 4K monitor running at 144Hz but now turning it down to 120Hz drops the idle power down to like 20w.


Already did, I don't know how many times. Doesn't work.

Using DDU would have ZERO, none, zip, nada to do with this issue, which has been present since day one. lol... And again, I'm not ever, EVER turning my gear down because of their issue. Ever.

LOL, DDU is the fix now.


Every driver update I've ever done since this card was released (bought it day one) has been done with DDU. Running DDU isn't the fix, dude. In fact, it would have zero impact. Zero.

And, as said above, why did you buy a 144Hz monitor if you're turning it down? Good, and glad you are happy... I refuse to band-aid the fix and neuter my monitor. There isn't a world in which I would lower my frame rate, especially after buying a card to get high frame rates. Glad you're content and happy... the other 99% who have the issue are pissed off. So you do you; we will do us. Also, unplugging a monitor or lowering settings isn't a fix. So, once again, your system isn't mine. And I'll never turn my hardware down because of a software **bleep** up. But cheers! And thanks... well, for nothing.

(also, I'm shocked you are fine with that lol, but hey, you do you)

Journeyman III

I have the same issue on a TCL Series 7 4K 144Hz.

I have the Sapphire Nitro+ version and get similar consumption regardless of the BIOS preset I choose (the power-saver setting makes no noticeable difference at idle).

I've tried (unsuccessfully) to use the CRU workaround but only managed to bork my Windows installation. Custom resolutions/refresh rates don't affect the usage. Even running at 30Hz does not reduce consumption below 60W, down from 100W.
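For anyone wondering why the CRU workaround helps some setups at all: the commonly repeated explanation (community lore, not an official AMD statement) is that VRAM can only be reclocked during the vertical blanking interval, and the tight reduced-blanking timings used at high refresh rates leave very little of it. A rough sketch of that idea, with illustrative CVT-style timing numbers rather than measured ones:

```python
# Sketch: time available in vertical blanking per frame.
# Community explanation (unconfirmed by AMD): if this window is too
# short for a VRAM retrain, the driver pins VRAM at full speed.
# Timing numbers below are illustrative, not measured.

def vblank_us(h_total, v_total, v_active, refresh_hz):
    """Vertical blanking duration per frame, in microseconds."""
    pixel_clock = h_total * v_total * refresh_hz   # pixels per second
    blank_lines = v_total - v_active
    return blank_lines * h_total / pixel_clock * 1e6

# 1440p @ 144 Hz with tight reduced blanking: few blank lines
stock = vblank_us(h_total=2720, v_total=1481, v_active=1440, refresh_hz=144)

# Same mode after stretching vertical blanking in CRU
extended = vblank_us(h_total=2720, v_total=1600, v_active=1440, refresh_hz=144)

print(f"stock vblank:    {stock:.0f} us")
print(f"extended vblank: {extended:.0f} us")
```

The usual CRU advice amounts to increasing `v_total` (more blank lines), which stretches that window severalfold at the same refresh rate.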

I've seen other posts on Reddit from people with the exact same graphics card, CPU, resolution, and refresh rate whose idle usage is at most 20-30W.


Does anyone have a valid workaround? Is there any way to reach out directly to AMD? I've never seen them reply on this topic before.

It feels especially hopeless since I've seen other people having this issue for a year now; is there an ETA, or any acknowledgment/work-in-progress announcement from AMD?


For the record - GPU-Z is showing much lower consumption than Adrenalin (10-40W compared to 70-100W).

You can notice when I turn on Adrenalin in the graph here: 


Could it be just the Adrenalin software itself loading the GPU during the measurement process?!


Here's how the same graph looks after I send Adrenalin back to the tray. The blocky/bulky part of the graph coincides exactly with the time the software was maximized/out of the system tray.

I'd be curious to find out if any of you see the same behavior.


Running the latest Adrenalin driver btw.


Scratch that - I had my refresh rate set to 60Hz. It seems there's no difference in consumption when watching YouTube at any refresh rate (it consumes 60-90W), yet if I set it to anything higher than 60Hz, it won't consume less than 50W even when AFK at the desktop. The 10W figure above was achieved while AFK at the desktop, with Adrenalin closed and at 60Hz.


I tried this, and for my PC it doesn't matter which one I use for showing the wattage. Both Adrenalin and GPU-Z show 75-80W if I don't use CRU.

Before someone asks: yes, I only used one of the two programs at a time.

For me, CRU works well, so I gave up trying anything else. For users for whom CRU doesn't work, that's no consolation.

Adept II

I see that this still has not been fixed. Back in January I returned my RX 7900 XT, bought an RTX 4070 Ti, and have no regrets.

Everything works, no crashes, and idle power is around 30W. Performance is pretty much the same; I see no difference when I don't use DLSS.


I just love AMD's raw power for the price, but this driver optimization is disappointing. Hoping the Radeon department works more on this.

Journeyman III

I'm not sure if I'm too late, but I was having the same issue and realized it was the screen recording used by Adrenalin. I've turned off desktop recording in the Adrenalin software, and now my VRAM speeds have dropped and the card only uses about 10-20W when idle. Hope this helps.


How do you turn it off?


Probably under Record & Stream, then "Record Desktop". Mine has never been on, and I'm still at 95 watts at idle (even on the newest driver).


And for me, even with the newest driver, still at 95W at idle. Sooo cool. Love it.


yeah, mine too.


I got a new monitor in the past week with better HDR. It's 4K, 10-bit color, and 144Hz. I set the desktop refresh rate to 120Hz for better motion handling in games. Then I finally got around to enabling HDR in Windows.

Then I noticed the memory on my 7900 XTX would not clock down and was always at 24xx MHz. This drove power usage to near 100W. Turning off HDR would allow memory clocks to drop.

Eventually, I figured out that memory would downclock with HDR enabled if I set the desktop refresh rate to its maximum of 144Hz. No idea why there is such a large difference in power usage between 120Hz and 144Hz on the desktop when I only use the one monitor.

Adept I

24.2.1 released today, but this problem still persists.

7900 XTX - two VP2785-4K monitors (both running at 60Hz) and one PS341WU (5K2K @ 60Hz).

If only two monitors are connected, regardless of model, the VRAM clock stays at 909MHz (~60W); with all three monitors connected, VRAM locks at full speed (2487MHz, ~90W).

And if only one monitor is connected, VRAM runs at less than 30MHz (14W).

Setting every monitor to 30Hz doesn't affect this problem.

Journeyman III

This is a nightmare, and I am really kicking myself for 'experimenting' with AMD (I put it in quotes because it's supposed to be a product that's ready for market).

I am using the same monitors and cables that I used for my 2080 and had no issues with power consumption or overly clocked VRAM. I am a total idiot on the complexities of display cables and timing windows, etc. All I know is that my consumption was about a third of this with the exact same hardware (the only thing that changed was my GPU). If NVIDIA "just works", then this can too... what is AMD missing?

My solution is to unplug every extra monitor and clock my main monitor to 120Hz...

- saved $400 on a GPU

- cannot use my 3 x $1000 monitors

I would need to be paid at least $2000 for AMD to be a good deal.


I'm wondering if AMD's GPU software has an issue with how monitor refresh rates are structured in Windows.

If you use the CRU app you can see how the data is structured.

I have two monitors, both 144Hz and both G-Sync monitors, but my main monitor has G-Sync Ultimate or whatever they call it now, so it can use FreeSync.

The two monitors' refresh data is structured differently in Windows, as shown in CRU. They use the extension blocks differently too. I've never understood why it's like that. One shows 60Hz as the main refresh rate while the other shows 85Hz in the main section.
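For anyone curious what that structure actually is: what CRU displays comes from the monitor's EDID, where each detailed timing is an 18-byte descriptor with a documented byte layout. A minimal sketch of decoding the relevant fields (the sample bytes are the standard CEA-861 1080p60 timing, used purely as an example, not one of the monitors in this thread):

```python
def parse_dtd(dtd: bytes):
    """Decode the basic fields of an 18-byte EDID detailed timing
    descriptor: pixel clock plus horizontal/vertical active and
    blanking, from which the refresh rate follows."""
    pclk_hz = int.from_bytes(dtd[0:2], "little") * 10_000  # stored in 10 kHz units
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    h_blank  = dtd[3] | ((dtd[4] & 0x0F) << 8)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    v_blank  = dtd[6] | ((dtd[7] & 0x0F) << 8)
    refresh = pclk_hz / ((h_active + h_blank) * (v_active + v_blank))
    return {"pixel_clock_hz": pclk_hz, "h_active": h_active,
            "h_blank": h_blank, "v_active": v_active,
            "v_blank": v_blank, "refresh_hz": refresh}

# Example: CEA-861 1080p60 (148.5 MHz pixel clock, 1920+280 x 1080+45)
sample = bytes([0x02, 0x3A, 0x80, 0x18, 0x71, 0x38, 0x2D, 0x40]) + bytes(10)
info = parse_dtd(sample)
print(info)
```

Two monitors can report the same "144Hz" while carrying quite different blanking values in these descriptors, which is one reason identical-looking refresh rates can behave differently.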


Using only my main ultrawide with FreeSync, idle power drops to 8-12W.

If I turn off FreeSync, it jumps to 50W at idle.

Turn on my second monitor at 144Hz and idle power is 100-120W.

Turn my second monitor down to 60Hz and idle power drops to 50W.

Custom resolutions don't work on my monitors at all. I can drop them by 1Hz and that's it, but idle power draw stays the same.

It's the different refresh rates, and maybe how Windows exposes them to the power drivers, that seems to be the issue. The power draw is directly affected by the refresh rates.