This is an issue I've seen mentioned sporadically since 2023, but no one has really gone into detail about what it is or how it appears to work.
So, here's my detailed explanation of the issue, and why I hope AMD fixes it in a newer driver release, because drawing 100 W at idle non-stop just because I have 3 monitors is insane. Not only does it waste electricity, it generates unnecessary heat and puts unnecessary wear on the graphics card for no reason at all.
So, let's get to what I've figured out about it so far.
With a single 1440p 170 Hz monitor, power consumption is entirely fine: 30-50 W idle (it seems to hover around there). Whether 30-50 W idle is actually normal, I have no idea.
All 3 of my monitors run off DisplayPort cables.
The moment I turn on my 2nd monitor (1080p 165 Hz, btw), idle power jumps to around 100 W.
Turning the refresh rate down to 144 Hz seems to sort of fix the issue? No idea why, though.
It will not let me use 165 Hz on my 2nd monitor at all without pushing 100 W into the GPU for no reason.
This setup uses the 1440p 170 Hz & 1080p 165 Hz monitors; the 165 Hz monitor was set to 144 Hz to get that result.
And as soon as I turn on my 3rd monitor, which is 1440p 165 Hz, we're back to drawing 98-100 W again.
The only way I've figured out to solve this is the following setup (a quick way to double-check what modes Windows is actually running is sketched below):
1st monitor: 1440p 144 Hz
2nd monitor: 1080p 60 Hz
3rd monitor: 1440p 60 Hz
Which gives me 50 W idle usage again.
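Since this whole workaround hinges on which exact mode each monitor is actually running, here's a minimal sketch for double-checking that (Windows-only, Python with ctypes calling the Win32 EnumDisplayDevices / EnumDisplaySettings APIs; the struct below is the standard truncated DEVMODE layout, nothing monitor-specific):

```python
# Minimal sketch: print the current resolution and refresh rate of each
# display attached to the desktop, via Win32 EnumDisplayDevices /
# EnumDisplaySettings. Windows-only.
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

ENUM_CURRENT_SETTINGS = -1
DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

class DEVMODEW(ctypes.Structure):
    # Truncated DEVMODE: dmSize tells Windows which struct version this is,
    # so stopping after dmDisplayFrequency is valid for display queries.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmUnion", ctypes.c_byte * 16),  # position/orientation union, unused here
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

i = 0
dd = DISPLAY_DEVICEW()
dd.cb = ctypes.sizeof(dd)
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
    if dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
        dm = DEVMODEW()
        dm.dmSize = ctypes.sizeof(dm)
        if user32.EnumDisplaySettingsW(dd.DeviceName, ENUM_CURRENT_SETTINGS,
                                       ctypes.byref(dm)):
            print(f"{dd.DeviceName}: {dm.dmPelsWidth}x{dm.dmPelsHeight} "
                  f"@ {dm.dmDisplayFrequency} Hz")
    i += 1
    dd = DISPLAY_DEVICEW()
    dd.cb = ctypes.sizeof(dd)
```

If the refresh rates printed here don't match what you set in Adrenalin, that can be a sign Windows silently fell back to a different mode.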
Strangely, if I set my 2nd monitor to exactly 100 Hz, power consumption idles around 70-80 W. That's the only configuration I've found that lands in between, and I don't know why it does.
If anyone has any suggestions on how I could possibly fix this, please let me know. I'm positive it's an AMD driver issue, though; I've been experiencing it ever since I got my RX 7900 XTX back in March 2023.
My exact GPU model is the XFX Speedster MERC310 RX 7900 XTX, model number RX-79XMERCB9.
I run the latest version of Windows 11 Pro.
Yep, welcome to 2022.
My launch-day card, on 2 Samsung 1440p 240 Hz monitors, still pulls dumb power just idling at the Windows desktop. It's pulling 100 W just typing this, while my other rig (another 7900 XTX) on a 4K TV and a 1080p 144 Hz monitor pulls single digits. They have said it's "fixed" for well over a year, but for my one rig it's not, and I just gave up. There are a billion threads, just on this site, about the "issue" and "fixes" that sometimes do, and mostly don't, help. Like trying a custom resolution (in Adrenalin, go to the gear top right, then Display, and set a custom resolution; I've heard that works for many, but it didn't for me on my Samsung monitors).
Cheers.
Just checked mine: RX 7900 XTX TUF OC
2x 27" BenQ monitors
1440p 144 Hz
1080p 60 Hz
Windows 10
The issue only seems to happen if your 2nd monitor's refresh rate is above 144 Hz; if it's not, the issue won't present itself. At least, the card will only draw around 20-50 W, based on my experience above.
With the 1440p 144 Hz & 1080p 60 Hz configuration, I get the exact same results you do.
Yep, been there, done that. Even a single monitor can do it, and when you add multiple monitors with different refresh rates, well, you see what happens. It's due to the monitor timings not allowing the memory to fully clock down. Supposedly you can try CRU and create resolutions with more relaxed timings; I did not have success doing that. I currently have one monitor (3440x1440 @ 240 Hz), and if I turn off FreeSync, I get the high power consumption.
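To put some rough numbers on the "timings not letting memory clock down" theory: VRAM clock switches are generally timed to land inside the vertical blanking window so they don't corrupt the scanout, and if that window is too short the driver just keeps memory at full clock. A back-of-the-envelope sketch, using illustrative CVT-reduced-blanking-style figures rather than anything read from a real EDID:

```python
# Rough sketch: how long is the vertical blanking window for a given mode?
# The common theory is that VRAM reclocking has to fit inside this window.
# Timing numbers below are illustrative, not from a real monitor's EDID.

def vblank_us(h_total, v_total, v_active, refresh_hz):
    """Vertical blanking duration in microseconds for one frame."""
    pixel_clock = h_total * v_total * refresh_hz   # pixels per second
    line_time_us = h_total / pixel_clock * 1e6     # one scanline, in us
    return (v_total - v_active) * line_time_us

# 2560x1440 @ 165 Hz with tight, reduced-blanking-style timings:
tight = vblank_us(h_total=2720, v_total=1481, v_active=1440, refresh_hz=165)
print(f"tight timings:   {tight:.0f} us of vblank")   # ~168 us

# Same mode with the vertical total padded out in CRU (the "relaxed
# timings" trick); note this also raises the required pixel clock:
padded = vblank_us(h_total=2720, v_total=1600, v_active=1440, refresh_hz=165)
print(f"relaxed timings: {padded:.0f} us of vblank")  # ~606 us
```

Padding the vertical total stretches that window severalfold, which is presumably why the CRU trick helps some setups and not others.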
You should not use Adrenalin to monitor your power consumption; it uses the GPU's 3D cores itself and makes the card draw some power. Use GPU-Z only to check the power draw.
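If you go the GPU-Z route, its sensor logging ("Log to file" on the Sensors tab) writes a comma-separated text file you can average over a real idle period instead of eyeballing the live reading. A small sketch for that; the file name is GPU-Z's default, and the "Board Power Draw [W]" column name is an assumption that varies by card and GPU-Z version, so check your log's header row and adjust:

```python
# Sketch: average idle power from a GPU-Z sensor log.
# Assumes "Log to file" was enabled in GPU-Z while the system sat idle.
import csv

LOG_PATH = "GPU-Z Sensor Log.txt"      # GPU-Z's default log name; adjust if needed
POWER_COLUMN = "Board Power Draw [W]"  # assumption: check your log's header row

readings = []
with open(LOG_PATH, newline="", encoding="utf-8", errors="replace") as f:
    reader = csv.DictReader(f)
    for row in reader:
        # GPU-Z pads header and field values with spaces, so strip them.
        row = {k.strip(): v.strip() for k, v in row.items() if k}
        try:
            readings.append(float(row[POWER_COLUMN]))
        except (KeyError, ValueError):
            continue  # skip malformed or non-numeric rows

if readings:
    print(f"{len(readings)} samples, "
          f"avg {sum(readings)/len(readings):.1f} W, "
          f"min {min(readings):.1f} W, max {max(readings):.1f} W")
```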
I have an RX 6900 XT and I've found a weird result in my dual-screen setup:
4K 59.94 Hz + 1080p 59.94 Hz = 6 W idle, very good
4K 60 Hz + 1080p 59.94 Hz = 34 W idle, what?! (see the quick math below)
4K 120 Hz + 1080p 59.94 Hz = 7 W idle, very good
Idle means: no YouTube, no 3D rendering, nothing happening on the screen, mouse cursor not moving.
Idle + Adrenalin open = 18 W, good enough
Idle + metrics overlay (from Adrenalin) = 16 W, good enough
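The wild part about that middle result is how close those two modes actually are. A quick sketch of the math, assuming standard CTA-861 4K timing totals of 4400 x 2250 (your monitor's EDID may use different blanking):

```python
# Sketch: how different are 4K 59.94 Hz and 4K 60 Hz, really?
# Assumes standard CTA-861 4K timing totals; a real EDID may differ.
h_total, v_total = 4400, 2250

for hz in (59.94, 60.0):
    pclk_mhz = h_total * v_total * hz / 1e6
    print(f"{hz} Hz -> pixel clock {pclk_mhz:.2f} MHz")

# ~593.41 MHz vs 594.00 MHz: about 0.1% apart, yet idle power differed
# by ~28 W above, which suggests a mode/threshold quirk in the driver's
# memory-clock logic rather than an actual bandwidth demand.
```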
This is a refresh-rate-only issue: if the refresh rate on your monitor does not exceed 60 Hz, you will not notice this problem.
Naw... nope, nada. Not on my end, anyway. I can drop both monitors to 60 Hz and still see high double digits. Cheers.
You are correct. From what I can see measurement-wise, comparing Adrenalin's real-time reading against a meter right at the card, the reading is "off" for sure. The high idle still sucks, but, again, I dgaf.
(This is with this page open, on 2x 1440p 240 Hz Samsung monitors, FYI.)
I've noticed the same thing, but considering my resolutions, refresh rates, and HDR, I understand it's a lot.
(4K 160 Hz HDR, 3.5K widescreen 100 Hz HDR, FreeSync disabled)
I'd think FreeSync should decrease the wattage when the Hz goes down, but I digress. I have worse issues and higher draw when I run those monitors from an NV 3080 Ti (I can't even get those refresh rates or proper HDR on NV).
I can lower both my Samsung monitors to 60 Hz and still have 100 W at idle. It's been that way since launch day. On my other rig (same gear) with a 4K 120 Hz and a 1080p 144 Hz monitor, it's at 15 W. For me, I think it has something to do with my Samsung monitors. And again, I just gave up.
Then again, I don't care. Never have. In fact, I want more.
ASRock card with their VBIOS; see GPU power (during a Fire Strike 1440p benchmark, scored 40,4xx). Also, see temps.
Custom loop.
So, I was over the "high idle power" issue about 9 seconds after buying the card and putting it in the system.