I am seeing some very high power consumption numbers on RX5700XT GPUs I am testing.
Can anyone tell me if AMD have improved Power Monitoring on RX5700XT versus what was done on Polaris GPUs?
Has current shunt monitoring of GPU input current, and therefore power, been added to any RX5700XT RDNA GPUs?
Is improved current monitoring going to be implemented on RDNA2 GPUs?
I cannot see how AMD GPUs will realistically compete on power consumption if it remains unmeasured and uncontrolled on the GPU PCB components.
As far as I understand it, Nvidia GPUs have been using current shunts to carefully monitor and control Total Board Power for a long time now, whereas AMD GPUs, such as Polaris, only have crude measurement of GPU Core input power. Everything else on an AMD GPU card is unmeasured and uncontrolled, such as VRM power, Memory Controller power, display output power, etc.
Buildzoid video here:
The differences in AMD's and Nvidia power monitoring - YouTube
This article is interesting, although it just looks at GPU Power consumption with default driver settings on AMD and Nvidia GPUs:
Graphics Card Power Consumption Tested: Which GPUs Slurp the Most Juice? | Tom's Hardware
Reality is though, most AMD GPU users will likely set fans to maximum and the power limit to +50% in an attempt to get more performance and match competing Nvidia GPU performance.
That should push the power consumption levels up for AMD Users significantly.
Here is an example of a PowerColor Red Dragon at stock GPU and Memory clocks running FurMark at 2160p, fans not even maxed out, with a +50% Power Limit.
The Radeon Performance Overlay reports 286 Watts of power input to the Navi 10 GPU die.
That number does not include Power Consumption on the rest of the RX5700XT Card.
It does not include power lost to VRM inefficiency, Memory Controller power, Display Output Power, Fan Power, and other miscellaneous losses.
The TDP specification for that PowerColor Red Dragon Card is 225 Watts:
PowerColor Red Dragon RX 5700 XT OC Specs | TechPowerUp GPU Database
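To illustrate why the core-only overlay reading understates total board power, here is a rough back-of-the-envelope estimate. Everything other than the 286 Watt overlay figure is an assumption for illustration (VRM efficiency, memory and fan draws vary by card and load), not a measurement:

```python
# Rough estimate of total board power from the reported GPU core power.
# All figures except the 286 W overlay reading are ASSUMED for illustration;
# real VRM efficiency and component draws vary by card.

gpu_core_power = 286.0   # W, reported by Radeon Performance Overlay
vrm_efficiency = 0.90    # assumed core VRM efficiency under heavy load
memory_power   = 30.0    # assumed GDDR6 + memory controller draw, W
fans_misc      = 10.0    # assumed fans, display outputs, misc, W

vrm_input = gpu_core_power / vrm_efficiency   # power at the VRM input side
total_board_power = vrm_input + memory_power + fans_misc

print(f"Estimated total board power: {total_board_power:.0f} W")  # ~358 W
```

Even with these fairly generous assumptions, the card would be drawing well above its 225 Watt TDP specification at the board level.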
I am running on a PC with an AX1600i PSU which uses Corsair iCUE software to monitor the Average Power Output from the PSU.
I can monitor the jump in power output caused by running the FurMark 2160p test.
Anyone want to guess how much the PSU power output jumps by?
Think "Vega Levels"
The interesting thing is that the card has PCI-SIG specification connectors which allow:
PCIe Slot = 75 Watts.
PCIe 6 Pin = 75 Watts.
PCIe 8 Pin = 150 Watts.
Total Power Input within PCI-SIG Specification = 300 Watts for this GPU.
Yet it is pulling 286 Watts through the GPU alone according to the Radeon Performance Overlay.
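Putting the connector ratings next to an assumed total board draw makes the problem obvious. The 300 Watt budget comes from the PCI-SIG connector ratings listed above; the 358 Watt draw figure is a hypothetical estimate for illustration, not a measurement:

```python
# Compare the card's PCI-SIG power budget against an assumed total draw.
# The budget figures are the connector ratings; the draw is a HYPOTHETICAL
# estimate used only to illustrate the headroom calculation.

budget = {"PCIe slot": 75, "6-pin": 75, "8-pin": 150}  # Watts each
total_budget = sum(budget.values())                     # 300 W

estimated_draw = 358  # W, assumed total board power for illustration
headroom = total_budget - estimated_draw

print(f"Budget: {total_budget} W, headroom: {headroom} W")
# Negative headroom means the card must exceed spec on at least one input.
```

If the real board draw is anywhere near that estimate, at least one of the three inputs has to run out of spec, and the slot is the input least able to tolerate it.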
Are RX5700XT's crashing and Blackscreening because users have been cranking up the Power Limit in Wattman and exceeding PCIe Slot Power like they did on RX480 Polaris years ago?