I purchased a WX 4100 planning to run 4x 4K (3840x2160) displays for a NOC workstation. They work well until the GPU comes under load; whether from 3D benchmarking or compute workloads like hashcat, I get flickering and visual distortion on all monitors that to my eye looks like digital noise, as if the cable or display output hardware were being overdriven. Turning off any one monitor immediately resolves the issue. I had also tried enabling 10-bit color on all 4 displays just for fun, but only 2 of them would hold the setting; the other 2 reverted to 8-bit. After searching the forums I found this workaround:
I set the refresh rate to 30 Hz and got no more visual artifacts under load, and all 4 monitors can now also be set to 10-bit color, so that completely resolved the symptoms. But 30 Hz has noticeable input lag compared to 60 Hz, probably because I have gotten used to the higher refresh rate. I understand the WX 4100 is an entry-level professional card and that I am running it at the edge of its limits, but I did my homework before purchasing, and it is absolutely running within its published spec. Needless to say, I find this very annoying and feel somewhat misled.
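The fact that both halving the refresh rate and the 10-bit behavior line up suggests a raw bandwidth ceiling somewhere in the display pipeline. As a rough back-of-envelope sketch (not an official AMD figure), the per-display data rate can be computed from the total pixel timing; the 4000x2222 totals below are approximate CVT-R2 reduced-blanking values for 3840x2160 and are my assumption, not something from the spec sheet:

```python
def display_gbps(h_total: int, v_total: int, refresh_hz: int, bits_per_channel: int) -> float:
    """Approximate uncompressed RGB data rate in Gbit/s for one display.

    h_total/v_total include blanking intervals (CVT-R2 assumed here).
    """
    pixel_clock = h_total * v_total * refresh_hz          # pixels per second
    bits_per_pixel = 3 * bits_per_channel                  # RGB, no alpha
    return pixel_clock * bits_per_pixel / 1e9

# Assumed CVT-R2 totals for 3840x2160: 4000x2222 including blanking.
print(display_gbps(4000, 2222, 60, 8))    # ~12.8 Gbit/s per display at 4K@60, 8-bit
print(display_gbps(4000, 2222, 30, 10))   # ~8.0 Gbit/s per display at 4K@30, 10-bit
```

Each link individually fits comfortably within a DisplayPort connection, so if numbers like these are roughly right, the artifacts under load would point at a shared bottleneck (display engine or memory bandwidth contention with the compute workload) rather than the cables themselves; that is speculation on my part, though, which is why I'm asking.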
My question is: since this WX 4100 isn't quite able to perform at spec when under load, would moving up to a WX 5100 allow stable performance under load with 4x 4K@60Hz monitors, even though it has an identical published resolution/refresh spec? I don't know if I can trust the spec sheet now, hence I'm asking here.
As a footnote, I respectfully request that AMD update the published spec sheet for the WX 4100 so it is accurate: 3x 4K@60Hz, 4x 4K@30Hz. Willfully neglecting to do so is knowingly deceiving your customers, who will end up wasting money or, worse, eating crow. I sell about 30 low-end Quadro cards in multi-monitor systems to my customers every month and decided to try the WX 4100 in my own system due to the 4GB of RAM, lower price point, and better compute performance. So far I have spent many hours troubleshooting what effectively comes down to a lie on paper. The only thing I'm happy about right now is that I didn't put this in front of a customer.
Enabling Radeon Pro Image Boost on all 4 monitors substantially reduces the glitching to almost tolerable levels (1-3 full-screen flickers every 10 seconds or so under full load) while maintaining near-identical image quality and better input responsiveness at 4K@60Hz on all 4 monitors.
I would still like to know whether the WX 5100 (or any other Radeon Pro card, for that matter) is an appropriate choice for driving 4x 4K@60Hz.