
Multi-monitor support broken after Windows 10 upgrade

Question asked by shockz on Aug 6, 2015
Latest reply on Aug 8, 2015 by shockz

Hey all, I saw a couple of people having similar issues when I browsed around the forums, but here's my personal experience.


I have an unorthodox multi-monitor setup on my gaming PC: two old 1600x1200 Dell flat panels (connected via DVI) on either side of a 1920x1080 Asus flat panel (connected via HDMI). It was already kind of finicky getting it all working properly with my R9 280 on Windows 7, but eventually it worked fine.


And then I upgraded to Windows 10 and of course it all broke.


I did a clean installation of Windows 10 and, once it was done, downloaded the newest, shiniest drivers AMD had to offer (15.7.1). When I rebooted after the driver installation, I found that my two old monitors were nonfunctional; only the HDMI monitor was showing a picture at all. Catalyst Control Center detects the other two monitors but keeps them disabled. When I try to manually enable one of them in CCC, by right-clicking it on the Creating and Arranging Desktops menu and selecting Duplicate, Extend, or Replace (yes, I tried all three), the CCC window flickers for a second and then does absolutely nothing. (As if it's going "Hmm...nah." I don't like it when it feels like my computer is mocking me. >_>)


If I go through a convoluted process of turning down the resolution on the HDMI monitor to something the other two can support, unplugging it, and rebooting, the other two monitors work fine. If, after doing so, I plug the HDMI monitor back in, CCC switches one of the DVI monitors off and switches the HDMI monitor back on. (To be clear: the HDMI monitor plus one DVI monitor are working, and the other DVI monitor goes black.) If I then try to re-enable the second DVI monitor, CCC tells me I must disable one of the other two first. So no matter what I do, my expensive video card gives me a maximum of two monitors.
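For anyone trying to reproduce or reason about this, the behavior above boils down to a simple constraint: no matter which monitor I try to enable, never more than two are lit at once. Here's a little toy model of that observed behavior (my assumption about what the driver is effectively doing, not AMD's actual logic; the display names are just placeholders):

```python
# Toy model of the enable/disable behavior I'm seeing: the driver acts
# as if at most two displays can be active at once, so enabling a third
# is refused until one of the other two is disabled first.

MAX_ACTIVE = 2  # observed limit, not a documented spec


class DisplayConfig:
    def __init__(self):
        self.active = []

    def enable(self, name):
        """Try to enable a display; returns False if the limit is hit."""
        if name in self.active:
            return True
        if len(self.active) >= MAX_ACTIVE:
            return False  # CCC: "disable one of the other two first"
        self.active.append(name)
        return True

    def disable(self, name):
        if name in self.active:
            self.active.remove(name)


cfg = DisplayConfig()
cfg.enable("DVI-left")
cfg.enable("DVI-right")      # both DVI monitors work (HDMI unplugged)
ok = cfg.enable("HDMI")      # trying to light the third display fails
print(cfg.active, ok)
```

Running this mirrors my experience exactly: the first two enables succeed, and the third is rejected regardless of which monitor it is.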


And if I reboot again, after going through that whole process, it goes back to only the HDMI monitor working and CCC laughing at me if I try to enable one of the other two.


I know AMD would rather I buy an expensive DisplayPort adapter or something, because for some reason they're under the impression that DisplayPort is popular, but I'd really rather my setup just work the way it used to under Windows 7. Please?