Having an issue where the GPU settings were changed when I installed and tried to play a new game. This caused the monitor to display "Input Not Support". After connecting a TV to use as a monitor, I was able to see the screen again. I read through forums, and everything pointed toward a possible driver issue. I went through the uninstall, used DDU (Display Driver Uninstaller) to clean up any lingering files, and rebooted.
After the reboot, my original monitor worked again with the basic Windows display driver. I downloaded a new driver from the AMD site (AMD Radeon R9 380 for Windows 10) and installed it. Immediately after the install finished, my screen flickered and went back to "Input Not Support".
I have tried changing the resolution through the AMD settings program, which just points me back to the Windows display settings. That did not get me anywhere.
I should also note that I can connect any other device to that monitor and it works fine. The problem only happens with this computer, and only when the AMD drivers are installed.
Did the new game change the resolution on the monitor before starting to play? Which game were you playing?
What does Windows Settings > Display show?
Does Device Manager show any errors anywhere?
Are there any monitor OSD functions for changing the resolution on your monitor directly? What make and model of monitor do you have, and how is it connected to your GPU card: HDMI, DVI, DisplayPort, or VGA?
Try using an older AMD driver and see if it helps any.
Thanks for your prompt response.
The game (Star Wars: The Old Republic) has a login window, and then the game goes fullscreen after you enter credentials. The issue started when the game went fullscreen. I have seen that before, and it is usually a simple resolution change. This time, however, that didn't work.
Currently I have removed the AMD driver, and the monitor is working. The display settings for that monitor are at 1280 x 960 at 64 Hz. I am able to change the resolution without issue, but the refresh rate is grayed out in both cases (AMD driver uninstalled or installed). When the driver is installed, I have to go back to using the TV as a monitor.
Device Manager shows no errors for anything. This is true in both cases (AMD driver uninstalled or installed).
As far as I can tell, I cannot change the resolution or refresh rate on the monitor side. There is an OSD setup menu, but it mainly covers language, timeout, horizontal/vertical position, and transparency. It does change automatically based on the input device. I connected a PS4 to the monitor, and it went to a resolution of 1920 x 1280 with a refresh of 67 kHz horizontal, 60 Hz vertical. The monitor is an AOC I2367F (230LM00023), connected by HDMI at the PC and DVI at the monitor.
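For what it's worth, those OSD readings line up with a standard 1080p60 signal. Here is a quick sanity check using the standard CEA-861 timing totals for 1920x1080@60; these totals are assumed textbook values, not anything read from this monitor:

```python
# Do the OSD readings (67 kHz horizontal, 60 Hz vertical) match a
# standard 1080p60 signal? CEA-861 timing totals for 1920x1080@60
# (assumed standard values, not measured from the monitor):
H_TOTAL = 2200          # total pixels per line, including blanking
V_TOTAL = 1125          # total lines per frame, including blanking
PIXEL_CLOCK_HZ = 148_500_000

h_freq_khz = PIXEL_CLOCK_HZ / H_TOTAL / 1000      # line (horizontal) rate
v_freq_hz = PIXEL_CLOCK_HZ / (H_TOTAL * V_TOTAL)  # frame (vertical) rate

print(f"horizontal: {h_freq_khz:.1f} kHz, vertical: {v_freq_hz:.1f} Hz")
# prints "horizontal: 67.5 kHz, vertical: 60.0 Hz"
```

So the monitor itself is clearly happy accepting a normal 1080p60 signal from the PS4, which points back at what the PC is outputting.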
I will try an older driver, and let you know the results.
I installed the AMD drivers from the disc that came with the unit. I also found a DVI-to-DVI cable and connected the monitor, so I now have the TV and monitor duplicating the same display.
The old drivers are working, and the monitor is working at this time. When I disconnected the TV (monitor still connected DVI-to-DVI), the monitor immediately went to "Input Not Support". Just for kicks, I connected the monitor HDMI-to-DVI again, and the screen still displayed "Input Not Support".
Not sure what to do next....
Sounds like your HDMI-to-DVI cable is not compatible with the GPU and monitor. I have read that using an "active" HDMI-to-DVI adapter, rather than a plain cable, is compatible with most monitors and GPU cards. They are not cheap, though.
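As a rough check on whether raw bandwidth could be the problem: single-link DVI tops out around a 165 MHz pixel clock. Comparing the modes mentioned in this thread against that ceiling (the timing totals below are standard CEA-861/VESA DMT figures, assumed rather than measured):

```python
# Single-link DVI carries at most roughly a 165 MHz pixel clock.
# Estimate the pixel clock for each mode as h_total * v_total * refresh,
# using standard published timing totals (assumed, not measured).
SINGLE_LINK_DVI_MHZ = 165.0

modes = {
    # name: (h_total, v_total, refresh_hz)
    "1920x1080@60 (CEA-861)": (2200, 1125, 60),
    "1280x960@60 (VESA DMT)": (1800, 1000, 60),
}

results = {}
for name, (h_total, v_total, hz) in modes.items():
    clock_mhz = h_total * v_total * hz / 1e6
    results[name] = clock_mhz
    verdict = "fits" if clock_mhz <= SINGLE_LINK_DVI_MHZ else "needs dual link"
    print(f"{name}: {clock_mhz:.1f} MHz -> {verdict}")
```

Both modes come in well under 165 MHz, so the cable's bandwidth alone shouldn't rule out these resolutions; a compatibility or handshake issue is more likely than a capacity one.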
In Windows Settings > Display > Multiple displays, how do you have your displays set up? Try using monitor number one (your computer monitor) as the main monitor and see if you continue to have problems. If you do, disconnect the TV and set up the display for just the computer monitor.
I also have a TV connected alongside my computer monitor, but I am not using an adapter; it is straight HDMI-to-HDMI for both. Under Windows Settings > Display > Multiple displays, I have it set to "Duplicate these displays".
The HDMI-to-DVI cable had been working for several years up until I started that game, so it is hard to believe the issue is with the cable.
For whatever reason, I cannot set the monitor to be display number one. I can, however, set it as the main monitor. Everything works fine until I unplug the TV connection, at which point it displays "Input Not Support" again.
I cannot set up the monitor to display without the TV connected, because I cannot see anything unless I uninstall the AMD drivers again. The monitor only has DVI and VGA inputs, so I cannot connect over HDMI without an adapter. It is currently connected DVI-to-DVI, though.
I tried all the modes in the display settings (extend, duplicate, show only on 1 or 2), and everything works fine until the TV connection is unplugged.
When I right click the desktop and try to open the AMD Catalyst Control Center, I get the following error message:
"Windows cannot find 'cli'. Make sure you typed the name correctly, and then try again."
Device Manager shows no issues with anything. I can try uninstalling the drivers again, but I have done that so many times now that I don't know if it will make a difference.
I just rebooted the computer, and the AMD software/drivers had updated themselves. I was not able to open the setting to turn off automatic updates. It has gone from the Catalyst version to Adrenalin. Also, the Catalyst Control Center tab under Display > Additional settings is gone.
This is an older version of Adrenalin, though, because it keeps asking me to update. I am able to open additional settings from the Adrenalin interface, but it only gives me options to set up Eyefinity displays. No resolution or refresh rate changes can be made within the AMD software; it keeps referring me back to the Windows display settings.