About half the time when I start my computer, my main monitor gets set to a resolution of 1440x900, even though its native resolution is 1920x1080. This has been happening for a very long time now, through multiple different Windows installations.
When the issue occurs, I can sometimes go into Radeon settings and it will automatically detect the correct resolution and fix it. Most of the time, though, it doesn't.
I have tried enabling EDID in the Radeon Additional Settings as well as fiddling with all the options to no avail.
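For reference, the native mode Windows should pick up comes from the first Detailed Timing Descriptor in the monitor's EDID (the 18-byte block at offset 54 of a standard 128-byte EDID). A minimal sketch of decoding it, using a generic 1920x1080 descriptor rather than an actual dump from either of my monitors:

```python
def parse_dtd(dtd: bytes):
    """Return (pixel_clock_khz, h_active, v_active) from one 18-byte
    EDID Detailed Timing Descriptor."""
    # Pixel clock is stored little-endian in units of 10 kHz.
    pixel_clock_khz = int.from_bytes(dtd[0:2], "little") * 10
    # Active width/height: low 8 bits in one byte, high 4 bits packed
    # into the upper nibble of a shared byte.
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return pixel_clock_khz, h_active, v_active

# Generic 1920x1080@60 descriptor (148.50 MHz pixel clock).
sample = bytes.fromhex("023a801871382d40582c450000000000001e")
print(parse_dtd(sample))  # (148500, 1920, 1080)
```

If the descriptor decodes to 1920x1080 but Windows still offers only lower modes, the EDID itself is probably being read correctly and the problem is further up the driver stack.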
Did this occur on older drivers?
What happens if you disconnect the monitor from the integrated GPU? Does that have any effect on the issue?
Installing your monitor's display drivers may fix the problem. ViewSonic didn't release drivers for the VA1912w, but Samsung released drivers for your 2333:
Still happening. I have all the latest drivers for both monitors.
My previous fix was to play around with the Catalyst Control Center display settings until it eventually ended up at my normal resolution. However, it looks like AMD removed CCC and those features (why!?).
I'm on the latest version of my graphics card's driver.
It only goes up to 1600x900 in Windows Display Properties. Note that this only happens about half the time I power my computer on. This time I fixed it by restarting my graphics driver; however, the issue always comes back after a few reboots.