I use a capture card for streaming. My main monitor can do 4K, while the games I play through the capture card max out at 1080p. Whenever I duplicate the displays so the card can pick up the feed, AMD's software for some idiotic reason ALWAYS sets the 1080p monitor as the default, and even if I flip the switch to make the OTHER monitor my default, the second I close the software it RESETS the switch and puts it back to the other setting.
What the hell is going on with this software, and why does it absolutely refuse to do what I tell it to?