I have a monitor that will run at either 3840x2160 @ 30 Hz or 1920x1080 @ 120 Hz. In Windows 8.1 it was relatively easy to use the 1080 resolution with GPU up-scaling off, so the 1080 feed was sent to the monitor as-is. But since upgrading to Windows 10, as far as I can tell this is completely impossible.

For starters, if I go into the appropriate tab in CCC and disable "GPU up-scaling", then go to the resolution tab and change it to HDTV 1080p @ 60 Hz, the signal actually sent to the TV is 3840x2160 @ 30, seemingly upscaled from 1080 to 2160 and now limited to 30 Hz. Extremely annoying.
Adding a custom 1920x1080 @ 120 Hz resolution via CRU (Custom Resolution Utility), which worked fine in Windows 8.1, now completely breaks the display and requires a reset.
When I launch a game and set it to 1080, this works, sort of. Sometimes it uses what the monitor calls "1080p", i.e. 1920x1080 @ 60 Hz, which looks okay, but there's no way to get 120 Hz that way. Other games use "1080i", which looks much worse; it's interlaced and possibly being scaled from some other resolution.
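In case it helps with diagnosis, below is a rough Python/ctypes sketch of one way to dump every 1920x1080 mode that Windows itself exposes for the primary display (refresh rate and interlaced flag included), independent of what CCC or the in-game menus show. It calls the standard Win32 EnumDisplaySettingsW function; the DEVMODEW layout is hand-declared here, so treat it as a quick diagnostic rather than anything polished.

```python
import ctypes
from ctypes import wintypes

DM_INTERLACED = 0x00000002  # dmDisplayFlags bit that marks interlaced modes

# Hand-declared DEVMODEW: only the display fields are read; the printer
# union is padded out so later offsets line up with the Win32 header.
class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmUnion", ctypes.c_byte * 16),     # position/orientation union, unused here
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmTail", wintypes.DWORD * 8),      # ICM/panning fields, unused here
    ]

user32 = ctypes.windll.user32

dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
i = 0
# Walk every mode the driver reports for the primary display and
# print only the 1920x1080 ones.
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
    if dm.dmPelsWidth == 1920 and dm.dmPelsHeight == 1080:
        kind = "interlaced" if dm.dmDisplayFlags & DM_INTERLACED else "progressive"
        print(f"1920x1080 @ {dm.dmDisplayFrequency} Hz, {dm.dmBitsPerPel}-bit, {kind}")
    i += 1
```

If 1920x1080 @ 120 Hz never shows up in that list after the CRU edit, then presumably the driver is rejecting the custom timing before Windows ever sees it, rather than Windows picking the wrong mode.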
I've tried uninstalling and reinstalling the display driver several times, both through Windows Update and via direct download. I haven't yet tried a fresh Windows install, but it may come to that.
Any advice would be appreciated.