
GPU Scaling: Please Let Us Choose the Interpolation Method

Question asked by bigtoe on Jan 17, 2016
Latest reply on Jan 14, 2018 by eugen_m

The Problem

Non-native resolutions appear blurry on LCD panels, even when the resolution divides perfectly into the monitor's native resolution. AMD's 'GPU Scaling' option seems to use bilinear interpolation, or something similar, which causes this blurring. Why not turn off 'GPU Scaling'? Well, then we're at the mercy of our monitors, which in my experience all seem to use the same kind of interpolation anyway. I was surprised and disappointed to find that my 4K monitor turned 1080p into an ugly mess, even though the native resolution has exactly four times the pixels of 1080p (twice the width and twice the height).

 

The Solution (Pretty Please, AMD?)

Allow us to choose nearest-neighbour interpolation for a crisp, pixel-perfect image at compatible resolutions (e.g. displaying a 1920x1080 image on a 3840x2160 panel, or a 1280x720 image on a 2560x1440 panel). The drivers could revert to bilinear (or whatever) automatically for resolutions that don't fit nicely (720p on a 1080p display, for example), but they don't even have to do that. Hell, I'd be happy with an option I have to change manually depending on my usage; anything to avoid the horrible blurriness we get at the moment. Nearest-neighbour interpolation is the least complicated form of image scaling there is: for perfectly-divisible resolutions, each source pixel is simply represented by a block of four pixels, or nine, or sixteen, and so on.
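
To make that concrete, here's a rough Python sketch of what integer nearest-neighbour upscaling boils down to. The function name and the list-of-pixels representation are purely for illustration; this isn't anything AMD's drivers actually do, just the idea behind the request:

```python
# A rough sketch of integer nearest-neighbour upscaling: every source pixel is
# repeated as a factor x factor block, so no new colour values are invented and
# edges stay crisp. Pure Python, for illustration only.

def nearest_neighbour_upscale(pixels, src_w, src_h, factor):
    """pixels: row-major list of length src_w * src_h.
    factor: integer scale (2 for 1080p -> 4K, 3 for 720p -> 4K, ...)."""
    dst_w, dst_h = src_w * factor, src_h * factor
    out = []
    for dy in range(dst_h):
        sy = dy // factor          # source row this output row copies
        for dx in range(dst_w):
            sx = dx // factor      # source column this output pixel copies
            out.append(pixels[sy * src_w + sx])
    return out

# A 2x2 'image' scaled by 2 becomes 4x4; each pixel turns into a 2x2 block.
tiny = ['A', 'B',
        'C', 'D']
print(nearest_neighbour_upscale(tiny, 2, 2, 2))
# ['A', 'A', 'B', 'B',
#  'A', 'A', 'B', 'B',
#  'C', 'C', 'D', 'D',
#  'C', 'C', 'D', 'D']
```

No blending between neighbouring pixels ever happens, which is exactly why the result stays sharp at integer scale factors.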

 

Why Does This Matter? Why Not Just Run Everything at Native Resolution?

There are still many games with a fixed resolution of 540p, 720p or 1080p (mainly 2D indie titles and classics), and there are games that either don't support higher resolutions or exhibit problems such as tiny interfaces and text when running at them. One of my all-time favourite games, FTL, runs at 720p only. My options are running it pixel-perfect in a window, which is tiny, or fullscreen, which is blurry. Most 'HD' video content is 720p or 1080p too, although there the difference probably wouldn't be as noticeable as with games.

 

There are also users who run high-resolution monitors to take advantage of more desktop space and a crisper image, but who struggle to run games well at native resolution. I personally have a 4K monitor for a nicer desktop experience (and 4K video content), but find that my R9 390 struggles at native resolution in graphically demanding games. GTA 5 runs fantastically at 1080p but looks blurry due to the filtering method the GPU drivers employ. At 4K it's pixel perfect, but the performance is unacceptable. Dying Light is the same story. And Crysis 3, and so on. I'd need to fill my rig with GPUs to get decent framerates without compromises. Putting cost aside for a moment, I don't actually want to line my case with multiple graphics cards; the whole thing seems so inelegant and wasteful.

 

Let's look at some of the resolutions that would appear perfectly crisp if nearest-neighbour scaling were used (a quick way to check whether a given resolution qualifies is sketched after the lists below). Take note of the large selection that 4K monitors would have access to. 5K monitors, while not a realistic proposition for most people at the moment, would have the bonus of crisp gaming at 1440p (think of the horsepower you'd need to game at native res!).

 

1080p Monitors

Full HD: 1920x1080 (native)

qHD: 960x540

nHD: 640x360

 

1440p Monitors

QHD: 2560x1440 (native)

HD: 1280x720

nHD: 640x360

 

4K Monitors

UHD: 3840x2160 (native)

Full HD: 1920x1080

HD: 1280x720

qHD: 960x540

nHD: 640x360

 

5K Monitors

UHD+: 5120x2880 (native)

QHD: 2560x1440

HD: 1280x720

WSVGA: 1024x576

nHD: 640x360
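
The rule behind these lists is simply that the native width and height must both be exact multiples of the source resolution, by the same factor. Here's a rough Python sketch of that check (the function name is mine, just for illustration):

```python
# The check behind the lists above: a source resolution scales pixel-perfectly
# when the native width and height are both exact multiples of it, by the same
# integer factor. Anything else would need bilinear (or similar) as a fallback.

def integer_scale_factor(native, source):
    """Return the integer scale factor, or None if 'source' doesn't divide
    'native' evenly in both dimensions."""
    nw, nh = native
    sw, sh = source
    if nw % sw == 0 and nh % sh == 0 and nw // sw == nh // sh:
        return nw // sw
    return None

native_4k = (3840, 2160)
for name, source in [("Full HD", (1920, 1080)),
                     ("HD", (1280, 720)),
                     ("qHD", (960, 540)),
                     ("nHD", (640, 360)),
                     ("WSVGA", (1024, 576))]:
    factor = integer_scale_factor(native_4k, source)
    if factor:
        print(f"{name}: pixel-perfect at {factor}x on a 4K panel")
    else:
        print(f"{name}: no integer fit on a 4K panel")
```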
