

bigtoe
Adept I

GPU Scaling: Please Let Us Choose the Interpolation Method

The Problem

Non-native resolutions appear blurry on LCD panels, even when the resolution divides perfectly into the monitor's native one. AMD's 'GPU Scaling' option seems to use bilinear interpolation, or something similar, which creates this effect. Why not turn off 'GPU Scaling'? Well, then we are at the mercy of our monitors' built-in scalers, and in my experience these all seem to use the same sort of interpolation, too. I was surprised and disappointed to find that my 4K monitor turned 1080p into an ugly mess, even though its native resolution is exactly twice 1080p in each dimension (four times the pixel count).

The Solution (Pretty Please, AMD?)

Allow us to choose nearest-neighbour interpolation for a crisp, pixel-perfect image at compatible resolutions (e.g. displaying a 1920x1080 image on a 3840x2160 panel, or a 1280x720 image on a 2560x1440 panel). The drivers could revert to bilinear (or whatever) automatically for resolutions that don't fit nicely (720p on a 1080p display, for example), but they don't even have to do that. Hell, I'd be happy with an option I have to change manually depending on my usage; anything to avoid the horrible blurriness we get at the moment. Nearest-neighbour interpolation is the simplest form of image scaling there is: at perfectly divisible resolutions, a single source pixel is simply represented by four pixels, or nine, or sixteen, and so on.
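
To make that concrete, here's a minimal sketch of what integer-ratio nearest-neighbour upscaling amounts to (Python/NumPy purely for illustration; the function name is mine, and the real thing would live in the driver or display scaler):

    import numpy as np

    def nearest_neighbour_upscale(img: np.ndarray, factor: int) -> np.ndarray:
        """Upscale by an integer factor by duplicating each pixel.

        A factor of 2 turns every source pixel into a 2x2 block (four pixels),
        a factor of 3 into a 3x3 block (nine), and so on. No new colour values
        are invented, so nothing gets smoothed or blurred.
        """
        return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

    # 1920x1080 -> 3840x2160: every source pixel becomes a 2x2 block.
    frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
    assert nearest_neighbour_upscale(frame, 2).shape == (2160, 3840, 3)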

Why Does This Matter? Why Not Just Run Everything at Native Resolution?

There are still many games with a fixed resolution of 720p or 1080p, or even 540p - mainly 2D indie titles and classics - and there are games that don't support higher resolutions, or that exhibit problems such as tiny interfaces or text when run at them. One of my all-time favourite games, FTL, runs at 720p only. I have the option of running it pixel-perfect in a window - which is tiny - or fullscreen, which is blurry. Most 'HD' video content is 720p or 1080p too, although perhaps the difference wouldn't be as noticeable as with games.

There are also users who run high-resolution monitors to take advantage of the extra desktop space and crisper image but struggle to run games well at native resolution. I personally have a 4K monitor for a nicer desktop experience (and 4K video content), but find that my R9 390 struggles at native resolution in graphically demanding games. GTA 5 runs fantastically at 1080p but looks blurry due to the filtering the GPU drivers apply. At 4K it's pixel-perfect, but performance is unacceptable. Dying Light is the same story. And Crysis 3, and so on. I'd need to fill my rig with GPUs to get decent framerates without compromises. Putting cost aside for a moment, I don't actually want to line my case with multiple graphics cards; the whole thing seems so inelegant and wasteful.

Let's look at some of the resolutions that could all appear perfectly crisp if nearest-neighbour were used; the simple rule behind these pairings is sketched in code after the list. Take note of the large selection that 4K monitors would have access to. Also, 5K monitors, while not a realistic proposition for a lot of people at the moment, would have the bonus of crisp gaming at 1440p (think of the horsepower you'd need to game at native res!).

1080p Monitors

Full HD: 1920x1080 (native)

qHD: 960x540

nHD: 640x360

1440p Monitors

QHD: 2560x1440 (native)

HD: 1280x720

nHD: 640x360

4K Monitors

UHD: 3840x2160 (native)

Full HD: 1920x1080

HD: 1280x720

qHD: 960x540

nHD: 640x360

5K Monitors

UHD+: 5120x2880 (native)

QHD: 2560x1440

HD: 1280x720

WSVGA: 1024x576

nHD: 640x360
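
The rule behind all of these pairings is just divisibility: a source resolution qualifies when the native width and height are both the same integer multiple of it. A quick sketch (function name mine, for illustration):

    def integer_scale_factor(native, source):
        """Return the integer scale factor if `source` fits `native` exactly,
        otherwise None. Arguments are (width, height) tuples."""
        nw, nh = native
        sw, sh = source
        if nw % sw == 0 and nh % sh == 0 and nw // sw == nh // sh:
            return nw // sw
        return None

    # 4K examples from the list above:
    integer_scale_factor((3840, 2160), (1920, 1080))  # 2 -> crisp
    integer_scale_factor((3840, 2160), (1280, 720))   # 3 -> crisp
    integer_scale_factor((3840, 2160), (2560, 1440))  # None -> needs filtering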

waltc
Miniboss

Re: GPU Scaling: Please Let Us Choose the Interpolation Method

That's the problem with all 4K and 5K monitors, essentially. They should only be purchased if the user plans to actually use the native res more than any lower res. Really, the same is true of all post-CRT monitors: native res is always best. Unless you are running dual GPUs you probably don't have the power to push games at your 4K native resolution, which is, of course, the solution. I'd expect that within the year single GPUs will be strong enough to push most games at 4K.

Also, some current monitors simply look much better than others when scaling non-native resolutions. It has to do with pixel placement, dot pitch, panel type, and so on. Some 4K monitors are, more or less, simply two 1920x1080 panels in the same bezel. A lot depends on the particular monitor being used, in other words.

bigtoe
Adept I

Re: GPU Scaling: Please Let Us Choose the Interpolation Method

I personally don't feel that 1080p monitors cut it for desktop use any more (single displays, anyway). After using 1080p displays that fit in my pocket for at least the last two years, I was starting to wonder why I was putting up with the same number of pixels on my comparatively huge 24" monitor. In my case, I spend more time in the desktop environment than in games, and I appreciate the clarity in text that the higher PPI of a 4K display gives me. Web pages are much nicer to read now, and the extra desktop space is fantastic; I no longer have to mess about with zoom settings to read two pages side by side, and with Windows 10 I can even snap a window to each corner of the screen, each one effectively getting its own 1080p worth of pixels.

I agree that native resolution is always best. However, other resolutions could look as crisp as native if they were handled correctly (albeit with reduced detail, since less information is visible). GPU scaling currently ruins things with its smoothing algorithm. If it did not apply this filter and simply multiplied the pixels out instead, the picture would not be blurry at all (provided the source resolution divides exactly into the native resolution). In fact, scaling a 1080p image to 4K in this manner would likely look cleaner than a native 1080p panel, as the subpixels would be harder to see. One "1080p pixel" would simply be represented by four "4K pixels"; no fancy algorithms to introduce artifacts into the image.
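
A toy one-dimensional example of my own (not how any driver actually implements its filtering) shows where the blur comes from. Upscale a hard black-to-white edge by 2x:

    edge = [0, 0, 255, 255]  # one row of pixels: a hard black-to-white edge

    # Pixel duplication (nearest-neighbour): the edge stays hard.
    nearest = [p for p in edge for _ in range(2)]
    # -> [0, 0, 0, 0, 255, 255, 255, 255]

    # Linear interpolation: the new samples land between source pixels,
    # producing the intermediate greys that read as blur on screen.
    def linear_2x(src):
        out = []
        for i, p in enumerate(src):
            nxt = src[min(i + 1, len(src) - 1)]
            out += [p, (p + nxt) // 2]
        return out

    linear_2x(edge)  # -> [0, 0, 0, 127, 255, 255, 255, 255]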

Nearest-neighbour interpolation should be extremely easy to implement, and it would make viewing lower resolutions a much nicer experience on high-resolution panels. Some panels may indeed do a better job of upscaling than others, but this would work on any model regardless. As I mentioned before, it's not just about having the power to push lots of pixels at native resolution; there is plenty of content out there with fixed resolutions.

quppa
Adept I

Re: GPU Scaling: Please Let Us Choose the Interpolation Method

Just registered to add my support for nearest-neighbour scaling. There's a longer thread on NVIDIA's forums and one at HardForum that includes some good photographs.
I'm keen to buy a 4K monitor, but I can't bring myself to do so until there is a way to display 1920x1080 and 1280x720 pixel resolutions without filtering. The first company to add support for this to their drivers (or the first monitor manufacturer to have a configurable scaler) will definitely get my money.

okarin
Adept I

Re: GPU Scaling: Please Let Us Choose the Interpolation Method

Same here. Registered just to support OP.

Can we at least get an answer from AMD tech staff: is it possible to implement such a simple feature (nearest-neighbour scaling) at the driver level (with or without performance loss), or must the hardware also support it? Does AMD have any plans to implement this in the near future?

There is definitely no need for fancy interpolation algorithms (bilinear, bicubic, etc.) when upscaling 1080p to 4K. Just copy each pixel (1 to 4) and the picture will be perfect: as crisp and solid as on a 1080p display, or even better. I just bought a 24" UHD monitor and am ready to buy a top gaming video card (Radeon or otherwise) that supports selecting the scaling method. While it's possible to play all recent games on something like a Fury X, there are still many 5+ year old games (and even some new ones) that support 3840x2160 but lag (due to engine limitations) or can't properly scale UI elements (HUD, chat, buttons, text fields) and so are just unplayable at that resolution.

quppa
Adept I

Re: GPU Scaling: Please Let Us Choose the Interpolation Method

Meanwhile, on the NVIDIA forums there's finally been some acknowledgement that they're looking into adding this feature. I'd be happy for AMD to beat them to the punch, of course.

bigtoe
Adept I

Re: GPU Scaling: Please Let Us Choose the Interpolation Method

Strangely enough, I had just discovered the nVidia post myself and decided to check in on this one to see if it had stirred anything up. I'm happy to find a few people who understand the need for this.

If nVidia listen to their customers and implement this overdue option, I would gladly swap my R9 390 out in an instant.

mt_
Adept II

Re: GPU Scaling: Please Let Us Choose the Interpolation Method

Unless you are running dual GPUs you probably don't have the power to push games at your 4K native resolution, which is, of course, the solution. I'd expect that within the year single GPUs will be strong enough to push most games at 4K.

That's just a stereotype detached from reality. There is software (not just games) that does not support high-DPI modes at all. For example, the latest versions of most professional music DAWs, such as Steinberg Cubase, Cakewalk Sonar, Ableton Live, Avid Pro Tools, and Cockos Reaper, are either blurry (by default at native 4K with OS DPI scaling, or when a lower-than-physical resolution such as Full HD is set at the OS level, so that either the GPU's or the monitor's own blurry scaling is applied) or have a tiny GUI (when DPI scaling is forcibly disabled in the exe properties while the OS runs at the monitor's native resolution). This is absolutely NOT solvable by purchasing a more performant graphics card.

mt_
Adept II

Re: GPU Scaling: Please Let Us Choose the Interpolation Method

I would say a similar thing, but in the opposite direction: I would replace my existing nVidia card with an AMD one if AMD supported blur-free integer-ratio scaling.

mt_
Adept II

Re: GPU Scaling: Please Let Us Choose the Interpolation Method

is it possible to implement such a simple feature (nearest-neighbour scaling) at the driver level (with or without performance loss)

It is undoubtedly possible. For example, the nVidia driver already has the Dynamic Super Resolution (DSR) feature, which in essence renders at a larger-than-physical resolution and then downscales to the physical resolution with a smoothing filter applied.
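
As a rough illustration of the downscaling half of that (my own sketch; the driver's actual filter is certainly more sophisticated than a plain box average):

    import numpy as np

    def downscale_2x_box(img: np.ndarray) -> np.ndarray:
        """Average each 2x2 block into one output pixel - a simple smoothing
        downscale standing in for the filter a DSR-style feature applies."""
        h, w, c = img.shape
        blocks = img.reshape(h // 2, 2, w // 2, 2, c)
        return blocks.mean(axis=(1, 3)).astype(img.dtype)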

Blur-free scaling by pixel duplication would be even simpler to implement; as a professional web developer and hobbyist programmer, I suspect it would be much easier than DSR.