At integer scaling ratios, full-screen upscaling should be done simply by duplicating pixels, with no blur at all. For example, a Full HD (1920×1080) image could be displayed on a 4K display (3840×2160) with no blur, just by displaying each image pixel as a group of exactly 4 (2×2) identical physical pixels with no interpixel diffusion whatsoever.
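To make the pixel-duplication idea concrete, here is a minimal sketch (NumPy is used purely for illustration; the tiny sample image and the factor of 2 are made-up values, not anything a driver actually exposes):

```python
import numpy as np

def duplicate_pixels(image: np.ndarray, factor: int) -> np.ndarray:
    """Upscale by an exact integer factor: each source pixel becomes a
    factor x factor block of identical values, with no blending at all."""
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

src = np.array([[1, 2],
                [3, 4]])
dst = duplicate_pixels(src, 2)
# Each source pixel maps to a 2x2 block of identical output pixels:
# [[1 1 2 2]
#  [1 1 2 2]
#  [3 3 4 4]
#  [3 3 4 4]]
```

Applied to a 1920×1080 frame with factor 2, this yields exactly 3840×2160 output pixels, which is why no interpolation is needed at integer ratios.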
By contrast, full-screen upscaling via the graphics driver is currently blurry even when the scaling ratio is an integer (e.g. 2x, 3x, 4x). For FHD (1920×1080) and HD (1280×720) images on 4K (3840×2160) monitors at 24-27-inch sizes, where individual image pixels are almost indistinguishable, such blur unreasonably decreases perceptible sharpness without adding anything useful.
There should be a driver option to disable blur or at least to switch between bilinear/bicubic and nearest-neighbour interpolation.
For better understanding of what nonblurry integer-ratio scaling is, please see the demo. Thanks.
See also the “Nonblurry integer-ratio scaling” article, which attempts to explain the blur issue and to collect and summarize all the important information about it, with nonblurry integer-ratio scaling by pixel duplication as the solution.
See also a corresponding petition on Change.org.
Update (2017-07-06): The feature is now supported by the NVIDIA GeForce driver 384.47 (Beta) for Linux via the “Nearest” transform filter.
It would be great if AMD could add this as a driver feature. Native resolutions always look best, but with pixel-perfect scaling, resolutions that are an integer factor of the native resolution could look crisp too.
Being able to display 1080p and 720p resolutions on a 4K monitor without horrible, blurry interpolation would be fantastic.
I've been waiting for this feature from AMD or Nvidia for four years. It's really such a simple thing to implement and it would have huge benefits! With new ultra-high-resolution displays, and with many people asking for it on both sides of the GPU war, it's essential and should be a top priority IMHO. Whoever implements this first, I will jump on their bandwagon, simple as that.
This is my position - my next GPU purchase (I'm in the market) will come down to whichever company implements this feature first.
I’ve discovered that scaling for non-DPI-aware applications in Windows 10 works exactly as we need full-screen scaling to work.
So, at some point after Windows 7, Microsoft figured out that HiDPI displays were getting popular and that blur is unreasonable at integer Windows-zoom ratios, and implemented a proper new scaling algorithm in Windows 10.
So now, at least for Windows 10+ users, nonblurry full-screen scaling seems to be needed only for games. That is still important, but the overall situation is now much better, and Microsoft has set a good example of how things should be done.
“you have to notify AMD of the issue”
Thank you for your contribution. It is not enough for us just to notify AMD of the issue (I’m sure they are aware of it). What we need is a public official reply, as well as the ability for other users to find the thread and express their opinions in one place.
I'd like this to become a thing too, regardless of operating system.
Also, that's a nice image in the OP showing the difference.
Back in the old days, I had an IBM laptop with a BIOS setting for scaling full-screen apps, which allowed for a letterbox or even a centered square with a black frame if the app resolution was lower than the screen resolution. It had an ATI Rage GPU, if I recall correctly.
Among newer devices, the PS Vita displays PSP games at 2x pixel size, for pixel-perfect emulation of older-generation games.
AMD has VSR, so there's no reason not to allow us a reverse VSR for this, where the GPU outputs the full panel resolution and scales the image up without blur. Virtual Low Resolution, or perhaps Virtual Pixel-Perfect Resolution? Wouldn't it be easiest to "just add" resolution options to already-present functionality?