34 Replies Latest reply on Aug 27, 2017 2:03 PM by bigtoe

    GPU Scaling: Please Let Us Choose the Interpolation Method

    bigtoe

      The Problem

      Non-native resolutions appear blurry on LCD panels, even when the resolution divides perfectly into the monitor's native res. AMD's 'GPU Scaling' option seems to use bilinear interpolation, or something similar, which creates this effect. Why not turn off 'GPU Scaling'? Well, then we are at the mercy of our monitors, and in my experience, these all seem to use the same method for interpolation, too. I was surprised and disappointed to find that my 4K monitor turned 1080p into an ugly mess, even though the native resolution is exactly four times that of 1080p.

       

      The Solution (Pretty Please, AMD?)

      Allow us to choose nearest-neighbour interpolation for a crisp, pixel-perfect image at compatible resolutions (e.g. displaying a 1920x1080 image on a 3840x2160 panel, or a 1280x720 image on a 2560x1440 panel). The drivers could revert to bilinear (or whatever) automatically for those resolutions that don't divide evenly (720p on a 1080p display, for example), but it doesn't even have to do that. Hell, I'd be happy with an option I have to change manually depending on my usage; anything to avoid the horrible blurriness we get at the moment. Nearest-neighbour interpolation is the least complicated form of image scaling there is; for perfectly divisible resolutions, a single source pixel is simply represented by a block of four, nine, or sixteen identical pixels, and so on.
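      To make that concrete, here is a rough sketch of what pixel repetition amounts to (purely illustrative TypeScript, obviously not how or where the drivers would actually do it):

      ```typescript
      // Illustration only: integer-ratio nearest-neighbour upscaling of an RGBA buffer.
      // "factor" is the integer scale factor, e.g. 2 for 1920x1080 -> 3840x2160.
      function upscaleByPixelRepetition(
        src: Uint8ClampedArray,  // source pixels, RGBA, srcWidth * srcHeight * 4 bytes
        srcWidth: number,
        srcHeight: number,
        factor: number
      ): Uint8ClampedArray {
        const dstWidth = srcWidth * factor;
        const dstHeight = srcHeight * factor;
        const dst = new Uint8ClampedArray(dstWidth * dstHeight * 4);
        for (let y = 0; y < dstHeight; y++) {
          const srcY = Math.floor(y / factor);    // source row this output row copies
          for (let x = 0; x < dstWidth; x++) {
            const srcX = Math.floor(x / factor);  // source column this output column copies
            const s = (srcY * srcWidth + srcX) * 4;
            const d = (y * dstWidth + x) * 4;
            dst[d] = src[s];                      // R
            dst[d + 1] = src[s + 1];              // G
            dst[d + 2] = src[s + 2];              // B
            dst[d + 3] = src[s + 3];              // A
          }
        }
        return dst;
      }
      ```

      With an integer factor, the output is just each source pixel repeated factor × factor times, so no new colours (and no blur) are ever introduced.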

       

      Why Does This Matter? Why Not Just Run Everything at Native Resolution?

      There are still many games with a fixed resolution of 720p, 1080p, or even 540p (mainly 2D indie titles and classics), and there are games that don't support higher resolutions, or that exhibit problems such as tiny interfaces or text when running at them. One of my all-time favourite games, FTL, runs at 720p only. I have the option of running it pixel perfect in a window - which is tiny - or fullscreen, which is blurry. Most 'HD' video content has a 720p or 1080p resolution, too, although perhaps the difference wouldn't be as noticeable as with games.

       

      There are also users who run high-resolution monitors to take advantage of more desktop space and a crisper image but struggle to run games well at native resolution. I personally have a 4K monitor for a nicer desktop experience (and 4K video content), but find that my R9 390 struggles at native resolution in graphically-demanding games. GTA 5 runs fantastically at 1080p but looks blurry due to the filtering method employed by the GPU drivers. At 4K, it's pixel perfect but performance is unacceptable. Dying Light is the same story. And Crysis 3, and so on. I'd need to fill my rig with GPUs to get decent framerates without compromises. Putting cost aside for a moment, I don't actually want to line my case with multiple graphics cards; the whole thing seems so inelegant and wasteful.

       

      Let's look at some of the resolutions that could all appear perfectly crisp if nearest-neighbour were used. Take note of the large selection that 4K monitors would have access to. Also, 5K monitors, while not a realistic proposition for a lot of people at the moment, could have the bonus of crisp gaming at 1440p (think of the horsepower you'd need to game at native res!).

       

      1080p Monitors

      Full HD: 1920x1080 (native)

      qHD: 960x540

      nHD: 640x360

       

      1440p Monitors

      QHD: 2560x1440 (native)

      HD: 1280x720

      nHD: 640x360

       

      4K Monitors

      UHD: 3840x2160 (native)

      Full HD: 1920x1080

      HD: 1280x720

      qHD: 960x540

      nHD: 640x360

       

      5K Monitors

      UHD+: 5120x2880 (native)

      QHD: 2560x1440

      HD: 1280x720

      WSVGA: 1024x576

      nHD: 640x360

        • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
          waltc

          That's the problem with all 4k & 5k monitors, essentially.  They should only be purchased if the user plans to actually use the native res more than any other lower res.  Really, the same is true of all post-CRT monitors--native res is always the best.  Unless you are running dual gpus you probably don't have the power to push games at your 4k native resolution--which is, of course, the solution.  I'd expect within the year single gpus will be strong enough to push most games @4k resolutions.

           

          Also, some current monitors simply look much better than others when scaling down.  It has to do with pixel placement and dot pitch, panel type, etc.  Some 4k monitors are simply 2 1920x1080 monitors in the same bezel, more or less, etc.  A lot has to do with the particular monitor being used, in other words.

            • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
              bigtoe

              I personally don't feel that 1080p monitors cut it for desktop use any more (single displays, anyway), and after using 1080p displays that fit in my pocket for the last two years at least, I was starting to wonder why I was putting up with having the same number of pixels on my comparatively huge 24" monitor. In my case, I spend more time in the desktop environment than in games, and I appreciate the clarity in text that the higher PPI of a 4K display gives me. Web pages are much nicer to read now, and the extra desktop space is fantastic; I no longer have to mess about with zoom settings to read two pages side by side, and with Windows 10 I can even snap a window to each corner of the screen; each one effectively having its own 1080p of pixels.

               

              I agree that native resolution is always the best. However, other resolutions could look as crisp as native if they were handled correctly (albeit with reduced detail, since there is simply less image information). GPU scaling currently ruins things with its smoothing algorithm. If it did not apply this filter and simply multiplied the pixels out instead, the picture would not be blurry at all (provided the source resolution divides exactly into the native resolution). In fact, scaling a 1080p image to 4K in this manner would likely look cleaner than a native 1080p panel, as the subpixels would be more difficult to see. One "1080p pixel" would simply be represented by four "4K pixels"; no fancy algorithms to introduce artifacts into the image.

               

              Nearest-neighbour interpolation should be extremely easy to implement, and it would make viewing lower resolutions a much nicer experience on high-resolution panels. Some panels may indeed do a better job of upscaling than others, but this would work on any model regardless. As I mentioned before, it's not just about having the power to push lots of pixels at native resolution; there is plenty of content out there that has fixed resolutions.

              • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
                mt_

                Unless you are running dual gpus you probably don't have the power to push games at your 4k native resolution--which is, of course, the solution. I'd expect within the year single gpus will be strong enough to push most games @4k resolutions.

                That's just a stereotype detached from reality. There is software (not just games) that does not support high-DPI modes at all. For example, the latest versions of most professional music DAW software, such as Steinberg Cubase, Cakewalk Sonar, Ableton Live, Avid Pro Tools and Cockos Reaper, are either blurry (by default at native 4K resolution with OS DPI scaling, or when a lower-than-physical resolution such as Full HD is set at the OS level, in which case either the GPU's or the monitor's own blurry scaling algorithm is applied) or have a tiny GUI (when DPI scaling is forcibly disabled in the exe properties while the OS runs at the monitor's native resolution). This is absolutely NOT solvable by purchasing a more performant graphics card.

              • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
                quppa

                Just registered to add my support for nearest-neighbour scaling. There's a longer thread on NVIDIA's forums and one over at [H]ardForum that includes some good photographs.


                I'm keen to buy a 4K monitor, but I can't bring myself to do so until there is a way to display 1920x1080 and 1280x720 pixel resolutions without filtering. The first company to add support for this to their drivers (or the first monitor manufacturer to have a configurable scaler) will definitely get my money.

                • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
                  okarin

                  Same here. Registered just to support OP.

                  Can we at least get an answer from AMD tech staff: is it possible to implement such a simple feature (nearest-neighbour scaling) at the driver level (with or without a performance loss), or must the hardware also support it? Does AMD have any plans to implement this in the near future?

                  There is definitely no need for fancy interpolation algorithms (bilinear, bicubic, etc.) when upscaling 1080p to 4K. Just copy pixels (1 to 4) and the picture will be perfect: as crisp and solid as on a 1080p display, or even better. I just bought a 24" UHD monitor and am ready to buy a top gaming video card (Radeon or any other) that supports selecting the scaling method. While it's possible to play all recent games on something like a Fury X, there are still many games that are 5+ years old (and even some new ones) that support 3840x2160 but lag (due to engine limitations) or can't properly scale UI elements (like the HUD, chat, buttons, and text fields), so they're just unplayable at that resolution.

                  • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
                    mt_

                    I've created a web-based DEMO that illustrates how non-blurry scaling should work (please use a modern web browser to view it). It can probably give driver developers an idea of how to implement the feature we need, and of what exactly we need.

                     

                    It takes a raster image and scales it up by representing each image pixel as a square group of an INTEGER number of physical pixels of the same color, so that the available space is filled as fully as possible while the resulting scaled image is absolutely free of blur. The remaining space is filled with gray.

                     

                    The image is rescaled automatically when the browser window is resized. The current number of physical pixels per image pixel is displayed in the top right corner of the webpage (this number depends on the actual size of the browser window). To make the difference even clearer, blur can be enabled via the corresponding checkbox, so that you can immediately compare how the image looks with and without blur. Thanks.

                     

                    UPDATE: fixed a bug that caused the scaled image to have the wrong size at an OS-level scale other than 200%.
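                    For anyone curious, the core logic looks roughly like this (a simplified sketch of the approach, not the demo's actual source; the element handling is just a placeholder):

                    ```typescript
                    // Sketch: draw an image at the largest integer factor that fits the window,
                    // with filtering disabled so pixels are simply repeated, and gray around it.
                    function drawIntegerScaled(img: HTMLImageElement, canvas: HTMLCanvasElement): number {
                      const physW = window.innerWidth * devicePixelRatio;   // window size in physical pixels
                      const physH = window.innerHeight * devicePixelRatio;
                      const factor = Math.max(1, Math.floor(Math.min(physW / img.naturalWidth,
                                                                     physH / img.naturalHeight)));
                      canvas.width = physW;                                 // canvas backing store in physical pixels
                      canvas.height = physH;
                      canvas.style.width = window.innerWidth + 'px';        // CSS size stays equal to the window
                      canvas.style.height = window.innerHeight + 'px';
                      const ctx = canvas.getContext('2d')!;
                      ctx.fillStyle = 'gray';                               // unused space is filled with gray
                      ctx.fillRect(0, 0, physW, physH);
                      ctx.imageSmoothingEnabled = false;                    // no bilinear filtering: pure pixel repetition
                      ctx.drawImage(img, 0, 0, img.naturalWidth * factor, img.naturalHeight * factor);
                      return factor;                                        // the number shown in the corner of the page
                    }

                    // Redraw whenever the window is resized, e.g.:
                    // window.addEventListener('resize', () => drawIntegerScaled(sourceImg, viewCanvas));
                    ```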

                      • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
                        okarin

                        Thank you for the demo! (Although I think the devs don't answer in this thread not because they can't understand what we need, but because they're too lazy/busy/not interested/etc.) Btw, "Enable blur" doesn't work in my browser (Firefox), but this is probably because I have image-rendering: -moz-crisp-edges in my userContent.css

                          • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
                            mt_

                            "Enable blur" doesn't work in my browser (firefox), but this is probably because I have image-rendering: -moz-crisp-edges in my userContent.css

                            Exactly. When the checkbox is checked, the `canvas` element used for non-blurry scaling is replaced with the original image scaled by the browser (browsers use bilinear interpolation for images by default, but you've overridden that in your user stylesheet).

                             

                            Fwiw, nearest-neighbour interpolation is lossless solely at integer physical-size-to-image-size ratios (and applying lossless interpolation at integer ratios is the entire point of the feature we're requesting here). But for most images on the web the ratios are not integer, so with such a style forced globally you inevitably lose quality.

                              • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
                                okarin

                                About quality: I'm using 200% scaling in my system/browser (as all 4K display users do, I guess), so it's integer (2x), isn't it? Anyway, you are right about the poor quality, but the current browsers I use (Firefox and Chromium) use a bilinear filter for upscaling, which produces blurred images (that's why I'm using -moz-crisp-edges globally). It would be much better if they used Lanczos3 or at least a bicubic filter, but they don't (and I can't find a way to force them to).

                                  • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
                                    mt_

                                    I'm using 200% scaling in my system/browser (as well as all 4k display users do, I guess), so it's integer (2x), isn't it?

                                    Yes, the ratio is integer (2×2 physical pixels per image pixel at a scale of 200%), but only for images whose specified sizes are equal to their natural sizes, and that's not always the case.

                                    • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
                                      mt_

                                      okarin For what it's worth, I've created a Firefox extension, SmartUpscale, that prevents blurring of images whose size in physical pixels is an integer number of times larger than their natural size. Compared with the global user stylesheet you are using, the extension applies image-rendering: -moz-crisp-edges only to those images that wouldn't lose quality from it.

                                       

                                      There is also the ability to limit the maximum image zoom (not to be confused with page zoom) at which blur should be prevented (unlimited by default). For example, with a maximum zoom of 3, blur is prevented while each image pixel is displayed as a group of 2×2 or 3×3 physical pixels, but at 4x and above (when image pixels get too large and noticeable on a 24″ 4K monitor) the image is blurred. It is also possible to disable blur globally (like your user stylesheet does) if needed.
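                                      The core check is simple; here is a rough sketch of the logic as a content script (illustrative only, not the extension's actual source):

                                      ```typescript
                                      // Sketch: mark an image as safe to pixel-repeat when its displayed size in
                                      // physical pixels is an exact integer multiple of its natural size.
                                      const MAX_ZOOM = 3; // example of the configurable zoom limit described above

                                      function applyCrispIfInteger(img: HTMLImageElement): void {
                                        const ratioX = (img.clientWidth * devicePixelRatio) / img.naturalWidth;
                                        const ratioY = (img.clientHeight * devicePixelRatio) / img.naturalHeight;
                                        const integerRatio = Number.isInteger(ratioX) && ratioX === ratioY;
                                        if (integerRatio && ratioX >= 1 && ratioX <= MAX_ZOOM) {
                                          // 'pixelated' is the standard value; Firefox also accepts '-moz-crisp-edges'
                                          img.style.setProperty('image-rendering', 'pixelated');
                                        }
                                      }

                                      // Apply to images once they have loaded:
                                      document.querySelectorAll('img').forEach(img => {
                                        if (img.complete) applyCrispIfInteger(img);
                                        else img.addEventListener('load', () => applyCrispIfInteger(img));
                                      });
                                      ```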

                                • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
                                  bigtoe

                                  Nice work, it certainly helps to illustrate what we want.

                                • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
                                  istario

                                  I just wanted to add that I am also very interested in this feature.  I have a 4k Freesync monitor and this would be nice to have.

                                  • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
                                    maspok

                                    I support this thread. Why has this interpolation method still not been introduced? 4K TVs use dedicated processors to upscale Full HD content to 4K resolution, and such films look better on 4K TVs than on Full HD TVs. Why can't the same be done for monitors? Maybe AMD is afraid that no one will buy top-end video cards for 4K modes? Everyone would buy a mid-range video card for gaming at Full HD and be glad they don't need to buy a card for UltraHD. But that just isn't true! Hardcore gamers will still try to play at maximum graphics settings at native 4K resolution; top-end cards aren't going anywhere. Or will AMD not implement the simplest interpolation method until all the Full HD monitors have been sold? I want to buy a 4K monitor for under 700 dollars. Unfortunately not an OLED, since manufacturers still can't mass-produce them. But a monitor is bought for a much longer time than a video card. And how many years do I need to wait until a single-chip video card appears that can deliver over 100 FPS in the most demanding games at 4K resolution? I have read many Russian-language forums on this topic; plenty of people in Russia and the CIS are waiting for a solution to this problem. If NVIDIA takes this step first, I will forget about my love for ATI video cards.

                                    Thank you.

                                     

                                    Sorry for writing in Russian, but I didn't want to use Google Translate.

                                      • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
                                        mt_

                                        Upscaling Full HD to 4K in TVs is done with special interpolation algorithms (including, probably, using data from previous and subsequent frames to reconstruct missing detail) and with noticeable latency, which would be unacceptable for a monitor.

                                         

                                        This thread is about something entirely different: scaling that is free of pointless blur, done by simply repeating pixels without any algorithmic processing. The only "complexity" here is a conditional branch based on whether the scaling ratio (the physical resolution of the screen divided by the logical resolution of the signal) is an integer: if it is (for example, 2 on each axis), scale by pixel repetition with no blur; if it is fractional (for example, 1.5), scale the way it is done now on all monitors and video cards, with bilinear or bicubic interpolation and its unavoidable blur.
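                                        In code, that branch is essentially this (illustrative sketch only, not driver code):

                                        ```typescript
                                        // Sketch of the branching described above: pick pixel repetition only when
                                        // the native-to-signal ratio is an exact integer on both axes.
                                        function chooseScalingMethod(nativeW: number, nativeH: number,
                                                                     signalW: number, signalH: number): 'pixel-repetition' | 'bilinear' {
                                          const ratioX = nativeW / signalW;   // e.g. 3840 / 1920 = 2
                                          const ratioY = nativeH / signalH;   // e.g. 2160 / 1080 = 2
                                          return Number.isInteger(ratioX) && Number.isInteger(ratioY)
                                            ? 'pixel-repetition'              // integer ratio: repeat pixels, no blur
                                            : 'bilinear';                     // fractional ratio (e.g. 1.5): filter as today
                                        }
                                        ```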

                                      • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
                                        mt_

                                        I'm curious whether the new Crimson ReLive Edition makes any progress on non-blurry integer-ratio scaling.

                                        • Re: GPU Scaling: Please Let Us Choose the Interpolation Method
                                          mt_

                                          There is now a corresponding petition on Change.org.