So, a few days ago I downloaded a new driver (22.4.2) to update from my previous 21.7.2, and found out that VSR can now finally downscale from up to 8K (7680x4320). Earlier the limit was 5K (5120x2880) even for native 4K displays, which was sub-par at best, if not outright disrespectful to owners of higher-end GPUs like the RX 6800, 6800 XT and 6900 XT. With that I can finally effectively supersample most of my older games, since "Override application settings" in the control panel still does jack 95% of the time, except for some OpenGL titles.
But I can't help noticing that the downscaling algorithm seems very different from the one I had on my 2080 Ti. As far as I know, Nvidia uses a 13-tap Gaussian filter for DSR downsampling, but what does AMD's VSR employ? Is it the standard bilinear interpolation that GPUs do by default if not ordered otherwise, or bicubic, or some custom method? And what is the sampling pattern - ordered grid, rotated grid, sparse grid, or some form of stochastic sampling? I can't seem to make Google answer these questions.
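To make the question concrete, here's a toy sketch of the difference I'm asking about (Python/NumPy, grayscale only, exact 2:1 ratio, filter parameters are my own guesses - this is NOT anyone's actual shader): a plain bilinear/box average, which at an integer ratio just averages each 2x2 block, versus a Gaussian-weighted downsample in the spirit of what DSR reportedly does.

```python
# Toy comparison of downscaling filters, assuming a simple 2:1 integer
# downscale. Not AMD's or Nvidia's actual code - just an illustration.
import numpy as np

def downscale_bilinear_2x(img):
    """2:1 downscale with a plain bilinear/box filter.

    At an exact 2:1 ratio, bilinear filtering degenerates into
    averaging each 2x2 block of source pixels.
    """
    h, w = img.shape
    return (img[0:h:2, 0:w:2] + img[1:h:2, 0:w:2]
          + img[0:h:2, 1:w:2] + img[1:h:2, 1:w:2]) / 4.0

def downscale_gaussian_2x(img, sigma=1.0):
    """2:1 downscale through a separable Gaussian blur, loosely in the
    spirit of DSR's multi-tap Gaussian (tap count and sigma here are
    my guesses, not Nvidia's actual parameters).
    """
    radius = 2                                   # 5 taps per axis here, not 13
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    k /= k.sum()                                 # normalize the kernel weights
    # Separable blur: filter rows first, then columns.
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, img)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, blurred)
    return blurred[::2, ::2]                     # then decimate to the target grid

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    src = rng.random((8, 8))                     # stand-in for a rendered frame
    print(downscale_bilinear_2x(src).shape)      # (4, 4)
    print(downscale_gaussian_2x(src).shape)      # (4, 4)
```

The visual difference is exactly what I'm seeing: the wider Gaussian footprint pulls in pixels from neighboring blocks, trading a bit of softness for less shimmering, while the 2x2 average keeps things sharper but aliases more. I just want to know which of these (or what else) VSR actually does.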
P.S. While searching for information on VSR downscaling specifics, I found a fascinating 2019 article on various antialiasing methods; here's the link for anyone interested: https://www.overclockersclub.com/reviews/serious_statistics_aliasing/4.htm