Nonblurry integer-ratio scaling
- Marat Tanalin
- Scaling built into monitors and graphics drivers leads to blur. The issue could be solved by implementing integer-ratio scaling via graphics drivers, but for now, there are partial solutions and indirect ways to encourage developers of graphics drivers to implement lossless scaling.
In a nutshell (tl;dr)
When the video signal’s resolution is lower than the display’s physical resolution, monitors and graphics cards add blur to the image even when this could be avoided. As a result, Full HD on a 4K monitor looks worse than on a Full HD monitor.
When scaling, each logical pixel could be displayed as a square group (2×2, 3×3) of an integer number of physical pixels of the same color, with the resulting image centered on the screen. See the live demo.
- Sign the petition about implementing nonblurry scaling in graphics drivers.
- Spread the word about the blur issue to your friends and specify links to the petition and the article.
- Express your support in forums: GeForce, AMD, Intel, Reddit.
- Complain about blur to technical support: nVidia, AMD (1, 2) or Intel (1, 2).
- Request adding the feature to popular game engines: Unreal Engine, Unity, CryEngine.
What scaling is
Scaling is resizing (changing resolution of) an image. It is needed if physical resolution (number of pixels in horizontal and vertical directions) of a monitor or a TV set is different from the resolution of the video signal it receives: e. g. if a game in the resolution of 1920×1080 (Full HD) is running in full-screen mode on a monitor with physical resolution of 3840×2160 (4K).
The blur issue
With all monitors, graphics cards and most TV sets, scaling always leads to losing sharpness at any scaling ratio. Such sharpness loss is perceived as blur and is an irreversible quality loss.
But actually, blur is only inevitable if the monitor’s resolution is not a multiple of the signal’s resolution. For example, it’s impossible to display one logical pixel as 1.5 physical pixels, so we are forced to use interpolation that calculates average colors of adjacent pixels of the image being scaled.
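As an illustration (a minimal Python sketch, not an actual driver implementation; grayscale values stand in for colors), scaling a row of pixels at a fractional ratio with linear interpolation necessarily produces blended in-between colors:

```python
def lerp(a, b, t):
    """Linear interpolation between two grayscale values."""
    return round(a + (b - a) * t)

def scale_row_linear(row, new_width):
    """Resample a row of grayscale pixels with linear interpolation."""
    out = []
    for i in range(new_width):
        # Map the output pixel back to a fractional source position.
        pos = i * (len(row) - 1) / (new_width - 1)
        left = int(pos)
        right = min(left + 1, len(row) - 1)
        out.append(lerp(row[left], row[right], pos - left))
    return out

# A black pixel and a white pixel scaled to 3 pixels (ratio 1.5)
# produce a gray pixel in between — this blending is the blur.
print(scale_row_linear([0, 255], 3))  # [0, 128, 255]
```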
Solution — integer-ratio scaling
If the scaling ratio is not fractional (e. g. 1.5), but integer (e. g. 2 or 3), the blur can be avoided just by displaying each logical pixel as a square group of an integer (e. g. 2×2 or 3×3) number of physical pixels of the same color — i. e. just by duplicating the corresponding pixel of the original image multiple times, without any diffusion of colors between sibling pixels.
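The pixel-duplication approach described above can be sketched in a few lines of Python (an illustrative sketch, not tied to any particular driver; numbers stand in for pixel colors):

```python
def integer_scale(image, k):
    """Scale a 2-D image by the integer factor k by duplicating each
    pixel into a k-by-k block. No colors are mixed, so no blur is added."""
    out = []
    for row in image:
        # Repeat each pixel k times horizontally...
        scaled_row = [px for px in row for _ in range(k)]
        # ...and repeat the resulting row k times vertically.
        for _ in range(k):
            out.append(list(scaled_row))
    return out

img = [[1, 2],
       [3, 4]]
print(integer_scale(img, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```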
More than nearest neighbour
Blur can be avoided even if resolutions are not divisible: it’s enough to scale with an integer ratio so that the image fills the screen as fully as possible, and to fill the remaining screen space with a black background, the same way as when the image is centered on the screen with no scaling.
For example, we can display a 1280×1024 image on a 3840×2160 screen by displaying each image pixel as a group of 4 (2×2) identical physical pixels with black margins of 56 physical pixels above and below, and 640 pixels — leftward and rightward.
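The ratio and margin arithmetic from the example above can be expressed as a small Python sketch (the function name is illustrative):

```python
def fit_integer(src_w, src_h, dst_w, dst_h):
    """Pick the largest integer ratio at which the image still fits the
    screen, and the black margins needed to center the scaled image."""
    k = min(dst_w // src_w, dst_h // src_h)
    margin_x = (dst_w - src_w * k) // 2  # left and right margins
    margin_y = (dst_h - src_h * k) // 2  # top and bottom margins
    return k, margin_x, margin_y

# 1280x1024 on a 4K screen: ratio 2, 640-pixel side margins,
# 56-pixel margins above and below.
print(fit_integer(1280, 1024, 3840, 2160))  # (2, 640, 56)
```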
This is the key difference of integer-ratio scaling from scaling via nearest-neighbour interpolation. Their results are only identical at integer ratios of physical and logical resolutions. At fractional ratios, nearest-neighbour interpolation results in image distortion, while with integer-ratio scaling, the image is always scaled with an integer ratio and with no quality loss — even if physical and logical resolutions are not divisible.
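For comparison, a minimal sketch of nearest-neighbour resampling (illustrative, using the common floor-mapping variant) shows the distortion at a fractional ratio: each output pixel copies one source pixel, so at a ratio like 1.5 some source pixels end up wider than others:

```python
def scale_row_nearest(row, new_width):
    """Nearest-neighbour resampling of a pixel row: each output pixel
    copies one source pixel (floor mapping). At fractional ratios,
    source pixels are duplicated unevenly, distorting proportions."""
    src_w = len(row)
    return [row[i * src_w // new_width] for i in range(new_width)]

# Ratio 1.5: the four source pixels get widths 2, 1, 2, 1 —
# visible distortion, unlike integer-ratio scaling.
print(scale_row_nearest([10, 20, 30, 40], 6))
# [10, 10, 20, 30, 30, 40]
```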
The differences between lossless integer-ratio scaling and bicubic interpolation with blur are clearly evident on a pixel-art image:
See also the photos of the same screenshot of the Reaper application taken using the three scaling methods:
For better understanding of what nonblurry integer-ratio scaling is, see the web demo.
Why it’s important
- With 4K monitors of 24–27″ size at Full HD resolution, single image pixels are almost indistinguishable, so blur does not add any smoothing and just senselessly decreases sharpness, resulting in an unreasonable quality loss.
- All 4K monitors use scaling with blur, so the issue can be solved only via graphics driver.
- In the nVidia G-Sync mode, monitor’s own scaling feature is not used (1, 2), so the only way to scale properly is scaling via graphics driver.
- Graphics drivers of nVidia and AMD already implement resolution-virtualization technologies (nVidia DSR and AMD VSR), so the only missing technical piece is support for ratios below 1.
Purchasing a higher-performance graphics card is not a solution:
- many games contain bitmap elements (e. g. menu in “Half-Life 2” and control panels in “Cities XL Platinum”) which get small at 4K resolution;
- some games (e. g. “Duck Tales Remastered”) have a real resolution of Full HD, while support for other resolutions is achieved via blurry scaling;
- some games work incorrectly at 4K resolution (e. g. in the “Bionic Commando Rearmed” game, puzzle elements are invisible in “Hack enemy network” sublevels).
A more powerful GPU cannot help here.
- From the performance perspective, scaling by pixel duplication should work much faster than bilinear or bicubic interpolation, so using integer-ratio scaling could decrease or eliminate the lag introduced by scaling.
- Users have a right not to suffer unreasonable quality loss when working at resolutions different from the physical resolution of the display.
Partial solutions
- Using scaling built into a specific full-screen application (e. g. a game). This is typically available only in 2D games: e. g. the pixel-art-oriented Owlboy.
- For scaling 3D games, the freeware GeDoSaTo application can be used. Drawbacks:
- only supports relatively old games that use DirectX 9 or older;
- works with just some of them, and only on some systems;
- modern versions of the application don’t work at all; yours truly was only able to successfully use one of its previous versions, Beta 10 Poltergeist.
- For windowed applications — Windows Magnifier. Drawbacks: an inconvenient user interface, lag and some jerkiness.
- For running DOS games — the DOSBox emulator with certain settings applied.
- DgVoodoo (Glide emulator and wrapper for DirectX 8 and older) and DXGL (OpenGL implementation of DirectDraw) reportedly support the nonblurry scaling algorithm “Nearest Neighbour” (sources: 1, 2).
- For viewing images — the freeware XnView application with the checkbox Tools → Options → View → High quality zoom → Enlarge unchecked. Limitation: the option is applied to all viewed images regardless of the ratio of their size to the size of the application’s window, and regardless of whether full-screen mode is used.
- For playing videos — the freeware MPC-HC player with the setting View → Options → Playback → Output → Resizer → Nearest neighbor. Limitation: the option is applied to all played videos regardless of the ratio of their frame size to the size of the application’s window, and regardless of whether full-screen mode is used.
- For windowed Windows applications incompatible with HiDPI (non-DPI-aware) — using Windows 10, where, unlike Windows 7, old applications are automatically scaled with no blur at integer Windows zooms.
- To disable unreasonable blur of images on web pages — the SmartUpscale extension for Firefox browser.
- There is reportedly no blur when using full-screen scaling with integer ratios via Intel’s official graphics driver for the Linux operating system.
- Some 4K TVs by Panasonic reportedly support displaying Full HD signal with no blur. For example, in the TX-55CX802B model, it is available via the “1080p Pixel by 4 pixels” option in the “Picture → Option Settings” menu. The feature is apparently intended solely for use with Full HD (1920×1080) resolution and does not work at different resolutions of input video signal such as 1280×720. That said, the similar smaller-size model — TX-50CX802 — does not have the feature. A similar feature is reportedly available in the “Graphics” mode in some models of 4K TVs by Sony (e. g. X900E).
Possible solutions
- Scaling via graphics driver.
- A monitor emulator that intercepts the image to be displayed on monitor, scales it and displays the scaled image on the physical monitor.
- Reverse engineering of existing graphics drivers with the purpose of enabling upscaling in addition to the downscaling already available in existing features — nVidia DSR or AMD VSR.
- Scaling built into the monitor itself. The advantage compared with scaling via graphics driver is saving the graphics interface’s bandwidth and a potential decrease in electromagnetic radiation from the signal cable.
- A device inserted between the graphics card and the monitor. There is a similar device, UltraHDMI, intended to output the video signal of the Nintendo 64 console via HDMI with no loss (review). Such a device could probably be based on an FPGA: 1, 2. Potential drawbacks:
- an extra lag is possible;
- may have a negative impact on compatibility with the HDCP technology required for playing protected content. This is solvable with a switch that temporarily enables passing the video signal through unaltered. The switch could be either physical or controllable from the computer via USB.
What you can do right now
- Sign the petition on Change.org about implementing nonblurry integer-ratio scaling in graphics drivers.
- Spread the word about the blur issue to as many people as possible and specify links to the petition and this article.
- Express your support in the corresponding existing threads on the GeForce, AMD and Intel forums, and also on Reddit.
- Complain about the low quality of scaling via graphics driver to your GPU vendor’s technical support: nVidia, AMD (1, 2) or Intel (1, 2).