Integer Scaling Explored: Sharper Pixels For Retro And Modern Games
Windows has a long history of maintaining backwards compatibility, so of course many older PC games still run today on Windows 10, even if they require some compatibility tweaks. Don't forget about GOG—formerly known as Good Old Games—which still sells big piles of older PC games alongside newer titles like CD Projekt Red's Witcher series. These days you can hardly swing a cat without hitting an emulator or an older PC game, and on top of that there are plenty of fresh retro-inspired titles like Shovel Knight, Hotline Miami, and Stardew Valley all writing love letters to years gone by. Developers built these games and many more—Papers, Please, Beat Cop, Sonic Mania...the list goes on forever—with what we nowadays call pixel art.
Pixel art and standard-definition graphics
Back in the 80s and 90s when Nintendo and Sega fought for dominance, pixel art was just "art". For example, the Sega Genesis could display any color you wanted as long as it fit into a nine-bit color space with 512 distinct colors. Sega's 16-bit console could display 61 colors at a time at a resolution of either 320x224 or 256x224, non-interlaced, through the NTSC TV standard. These days video standards are named by their resolution, so we call it 240p since it's a progressive (non-interlaced) image with 240 lines, not pixels, of resolution. If you're interested in the topic of 240p and the science behind television signals, check out Displaced Gamers' History of 240p on YouTube. Recent titles see this kind of limitation as an opportunity to direct the art, and these games are beautiful in their own way.

The problem that modern hardware has with all of these games, both young and old, is scaling the image to fit our LCD or OLED displays. That's where both Nvidia and Intel come in. At Gamescom, Nvidia and Intel separately introduced updated drivers that can scale resolutions on the GPU using integer scaling. Integer scaling sounds pretty self-explanatory, and it is: the selected resolution is multiplied by whole numbers to fit the screen. For example, a monitor with a resolution of 2560x1440 can be filled with a 1280x720 image by doubling the height and width of every pixel. It looks kind of blocky as a Windows desktop resolution, but for the right kind of game, like retro or retro-inspired titles, it gives razor-sharp edges to the pixels that make up the image on your monitor. Other perfect (or mostly-perfect) multiples include 512x288, 640x360, and 853x480 (that last one tripled comes up one pixel short of 2560, hence "mostly").
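If you'd like the arithmetic spelled out, here's a rough Python sketch of the idea; the function is our own illustration rather than anything from Nvidia's or Intel's drivers, which do this work in the GPU.

```python
def integer_scale_factor(src_w, src_h, dst_w, dst_h):
    """Largest whole-number multiplier that keeps the scaled image on the panel."""
    # Each source pixel becomes a factor-by-factor block of identical pixels,
    # so the result is (src_w * factor) wide and (src_h * factor) tall.
    factor = min(dst_w // src_w, dst_h // src_h)
    return max(factor, 1)  # never shrink below 1:1

print(integer_scale_factor(1280, 720, 2560, 1440))  # 2 -- a perfect fit
print(integer_scale_factor(1280, 720, 3840, 2160))  # 3 -- also perfect
print(integer_scale_factor(853, 480, 2560, 1440))   # 3 -- one pixel shy of full width
```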
What do you do with resolutions that aren't a perfect match? A typical 1920x1080 desktop or laptop monitor only fits 240 lines 4.5 times. A typical monitor's built-in scaler, and the average graphics driver by default, will stretch the image to fill the screen somehow. It might be as simple as applying a bilinear filter, as SNES9x has done since the late 90s, or as advanced as running a pixel shader, as emulators like Higan or RetroArch can do. Either way, you lose that razor-sharp edge and the image gets a little muddy no matter what. Instead, you can center the image on the display as large as it will go in whole-pixel multiples and surround it with blank space. That black border approach is the route both Nvidia and Intel chose for their latest drivers.
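Here's how the black-border math shakes out in the same sketch style (again, our own illustration, not driver code): pick the biggest whole-number factor that fits, center the result, and leave everything around it black.

```python
def centered_viewport(src_w, src_h, dst_w, dst_h):
    """Where an integer-scaled image lands on the panel: (x, y, width, height)."""
    factor = max(min(dst_w // src_w, dst_h // src_h), 1)
    out_w, out_h = src_w * factor, src_h * factor
    # Center the scaled image; the surrounding area is simply left black.
    return (dst_w - out_w) // 2, (dst_h - out_h) // 2, out_w, out_h

# A 320x240 image on a 1080p panel: 4x scale to 1280x960, centered with a
# 320-pixel border on each side and 60 pixels top and bottom.
print(centered_viewport(320, 240, 1920, 1080))  # (320, 60, 1280, 960)
```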
Integer scaling for a modern age
Even modern 3D games can benefit from this. For example, a GeForce RTX 2060 is not going to have a problem accelerating even the most gorgeous ray tracing-enabled game at 1080p, but this is the age of high-resolution displays. What if you pair an RTX 2060 with a 4K (3840x2160) or even a 5K (5120x2880) display? Suddenly you're trying to use those RTX features on four times as many pixels, which may not work so well. However, if you integer scale 1920x1080 to a 4K display, you get a nice even 2x scale. Lower resolutions scale just as well: 1280x720 fits 2560x1440 perfectly at a 2x scale, and 3840x2160 by scaling each pixel by a factor of three. Suddenly those super high-end displays can still look pretty sharp without the blur usually associated with non-native resolutions. This is an even bigger deal when we talk about systems with Intel's integrated graphics.

Let's take a look at Nvidia's control panel to see how it works.
While both Intel and Nvidia have released drivers for this, we're going to use our GeForce RTX 2080 Super for our screen captures, but the feature should work the same on any supported GPU. Intel's version, called Retro Scaling, works and looks exactly the same, which is great if you don't have a discrete GPU in your PC. In the image above, you can see how Nvidia configures the "Adjust desktop size and position" section of its driver control panel by default. If you have a Turing-based GPU in your PC, the control panel in driver version 436.02 or later exposes that fourth option for integer scaling. Watch how the sample resolution mock-up below the drop-down changes when we pick Integer Scaling: the box, which represents a 1600x1200 image, is now centered on the screen with a black border all the way around instead of being pillarboxed.
Below the radio buttons, you can choose where scaling happens: either on the GPU or on the display. Many monitors don't cope well with images that don't match the panel's aspect ratio, so Nvidia defaults to the GPU, but you can switch back and forth to see what looks best to you. However, when you choose Integer Scaling, the driver disables the "Perform scaling on" drop-down and selects the GPU, because your PC's graphics card is running this show. You won't see anything different at the monitor's native resolution, but when you change to a resolution lower than native, this scaling method kicks in.
In Intel's control panel, the feature is called Retro Scaling, but it works the exact same way. Unfortunately, AMD hasn't added integer scaling to its Radeon drivers, but we hope it joins the club soon.
How we tested
Screenshots aren't useful for judging the quality of on-GPU scaling, since Windows captures the screen at the selected resolution; in this case a 1280x720 image is going to be pretty boring. To see the scaled image, we had to bust out our trusty AVerMedia Live Gamer Mini streaming box to grab the output of one PC and capture it on another. This little box is really easy to use with the included RECentral 4 software. Just plug the HDMI output from your graphics card into the HDMI In on the box, attach the USB 3.0 cable to your capture machine, and connect an HDMI cable from the HDMI Out on the box to the source machine's monitor. EDID data, which tells your source machine about the display, comes from the monitor, so whatever your monitor can handle is what's sent to the Live Gamer Mini. If you want to stream to Twitch, YouTube Live, or several other streaming services, either individually or all at once, RECentral 4 lets you do that while optionally capturing audio and a second video stream from a microphone or webcam.
Since this external capture card can only handle 1080p at 60 frames per second, we stuck with a basic 1080p monitor to see how the scaling goes. AVerMedia's RECentral 4 can capture still images and video alike, so we grabbed a healthy dose of both. Unfortunately, that means you'll have to use your imagination if you've got a high-end display, but hopefully you'll get an idea of how to gain some extra performance at a lower resolution without making your fancy 4K display a sloppy mess. To further illustrate that point, we've also applied 2x scaling to a selection of our captured images using nearest-neighbor scaling, which retains the sharp edges while making everything physically four times larger: two times horizontally and two times vertically.
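If you want to reproduce that last step yourself, a nearest-neighbor 2x enlargement takes only a few lines with the Pillow imaging library; the filenames here are placeholders, and this is a quick sketch rather than the exact tooling we used.

```python
from PIL import Image  # the Pillow imaging library

capture = Image.open("capture_720p.png")
doubled = capture.resize(
    (capture.width * 2, capture.height * 2),
    resample=Image.NEAREST,  # duplicate pixels instead of blending them
)
doubled.save("capture_720p_2x.png")
```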
With all that said, let's see how it looks in a whole bunch of older and older-looking games.