Tuesday, April 30, 2019

Fixing black border in your game image with ImageMagick.

Star with unintended black border
What's with that black around the star? It's because the image uses "transparent black" (or in CSS notation, rgba(0, 0, 0, 0)) for all fully transparent pixels, and white with varying alpha everywhere else. This is not a problem for image editing software like Photoshop, but it is a problem for OpenGL, especially if you use linear interpolation to resize your image.

What actually happened? If linear interpolation is enabled, the GPU will sample between the white and the "transparent black" pixels, which results in a gray color with an alpha around 0.5. This is not what you want, as it gives your image an unintended black border, which may or may not be bad for your game.

A solution for this is to modify your image to have "transparent white" instead. Based on this answer, and assuming you have ImageMagick version 7 or later, I came up with this command:

magick convert input.png -channel RGB xc:"#ffffff" -clut output.png

And here's the result.
Star without black border around it.

Now, what happens here is that we replace all color channels with 255 (resulting in white) but keep the alpha values intact. The GPU will then see every pixel as white, varying only in alpha, so it only has to interpolate the alpha channel.

And if you plan to pass your image to zopflipng, make sure not to pass --lossy_transparent, as that option changes all completely transparent pixels back to "transparent black", which is the source of the problem.

UPDATE: The ImageMagick command above won't work for images with multiple colors. I forked the alpha-bleeding program, which uses LodePNG, to ease MSVC compilation; it can be found here: https://github.com/MikuAuahDark/alpha-bleeding.

Thursday, April 25, 2019

VS2013 RTM cl.exe and "Use Unicode UTF-8 for worldwide language support"

There's a feature in Windows 10 that lets you pass UTF-8 strings to the C function fopen and other ANSI WinAPI functions. This makes Windows feel Unix-like, where fopen expects a UTF-8 filename. However, this doesn't mean everything works as expected: Microsoft warns that the feature may break applications which assume a multi-byte character is at most 2 bytes. And unfortunately, this is true for the VS2013 RTM cl.exe compiler.

C:\Users\MikuAuahDark>cl.exe test.c
Microsoft (R) C/C++ Optimizing Compiler Version 18.00.21005.1 for x64
Copyright (C) Microsoft Corporation.  All rights reserved.

test.c
test.c : fatal error C1001: An internal error has occurred in the compiler.
(compiler file 'f:\dd\vctools\compiler\cxxfe\sl\p1\c\p0io.c', line 2812)
 To work around this problem, try simplifying or changing the program near the locations listed above.
Please choose the Technical Support command on the Visual C++
 Help menu, or open the Technical Support help file for more information

The file test.c is an empty file, but that error shows up regardless of which input file you specify. What happened here?

Turns out, there's a check in c1.dll which is basically equivalent to this C code:

#include <windows.h>

CPINFO cpInfo;
UINT chcp = GetACP();
GetCPInfo(chcp, &cpInfo);

if (cpInfo.MaxCharSize > 2)
    internal_error("f:\\dd\\vctools\\compiler\\cxxfe\\sl\\p1\\c\\p0io.c", 2812);
 
It assumes the maximum multi-byte character size is 2 bytes, but in this case I enabled a feature called "Use Unicode UTF-8 for worldwide language support", so this is what happens:
  1. GetACP returns 65001
  2. GetCPInfo returns information about UTF-8 code page, where max char size is 4.
Is there any workaround for this? I'm afraid not. Basically, we must make sure cl.exe doesn't see 65001 as the code page, since there's an explicit check for it. There's Locale Emulator, but that only emulates the locale string, not the code page.

If anyone has found out how to update VS2013 in 2019, please comment below. Yes, using VS2013 is mandatory in my case, because I need to ensure compatibility with Windows Vista, which requires targeting lower than Windows 7 SP1.