Sometimes you need to fly a flag in your project. You probably know the ISO code, more specifically the ISO 3166-1 alpha-2 code, a.k.a. the two-letter code. For Austria this would be AT. Now when thinking about converting these codes into emoji flags, a lookup table comes to mind. But instead of jumping straight into implementing the table, it’s worth taking a closer look at the Unicode Standard. The clever people who added the flag emojis to the Unicode character set devised a system for specifying the flags that makes an algorithmic conversion without lookup tables quite easy.
Regional Indicator Symbols
Each flag emoji defined in the Unicode standard is created by concatenating two regional indicator symbol characters. These symbols correspond to the 26 letters of the English alphabet, so the letter A corresponds to the regional indicator symbol 🇦, B to 🇧, and so on. Now it’s possible to create a flag by just converting the two-letter ISO code of the country to the corresponding regional indicator symbols: A becomes 🇦 and T becomes 🇹. When concatenated into a single string, this becomes the Austrian flag: 🇦🇹 - easy!
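As a quick illustration of this mapping, here is a minimal Python sketch (the names `OFFSET` and `letter_to_indicator` are mine, not from the standard):

```python
# Offset between "A" (U+0041) and REGIONAL INDICATOR SYMBOL LETTER A (U+1F1E6).
OFFSET = 0x1F1E6 - ord("A")

def letter_to_indicator(letter: str) -> str:
    """Map a single uppercase ASCII letter to its regional indicator symbol."""
    return chr(ord(letter) + OFFSET)

# "A" + "T" -> the two indicator symbols that render as the Austrian flag.
print(letter_to_indicator("A") + letter_to_indicator("T"))  # 🇦🇹
```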
Since the Unicode scalar values for the letters A through Z are nicely ordered (thanks, ASCII!), as are the corresponding regional indicator symbols, we can convert the two-letter code into the emoji flag by simply adding a fixed offset to each scalar value.
I’ve added a couple of checks to the function to make sure that only valid flag emojis are returned.
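A complete conversion function might look like the following Python sketch (the function name and the exact checks are my assumptions, not the original implementation). The checks only guarantee a well-formed two-letter input; verifying that the code is an actually assigned country code would require the very lookup table this approach avoids.

```python
OFFSET = 0x1F1E6 - ord("A")  # distance from "A" to REGIONAL INDICATOR SYMBOL LETTER A

def flag_emoji(country_code: str):
    """Convert an ISO 3166-1 alpha-2 code (e.g. "AT") to its emoji flag.

    Returns None unless the input is exactly two ASCII letters,
    so only strings that can form a flag sequence are converted.
    """
    code = country_code.upper()
    if len(code) != 2 or not code.isascii() or not code.isalpha():
        return None
    return "".join(chr(ord(c) + OFFSET) for c in code)

print(flag_emoji("AT"))  # 🇦🇹
print(flag_emoji("A1"))  # None
```

Lowercase input is accepted because the code is uppercased before the offset is applied; everything else that is not two ASCII letters is rejected up front.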