C++ is fun

We all know C++ is a powerful yet complex programming language. But it can also be fun. For instance, you can use emojis and other Unicode characters in the source code. Of course, nobody sane would use emojis for identifiers. But it’s possible, and you can have a little fun if you have some spare time.

Identifiers are sequences of digits, underscores, lowercase and uppercase Latin letters, and most Unicode characters. An identifier is not allowed to begin with a digit; it must begin with an underscore, a Latin letter, or a non-digit Unicode character. For more information about the Unicode characters allowed in identifiers see this.
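For example, here is a quick sketch of what is and isn’t allowed (the non-ASCII names are arbitrary choices of mine):

int _count = 0;      // OK: begins with an underscore
int rΓ©sumΓ© = 1;      // OK: Unicode letters are allowed
double π = 3.14159;  // OK: begins with a non-digit Unicode character
// int 2fast = 2;    // error: identifiers cannot begin with a digit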

So here is a little example of a C++ program that uses emojis. Can you figure out what it does?

#include <iostream>

using 🍴 = int;
using πŸ”· = double;

struct 🏠
{
   🍴 😏 (🍴 🍺, 🍴 πŸ₯©)
   {
      return 🍺 + πŸ₯©;
   }
};

int main()
{
   🏠 🏨;
   std::cout << 🏨.😏(1, 2) << '\n';

   constexpr πŸ”· πŸ₯§ = 3.1415927;
   πŸ”· r = 5;
   πŸ”· πŸ”΅ = 2 * πŸ₯§ * r;

   std::cout << πŸ”΅ << '\n';
}

[Screenshot: the program above, as displayed in Visual Studio]

Can we do better (or maybe worse)? Of course we can. We can go Egyptian style and write C++ in hieroglyphs.

π“œπ“‚€π“  <π“­π“‰π“„™π“…Š 𓆦>
π“†Œπ“†šπ“†£ 𓁂𓇽𓁑(𓆦 π“€ π“‰ π“Œ² π“…‡, 𓆦 π“€ π“‰ π“Œ² π“…“) π“‚˜ π“Ήπ“Ž¬π“» π“…‡ π“‹  π“…“; π“‚“

π“œπ“‚€π“  <π“­π“‰π“„™π“…Š 𓆦>
π“­π“‰π“„™π“…Š π“‡£π“ˆŒπ“‚“
π“‚˜
π“…˜π“šπ“†«:
   π“œπ“‚€π“  <π“­π“‰π“„™π“…Š 𓋍>
   π“†Œπ“†šπ“†£ π“ƒšπ“ŒŒπ“‹Ήπ“…€(𓋍&& π“‹—, 𓆦 π“€ π“‰ π“Œ² π“…‡, 𓆦 π“€ π“‰ π“Œ² π“…“)
   π“‚˜
      π“Ήπ“Ž¬π“» π“‹—(π“…‡, π“…“);
   π“‚“
π“‚“;

𓀑𓀽𓃩 𓁩𓉒π“ͺ()
π“‚˜
   π“‡£π“ˆŒπ“‚“<𓀑𓀽𓃩> π“……π“€Žπ“ƒ¦;
   𓀑𓀽𓃩 𓃕 π“ˆˆ π“……π“€Žπ“ƒ¦.π“ƒšπ“ŒŒπ“‹Ήπ“…€(𓁂𓇽𓁑<𓀑𓀽𓃩>, 1, 2);

   π“­π“ƒŒπ“” 𓂁 𓃕 𓂁 𓄃𓃠;
π“‚“

[Screenshot: the hieroglyph program at a larger, more readable size]


If you want to learn more about the Unicode block of Egyptian hieroglyphs, see Unicode Block “Egyptian Hieroglyphs” or Egyptian Hieroglyph.
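Since these hieroglyphs are ordinary Unicode code points (the block spans U+13000 to U+1342F), you can also spell them with universal-character-names instead of typing the glyphs directly. A minimal sketch, assuming your compiler accepts such identifiers:

#include <iostream>

int main()
{
   // \U00013000 is 𓀀 (EGYPTIAN HIEROGLYPH A001), used here as an identifier
   int \U00013000 = 42;
   std::cout << \U00013000 << '\n';
}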

This program prints 3 to the console, although that is harder to figure out because I cheated and didn’t show the whole program. It actually relies on this part:

#include <iostream>

#define π“‚˜ {
#define π“‚“ }
#define 𓂁 <<
#define π“‚„ >>
#define 𓇼 *
#define π“‹  +
#define π“ˆˆ =

#define π“­π“ƒŒπ“” std::cout
#define π“­π“ƒŒπ“… std::cin
#define 𓀑𓀽𓃩 int
#define π“ƒ½π“€Œπ“€“π“° char
#define π“œπ“‚€π“  template
#define π“­π“‰π“„™π“…Š class
#define π“…˜π“šπ“†« public
#define π“€ π“‰ π“Œ² const
#define π“Ήπ“Ž¬π“» return
#define π“†Œπ“†šπ“†£ auto
#define 𓄃𓃠 '\n'
#define 𓁩𓉒π“ͺ main

The equivalent readable code of the above example is as follows:

#include <iostream>

template <class T>
auto add(T const a, T const b) { return a + b; }

template <class T>
class foo
{
public:
   template <class F>
   auto compose(F&& f, T const a, T const b)
   {
      return f(a, b);
   }
};

int main()
{
   foo<int> f;
   int r = f.compose(add<int>, 1, 2);

   std::cout << r << '\n';
}

Visual Studio seems to have problems displaying some Unicode characters, though I couldn’t figure out the pattern. Sometimes they are displayed fine, sometimes they are not. However, the source code is correct and you can successfully build and run a program written in Unicode.

You do have to save the source code using an encoding that allows you to enter the Unicode characters that you want. To do that you have to go to File > Save as … > Save with Encoding and make the appropriate selection.

If you are using Windows 10, there is an emoji application that you can use to type them. Just press Win + . or Win + ;.

For more information on this see How to type emoji on your PC using Windows 10 Fall Creators Update or Windows 10 Tip: Get started with the emoji keyboard shortcut.

But how do you display Unicode characters in the Windows console? You need to use a code page that supports them. Here is an example:

#include "windows.h"
#include <iostream>
#include <codecvt>

std::string utf16_to_utf8(std::u16string utf16_string)
{
   std::wstring_convert<std::codecvt_utf8_utf16<int16_t>, int16_t> convert;
   auto p = reinterpret_cast<const int16_t *>(utf16_string.data());
   return convert.to_bytes(p, p + utf16_string.size());
}

std::string utf32_to_utf8(std::u32string utf32_string)
{
   std::wstring_convert<std::codecvt_utf8<int32_t>, int32_t> convert;
   auto p = reinterpret_cast<const int32_t *>(utf32_string.data());
   return convert.to_bytes(p, p + utf32_string.size());
}

int main()
{
   // switch the console output code page to UTF-8 so the converted bytes render
   if(IsValidCodePage(CP_UTF8))
      SetConsoleOutputCP(CP_UTF8);

   std::cout << utf16_to_utf8(u"♠♣β™₯♦") << '\n';
   std::cout << utf32_to_utf8(U"❷⓿❢❾") << '\n';
}

The output of this is as follows:

♠♣β™₯♦
❷⓿❢❾
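As a side note, the <codecvt> conversion facilities used above were deprecated in C++17. On Windows, one alternative is to call the Win32 API directly. Here is a minimal sketch of the UTF-16 to UTF-8 direction (the function name is mine, and error handling is omitted):

#include <Windows.h>
#include <string>

std::string utf16_to_utf8_win32(std::wstring const & utf16)
{
   if (utf16.empty()) return {};

   // first call computes the required UTF-8 buffer size in bytes
   int size = WideCharToMultiByte(CP_UTF8, 0, utf16.data(),
                                  static_cast<int>(utf16.size()),
                                  nullptr, 0, nullptr, nullptr);

   // second call performs the actual conversion
   std::string utf8(size, '\0');
   WideCharToMultiByte(CP_UTF8, 0, utf16.data(),
                       static_cast<int>(utf16.size()),
                       &utf8[0], size, nullptr, nullptr);

   return utf8;
}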


However, the Windows console does not currently support emojis (and, in general, Unicode characters from the Supplementary Multilingual Plane). That support will come in the Windows Terminal, which is currently in preview. You can read more about that here. Excited?



Of course, C++ is not the only programming language that allows emojis in identifiers. You can see examples from other languages here.
