We all know C++ is a powerful yet complex programming language. But it can also be fun. For instance, you can use emojis and other Unicode characters in the source code. Of course, nobody sane would use emojis for identifiers. But it’s possible, and you can have a little fun if you have some spare time.
Identifiers are sequences of digits, underscores, lowercase and uppercase Latin letters, and most Unicode characters. Identifiers are not allowed to begin with a digit; they must begin with an underscore, a Latin letter, or a non-digit Unicode character. For more information about the Unicode characters allowed in identifiers see this.
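To make those rules concrete, here is a quick sketch (the names are only illustrative, and how well a compiler accepts non-ASCII characters typed directly in the source varies):

int main()
{
   int _count = 0;   // OK: begins with an underscore
   int value2 = 0;   // OK: digits are allowed after the first character
   int café = 0;     // OK: a non-digit Unicode character, if the compiler accepts it in source
   // int 2fast = 0; // error: an identifier cannot begin with a digit
}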
So here is a little example of a C++ program that uses emojis. Can you figure out what it does?
#include <iostream>

using π΄ = int;
using π· = double;

struct π
{
   π΄ π (π΄ πΊ, π΄ π₯©)
   {
      return πΊ + π₯©;
   }
};

int main()
{
   π π¨;
   std::cout << π¨.π(1, 2) << '\n';

   constexpr π· π₯§ = 3.1415927;
   π· r = 5;
   π· π΅ = π * π₯§ * r;

   std::cout << π΅ << '\n';
}
This is a screenshot of the program from Visual Studio:

Can we do better (or maybe worse)? Of course we can. We can do it Egyptian style, writing C++ in hieroglyphs.
πππ <ππππ π¦> πππ£ ππ½π‘(π¦ π π π² π , π¦ π π π² π )
π
   πΉπ¬π» π π π ;
π

πππ <ππππ π¦>
ππππ π£ππ
π
π ππ«:
   πππ <ππππ π>
   πππ£ πππΉπ (π&& π, π¦ π π π² π , π¦ π π π² π )
   π
      πΉπ¬π» π(π , π );
   π
π;

π‘π½π© π©π’πͺ()
π
   π£ππ<π‘π½π©> π ππ¦;
   π‘π½π© π π π ππ¦.πππΉπ (ππ½π‘<π‘π½π©>, 1, 2);
   πππ π π π ππ ;
π
In case that’s too small to read, perhaps this screenshot is a bit better:

That’s not so far off from this:
(Source)
If you want to learn more about the Unicode block of Egyptian hieroglyphs, see Unicode Block “Egyptian Hieroglyphs” or Egyptian Hieroglyph.
This program prints 3 to the console, although that is harder to figure out because I cheated and didn’t show the whole program. This is the missing part:
#include <iostream>

#define π {
#define π }
#define π <<
#define π >>
#define πΌ *
#define π +
#define π =
#define πππ std::cout
#define πππ std::cin
#define π‘π½π© int
#define π½πππ° char
#define πππ template
#define ππππ class
#define π ππ« public
#define π π π² const
#define πΉπ¬π» return
#define πππ£ auto
#define ππ '\n'
#define π©π’πͺ main
The equivalent readable code of the above example is as follows:
template <class T>
auto add(T const a, T const b)
{
   return a + b;
}

template <class T>
class foo
{
public:
   template <class F>
   auto compose(F&& f, T const a, T const b)
   {
      return f(a, b);
   }
};

int main()
{
   foo<int> f;
   int r = f.compose(add<int>, 1, 2);
   std::cout << r << '\n';
}
Visual Studio seems to have problems displaying some Unicode characters, although I didn’t figure out the pattern. Sometimes they are OK, sometimes they are not. However, the source code is correct and you can successfully build and run a program written in Unicode.

You do have to save the source code using an encoding that allows you to enter the Unicode characters that you want. To do that, go to File > Save As… > Save with Encoding and make the appropriate selection.

If you are using Windows 10, there is an emoji panel that you can use to type them. Just press Win + . or Win + ;.

For more information on this see How to type emoji on your PC using Windows 10 Fall Creators Update or Windows 10 Tip: Get started with the emoji keyboard shortcut.
But how do you display Unicode characters in the Windows console? You need to use a code page that supports them. Here is an example:
#include "windows.h" #include <iostream> #include <codecvt> std::string utf16_to_utf8(std::u16string utf16_string) { std::wstring_convert<std::codecvt_utf8_utf16<int16_t>, int16_t> convert; auto p = reinterpret_cast<const int16_t *>(utf16_string.data()); return convert.to_bytes(p, p + utf16_string.size()); } std::string utf32_to_utf8(std::u32string utf32_string) { std::wstring_convert<std::codecvt_utf8<int32_t>, int32_t> convert; auto p = reinterpret_cast<const int32_t *>(utf32_string.data()); return convert.to_bytes(p, p + utf32_string.size()); } int main() { if(IsValidCodePage(CP_UTF8)) SetConsoleOutputCP(CP_UTF8); std::cout << utf16_to_utf8(u"β β£β₯β¦") << '\n'; std::cout << utf32_to_utf8(U"β·βΏβΆβΎ") << '\n'; }
The output of this is as follows:

If you want to learn more about converting between Unicode strings see the following:
- C++ – Unicode conversions
- std::u32string conversion to/from std::string and std::u16string
- Visual Studio C++ 2015 std::codecvt with char16_t or char32_t
However, the Windows console does not currently support emojis (and, in general, Unicode characters from the Supplementary Multilingual Plane). That support will come in the Windows Terminal, which is currently in preview. You can read more about that here. Excited?
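Just to illustrate, here is a minimal sketch that writes a character from the Supplementary Multilingual Plane (U+1F600, the grinning face emoji) as raw UTF-8 bytes. It assumes a UTF-8 code page and a terminal that can actually render characters outside the Basic Multilingual Plane, such as the Windows Terminal; the legacy console will not display it:

#include "windows.h"
#include <iostream>

int main()
{
   // switch the console output to the UTF-8 code page
   SetConsoleOutputCP(CP_UTF8);

   // U+1F600 GRINNING FACE, written out as its UTF-8 byte sequence
   std::cout << "\xF0\x9F\x98\x80" << '\n';
}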
Of course, C++ is not the only programming language that allows emojis in identifiers. You can see examples from other languages here.