How does C++ get the Unicode value of an emoji?

For example, I have an emoji:

std::string emojiStr = "🌞";

Its Unicode code point is U+1F31E.

How can I use emojiStr to get that code point?

Mar.28,2022

If your source code is UTF-8 encoded (if it is not, switching to UTF-8 is recommended) and the emoji string you receive is also UTF-8, then the task is to decode the UTF-8 byte sequence into a Unicode code point.
You can do the decoding by hand; a sketch follows.

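Here is a minimal sketch that decodes the first code point of a UTF-8 string from its leading byte and continuation bytes. It assumes the source and execution character sets are UTF-8 (the default for GCC and Clang); firstCodePoint is just an illustrative helper name, not a standard function:

#include <cstdio>
#include <string>

// Decode the first Unicode code point from a UTF-8 encoded string.
// Returns U+FFFD (the replacement character) on empty or invalid input.
static char32_t firstCodePoint(const std::string& s) {
    if (s.empty()) return 0xFFFD;
    const unsigned char b0 = static_cast<unsigned char>(s[0]);

    // The leading byte tells us the sequence length and carries the high bits.
    int len;
    char32_t cp;
    if      (b0 < 0x80)           { len = 1; cp = b0; }        // 0xxxxxxx: ASCII
    else if ((b0 & 0xE0) == 0xC0) { len = 2; cp = b0 & 0x1F; } // 110xxxxx
    else if ((b0 & 0xF0) == 0xE0) { len = 3; cp = b0 & 0x0F; } // 1110xxxx
    else if ((b0 & 0xF8) == 0xF0) { len = 4; cp = b0 & 0x07; } // 11110xxx
    else return 0xFFFD;                                        // invalid leading byte

    if (s.size() < static_cast<size_t>(len)) return 0xFFFD;

    // Each continuation byte (10xxxxxx) contributes 6 more bits.
    for (int i = 1; i < len; ++i) {
        const unsigned char b = static_cast<unsigned char>(s[i]);
        if ((b & 0xC0) != 0x80) return 0xFFFD; // not a continuation byte
        cp = (cp << 6) | (b & 0x3F);
    }
    return cp;
}

int main() {
    std::string emojiStr = "\U0001F31E"; // same as "🌞"
    std::printf("U+%04X\n", static_cast<unsigned>(firstCodePoint(emojiStr)));
    // prints: U+1F31E
}

For anything beyond a single code point (iterating over a whole string, rejecting overlong sequences and surrogates), a library such as ICU is the more robust choice.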