
Understanding the JavaScript String codePointAt() Method

Learn all about the codePointAt() method for JavaScript strings.

The codePointAt() method was introduced in ES2015 to handle Unicode characters that cannot be represented by a single 16-bit code unit and instead require a pair of them (a surrogate pair).

Unlike the charCodeAt() method, which only ever returns one 16-bit code unit at a time (leaving you to read both halves and combine them yourself), codePointAt() gives you the whole code point with a single call.

Let’s take the Chinese character “𠮷” as an example. In UTF-16 it is encoded as two code units (a surrogate pair):

"𠮷".charCodeAt(0).toString(16) // d842
"𠮷".charCodeAt(1).toString(16) // dfb7

If you combine these two code units in a string literal, you get the character back:

"\ud842\udfb7" // "𠮷"

The codePointAt() method returns the full code point in a single call:

"𠮷".codePointAt(0) // 134071
"𠮷".codePointAt(0).toString(16) // "20bb7"

And you can write the character directly with a single code point escape, another ES2015 addition:

"\u{20bb7}" // "𠮷"

For more information about Unicode and working with it in JavaScript, check out our article on Unicode and UTF-8.

tags: ["JavaScript", "string methods", "Unicode", "codePointAt()"]