JavaScript String charCodeAt()

The JavaScript String charCodeAt() method returns an integer between 0 and 65535 representing the UTF-16 code unit at the given index.

The syntax of the charCodeAt() method is:

str.charCodeAt(index)

Here, str is a string.


charCodeAt() Parameters

The charCodeAt() method takes in:

  • index - An integer between 0 and str.length - 1. If index cannot be converted to an integer or is not provided, the default value 0 is used (see the sketch below).
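
A minimal sketch of this index coercion (the str variable is just an illustrative name):

let str = "JavaScript";

// a numeric string is converted to an integer index
console.log(str.charCodeAt("1")); // 97 ('a')

// values that cannot be converted to an integer default to index 0
console.log(str.charCodeAt(undefined)); // 74 ('J')
console.log(str.charCodeAt(NaN)); // 74 ('J')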

Return value from charCodeAt()

  • Returns a number representing the UTF-16 code unit value of the character at the given index.

Notes:

  • charCodeAt() returns NaN if index is negative or greater than or equal to str.length.
  • If a Unicode code point cannot be represented in a single UTF-16 code unit (code points greater than 0xFFFF), charCodeAt() returns only the first code unit of the surrogate pair that encodes it. For the entire code point value, use codePointAt() (see the sketch below).
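
A minimal sketch of this behavior, using the emoji "😀" (U+1F600), which lies outside the single-code-unit range:

let emoji = "😀";

// the emoji occupies two UTF-16 code units
console.log(emoji.length); // 2

// charCodeAt() sees only one code unit of the surrogate pair at a time
console.log(emoji.charCodeAt(0)); // 55357 (0xD83D, high surrogate)
console.log(emoji.charCodeAt(1)); // 56832 (0xDE00, low surrogate)

// codePointAt() returns the entire code point
console.log(emoji.codePointAt(0)); // 128512 (0x1F600)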

Example: Using charCodeAt() method

let sentence = "Happy Birthday to you!";

let unicode1 = sentence.charCodeAt(2);
console.log(`Unicode of '${sentence.charAt(2)}': ${unicode1}`); // 112

let unicode2 = sentence.charCodeAt(sentence.length - 1);
console.log(
  `Unicode of '${sentence.charAt(sentence.length - 1)}': ${unicode2}`
); // 33

// index defaults to 0 when it cannot be converted to an integer
let unicode3 = sentence.charCodeAt("string");
console.log(`Unicode of '${sentence.charAt(0)}': ${unicode3}`); // 72

// returns NaN for negative or out of range indices
let unicode4 = sentence.charCodeAt(-2);
console.log(`Unicode of '${sentence.charAt(-2)}': ${unicode4}`); // NaN

Output

Unicode of 'p': 112
Unicode of '!': 33
Unicode of 'H': 72
Unicode of '': NaN
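
As a quick follow-up sketch, String.fromCharCode() performs the reverse conversion, turning a UTF-16 code unit back into a character:

let code = "A".charCodeAt(0);
console.log(code); // 65

// String.fromCharCode() maps the code unit back to a character
console.log(String.fromCharCode(code)); // 'A'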

Recommended Reading: JavaScript String fromCharCode()