JavaScript String fromCharCode()

The JavaScript String fromCharCode() method returns a string created from the specified sequence of UTF-16 code units.

The syntax of the fromCharCode() method is:

String.fromCharCode(num1, ..., numN)

The fromCharCode() method, being a static method, is called using the String class name.
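A quick sketch of what "static" means here: the method lives on the String class itself, not on string instances.

```javascript
// fromCharCode() is called on the String class itself
console.log(String.fromCharCode(72, 105)); // Hi

// it is not available on string instances
console.log(typeof "abc".fromCharCode); // undefined
```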


fromCharCode() Parameters

The fromCharCode() method takes in:

  • num1, ..., numN - A sequence of UTF-16 code units (numbers between 0 and 65535). Numbers greater than 65535 (0xFFFF) are truncated.

Return value from fromCharCode()

  • Returns a string of length N consisting of the N specified UTF-16 code units.

Note: The fromCharCode() method returns a string and not a String object.


Example: Using fromCharCode() method

// most common characters can be represented by a single 16-bit code unit
let string1 = String.fromCharCode(65, 66, 67);
console.log(string1); // ABC

// numbers can be passed as a hexadecimal value
let string2 = String.fromCharCode(0x2014);
console.log(string2); // —

// numbers greater than 65535 are truncated to their lowest 16 bits,
// so 0x12014 is equivalent to 0x2014
let string3 = String.fromCharCode(0x12014);
console.log(string3); // —

Output

ABC
—
—

However, supplementary characters in UTF-16 require two code units.

String.fromCharCode(0xD83C, 0xDF03); // Code Point U+1F303 "Night with
String.fromCharCode(55356, 57091);   // Stars" == "\uD83C\uDF03"
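When working with supplementary characters, the ES2015 method String.fromCodePoint() is often more convenient: it accepts the code point itself, so you don't have to compute the surrogate pair by hand.

```javascript
// fromCharCode() needs the surrogate pair spelled out
let night1 = String.fromCharCode(0xD83C, 0xDF03);

// fromCodePoint() (ES2015) takes the code point U+1F303 directly
let night2 = String.fromCodePoint(0x1F303);

console.log(night1 === night2); // true
```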

To look up the codes for common characters, visit ASCII Table; ASCII covers only the first 128 code units of UTF-16.
