The charCodeAt() method returns the Unicode value of the character at the specified index in a string. The index of the first character is 0, the second character is 1, and so on.
JavaScript String charCodeAt()
JavaScript string charCodeAt() is a built-in function that returns the Unicode value of the character at a given index in a string.
Syntax
string.charCodeAt(index)
Parameters
The index parameter is an integer representing the position of the character whose code you want to return. It is optional; if you omit it, it defaults to 0.
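A small sketch of the default behavior, using a hypothetical string:

```javascript
// If no index is passed, charCodeAt() defaults to index 0.
let greeting = "Hi";

console.log(greeting.charCodeAt());  // 72, the code of "H"
console.log(greeting.charCodeAt(1)); // 105, the code of "i"
```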
Return value
The charCodeAt() method returns NaN if there is no character at the specified index, that is, if the index is negative or greater than or equal to the string's length.
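For instance, both a negative index and an index past the end of the string produce NaN:

```javascript
let word = "JS";

console.log(word.charCodeAt(-1)); // NaN: negative index
console.log(word.charCodeAt(5));  // NaN: index beyond the last character
```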
Example
See the following example of the JavaScript String charCodeAt() method.
// app.js
let str = "HELLO KRUNAL";
let op = str.charCodeAt(str.length - 1);
console.log(op);
The output is the following.

76

The last character of the string is "L", whose Unicode value is 76.
The String charCodeAt() method returns an integer between 0 and 65535 representing the UTF-16 code unit at the given index.

The UTF-16 code unit matches a Unicode code point exactly for code points that fit in a single UTF-16 code unit.

If a Unicode code point cannot be represented in a single UTF-16 code unit (because its value is greater than 0xFFFF), then the code unit returned is the first part of a surrogate pair for that code point.

Note that String charCodeAt() always returns a value less than 65536. This is because higher code points are represented by a pair of (lower valued) "surrogate" code units that together make up one real character.
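You can see this with a character outside the Basic Multilingual Plane, such as the emoji "😀" (U+1F600). Its single code point is stored as two UTF-16 code units, so charCodeAt() returns each surrogate half, while codePointAt() returns the full code point:

```javascript
let emoji = "😀"; // U+1F600, a code point above 0xFFFF

console.log(emoji.length);         // 2: stored as two UTF-16 code units
console.log(emoji.charCodeAt(0));  // 55357 (0xD83D): the high surrogate
console.log(emoji.charCodeAt(1));  // 56832 (0xDE00): the low surrogate
console.log(emoji.codePointAt(0)); // 128512 (0x1F600): the full code point
```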
More Examples
// app.js
let data = "KRUNAL".charCodeAt("KRUNAL".length);
console.log(data);
In the above code, the output is NaN, because the index equals the string's length, which is one past the last character (the last valid index is str.length - 1).
See the below code.
// app.js
let op = "KRUNAL".charCodeAt(0);
console.log(op);
See the output.

75

The character "K" has the Unicode value 75.
We have already seen the String charAt() function on this blog.
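The difference between the two is that charAt() returns the character itself, while charCodeAt() returns its numeric code. A quick comparison:

```javascript
let str = "KRUNAL";

console.log(str.charAt(0));     // "K": the character itself
console.log(str.charCodeAt(0)); // 75: its UTF-16 code unit value
```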
That’s it.