
# unicorn/prefer-code-point

Category: Pedantic

## What it does

Prefers `String.prototype.codePointAt()` over `String.prototype.charCodeAt()`, and `String.fromCodePoint()` over `String.fromCharCode()`.

## Why is this bad?

`String.prototype.charCodeAt()` and `String.fromCharCode()` operate on UTF-16 code units, so any character outside the Basic Multilingual Plane (most emoji, for example) is split across a surrogate pair and cannot be handled as a single unit. `String.prototype.codePointAt()` and `String.fromCodePoint()` operate on full Unicode code points and handle such characters correctly.
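To see the failure mode concretely (return values shown in comments):

```javascript
// "🦄" is U+1F984, stored in UTF-16 as the surrogate pair 0xD83E 0xDD84.
"🦄".charCodeAt(0); // 0xd83e — just the high surrogate, not a character by itself
"🦄".codePointAt(0); // 0x1f984 — the full code point

// fromCharCode() truncates its argument to 16 bits (0x1f984 & 0xffff === 0xf984),
// so it produces an unrelated character instead of the emoji.
String.fromCharCode(0x1f984); // "\uf984"
String.fromCodePoint(0x1f984); // "🦄"
```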

## Examples

Examples of incorrect code for this rule:

```javascript
"🦄".charCodeAt(0);
String.fromCharCode(0x1f984);
```

Examples of correct code for this rule:

```javascript
"🦄".codePointAt(0);
String.fromCodePoint(0x1f984);
```
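The distinction also matters when scanning a whole string, since astral characters occupy two UTF-16 indices. As a sketch (the `codePoints` helper below is hypothetical, not part of the rule), `for...of` iterates by code point and pairs naturally with `codePointAt()`:

```javascript
// Hypothetical helper: collect every Unicode code point in a string.
// for...of walks the string by code point, so surrogate pairs stay intact.
function codePoints(text) {
  const points = [];
  for (const ch of text) {
    points.push(ch.codePointAt(0));
  }
  return points;
}

codePoints("a🦄"); // [0x61, 0x1f984]
```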

## How to use

To enable this rule via the config file or the CLI, use one of the following:

```json
{
  "rules": {
    "unicorn/prefer-code-point": "error"
  }
}
```

```bash
oxlint --deny unicorn/prefer-code-point
```
