Moo.js Tokenizer with Nearley.js

Published on Feb 10th, 2020 | Duration: 28:23 | Watch on YouTube

In this video I’ll introduce how to use the Moo.js lexer/tokenizer with Nearley.js. A lexer is a preprocessing step that breaks the input string into tokens before it reaches the parser. This improves the parser’s performance, and in the case of Moo.js it also lets the parser report errors with line number information.
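A minimal sketch of the wiring might look like the grammar below; the token names and rules (number, word, and so on) are illustrative assumptions rather than the grammar used in the video.

```
@{%
// Define the tokens with Moo; these particular token rules are illustrative assumptions.
const moo = require("moo");

const lexer = moo.compile({
  ws:      /[ \t]+/,
  number:  /[0-9]+/,
  word:    /[a-zA-Z]+/,
  newline: { match: /\n/, lineBreaks: true },  // lineBreaks keeps line numbers accurate
});
%}

# Tell Nearley to pull tokens from the Moo lexer instead of scanning raw characters.
@lexer lexer

# Rules now match token types (written %name) instead of literal strings.
main -> %word %ws %number {% ([w, , n]) => [w.value, Number(n.value)] %}
```

The grammar is compiled with nearleyc as usual; because each Moo token carries line and column information, Nearley can include that location in its error messages when parsing fails.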

Transcript

The following transcript was automatically generated by an algorithm.