Commit b007640

Add lexer as new module for tokenizing input
This commit introduces a new lexer module to tokenize input. The lexer
recognizes symbols, keywords, literals, and punctuation using pluggable
recognizers. It supports UTF-8 input and reports errors with precise
position tracking. The module is tested across a range of scenarios and
edge cases to ensure robustness.
1 parent b1fd711 commit b007640
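The diff itself is not shown here, but the design the message describes can be sketched. The following is a minimal illustration, not the committed code: the recognizer table, `Token` shape, and `LexError` type are hypothetical names chosen for the example. It shows a lexer driven by pluggable recognizers (ordered regex matchers) with line/column tracking and an error that carries the offending position.

```python
import re
from dataclasses import dataclass

@dataclass
class Token:
    kind: str
    text: str
    line: int
    col: int

# Hypothetical recognizer table: each entry pairs a token kind with a
# pattern. New token classes are added by appending entries; order
# matters (keywords are tried before general identifiers).
RECOGNIZERS = [
    ("WS",      re.compile(r"[ \t]+")),
    ("NEWLINE", re.compile(r"\n")),
    ("KEYWORD", re.compile(r"\b(let|if|else|while)\b")),
    ("NUMBER",  re.compile(r"\d+(\.\d+)?")),
    ("IDENT",   re.compile(r"[A-Za-z_][A-Za-z0-9_]*")),
    ("PUNCT",   re.compile(r"[(){};,=+\-*/<>]")),
]

class LexError(Exception):
    """Lexing error carrying the exact source position."""
    def __init__(self, line: int, col: int, ch: str):
        super().__init__(f"unexpected character {ch!r} at {line}:{col}")
        self.line, self.col = line, col

def tokenize(source: str) -> list[Token]:
    pos, line, col = 0, 1, 1
    tokens: list[Token] = []
    while pos < len(source):
        for kind, rx in RECOGNIZERS:
            m = rx.match(source, pos)
            if m:
                text = m.group(0)
                if kind not in ("WS", "NEWLINE"):
                    tokens.append(Token(kind, text, line, col))
                # Update position tracking, accounting for newlines.
                newlines = text.count("\n")
                if newlines:
                    line += newlines
                    col = len(text) - text.rfind("\n")
                else:
                    col += len(text)
                pos = m.end()
                break
        else:
            raise LexError(line, col, source[pos])
    return tokens
```

Because Python strings are Unicode, `tokenize` already accepts UTF-8-decoded input; a real module would extend the identifier pattern (and recognizer set) accordingly.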

File tree

2 files changed: +1529 −0 lines changed


0 commit comments
