Commit b007640
Add lexer as new module for tokenizing input
This commit introduces a new lexer module to tokenize input. The lexer
efficiently recognizes symbols, keywords, literals, and punctuation using
pluggable recognizers. It provides UTF-8 support and error reporting with
precise position tracking. The module is tested for various scenarios and edge
cases to ensure robustness.
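
The commit message does not show the implementation, but a minimal sketch of the pluggable-recognizer idea — a lexer that tries a list of recognizers in order, tracks UTF-8-aware line/column positions, and reports errors with that position — might look like the following. All names here (`Recognizer`, `Token`, `Lex`, `advance`) are hypothetical illustrations, not the module's actual API, and Go is an assumed language choice.

```go
// Hypothetical sketch of a pluggable-recognizer lexer with UTF-8-aware
// position tracking. Names and structure are illustrative only.
package main

import (
	"fmt"
	"strings"
	"unicode"
	"unicode/utf8"
)

// Pos tracks a byte offset plus 1-based line/column for error reporting.
type Pos struct{ Offset, Line, Col int }

type Token struct {
	Kind  string
	Text  string
	Start Pos
}

// Recognizer attempts to match one token at the start of src. New token
// kinds are "plugged in" by appending another Recognizer to the list.
type Recognizer struct {
	Kind  string
	Match func(src string) (text string, ok bool)
}

// matchWhile consumes a run of runes satisfying pred (UTF-8 aware).
func matchWhile(pred func(rune) bool) func(string) (string, bool) {
	return func(src string) (string, bool) {
		n := 0
		for n < len(src) {
			r, size := utf8.DecodeRuneInString(src[n:])
			if !pred(r) {
				break
			}
			n += size
		}
		return src[:n], n > 0
	}
}

// Lex applies the recognizers in order at each position; the first match
// wins. Unmatched input produces an error with the precise position.
func Lex(src string, recs []Recognizer) ([]Token, error) {
	var toks []Token
	pos := Pos{Line: 1, Col: 1}
	for pos.Offset < len(src) {
		rest := src[pos.Offset:]
		matched := false
		for _, rec := range recs {
			if text, ok := rec.Match(rest); ok {
				if rec.Kind != "space" { // whitespace is consumed but not emitted
					toks = append(toks, Token{rec.Kind, text, pos})
				}
				pos = advance(pos, text)
				matched = true
				break
			}
		}
		if !matched {
			r, _ := utf8.DecodeRuneInString(rest)
			return toks, fmt.Errorf("line %d, col %d: unexpected %q", pos.Line, pos.Col, r)
		}
	}
	return toks, nil
}

// advance updates offset, line, and column across the consumed text.
func advance(p Pos, text string) Pos {
	for _, r := range text {
		p.Offset += utf8.RuneLen(r)
		if r == '\n' {
			p.Line++
			p.Col = 1
		} else {
			p.Col++
		}
	}
	return p
}

func main() {
	recs := []Recognizer{
		{"space", matchWhile(unicode.IsSpace)},
		{"number", matchWhile(unicode.IsDigit)},
		{"ident", matchWhile(func(r rune) bool { return unicode.IsLetter(r) || r == '_' })},
		{"punct", func(s string) (string, bool) {
			if strings.ContainsRune("(){};,+-*/=", rune(s[0])) {
				return s[:1], true
			}
			return "", false
		}},
	}
	toks, err := Lex("let x = 42;\nπ = 3", recs) // non-ASCII ident exercises UTF-8
	fmt.Println(toks, err)
}
```

In this shape, a keyword recognizer would simply be another entry placed before the identifier matcher (or a post-pass that checks identifier text against a keyword set), which is one plausible reading of what "pluggable recognizers" means here.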