This project is a basic implementation of a lexer and a Read-Eval-Print Loop (REPL) in Go. The lexer is designed to tokenize an input string based on a simple set of rules defined in the token
package.
- Lexer: Tokenizes input strings into a series of tokens.
- REPL: Allows interactive input of expressions and outputs the tokens generated by the lexer.
- lexer/
  - Lexer: Handles the tokenization process (see the usage sketch after this list).
  - New(input string) *Lexer: Constructor function that initializes a new Lexer.
  - readChar(): Reads the next character and advances the position.
  - NextToken() token.Token: Returns the next token in the input.
  - readIdentifier(): Reads identifiers (variables, function names, etc.).
  - skipWhitespace(): Skips whitespace in the input.
  - peekChar() byte: Peeks ahead at the next character without advancing.
  - readNumber() string: Reads numeric literals.
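A minimal sketch of driving the lexer directly, based on the signatures listed above. The module path `your-repo-name`, the `token.EOF` sentinel, and the `Type`/`Literal` field names are assumptions for illustration, not taken from this README:

```go
package main

import (
	"fmt"

	"your-repo-name/lexer" // assumed module path
	"your-repo-name/token"
)

func main() {
	input := "let five = 5;"

	// New initializes a lexer positioned at the start of the input.
	l := lexer.New(input)

	// Call NextToken repeatedly until the lexer signals end of input.
	// token.EOF is an assumed sentinel; this README does not list the
	// concrete TokenType constants.
	for tok := l.NextToken(); tok.Type != token.EOF; tok = l.NextToken() {
		fmt.Printf("{Type: %s, Literal: %q}\n", tok.Type, tok.Literal)
	}
}
```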
- repl/
  - Start(in io.Reader, out io.Writer): The main REPL function that reads input, tokenizes it using the lexer, and prints the resulting tokens (see the sketch after this list).
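Because Start takes an io.Reader and an io.Writer, wiring it to standard input and output from a main package is straightforward. A sketch, again assuming the module path `your-repo-name`:

```go
package main

import (
	"fmt"
	"os"

	"your-repo-name/repl" // assumed module path
)

func main() {
	// Greet the user, then hand control to the REPL, which reads lines
	// from stdin and writes the tokens it lexes to stdout.
	fmt.Println("Type an expression and press Enter to see its tokens.")
	repl.Start(os.Stdin, os.Stdout)
}
```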
- token/
  - TokenType: String alias representing the type of a token.
  - Token: Struct containing the type and literal value of a token.
  - LookupIdent(ident string) TokenType: Returns the token type for a given identifier (see the sketch after this list).
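A sketch of how these pieces typically fit together. The field names and the IDENT/LET constants are illustrative assumptions; the project's own token package defines the real set:

```go
package token

// TokenType is a string alias so token types compare and print easily.
type TokenType string

// Token carries a token's type and the literal text it was read from.
type Token struct {
	Type    TokenType
	Literal string
}

// IDENT and LET are illustrative constants only.
const (
	IDENT TokenType = "IDENT"
	LET   TokenType = "LET"
)

// keywords maps reserved words to their token types.
var keywords = map[string]TokenType{
	"let": LET,
}

// LookupIdent returns the keyword's TokenType when ident is a reserved
// word, and IDENT otherwise.
func LookupIdent(ident string) TokenType {
	if tok, ok := keywords[ident]; ok {
		return tok
	}
	return IDENT
}
```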
- Clone the repository:

  ```sh
  git clone https://github.com/yourusername/your-repo-name.git
  ```
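After cloning, the REPL can be started with the Go toolchain. The README does not state where the entry point lives, so the commands below assume a `main` package at the repository root:

```sh
cd your-repo-name
go run .
```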