This project combines a lexical analyzer and a syntax analyzer (parser): it scans input text, produces a stream of tokens, and parses those tokens according to a defined grammar. It is designed to serve as the front end of a compiler.
- Tokenizes input text
- Supports various token types (identifiers, keywords, literals, operators, etc.)
- Parses tokens based on a defined grammar
- Provides detailed error messages for invalid tokens and syntax errors (a sketch of this pipeline follows the list)
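
The actual interfaces live in `function.py` and are not documented in this README, so the sketch below is only an illustration of the pipeline described above: a regular-expression tokenizer that emits typed tokens and reports invalid input, followed by a small recursive-descent parser for a toy grammar. Every name in it (`Token`, `tokenize`, `Parser`, the token types, and the grammar itself) is hypothetical and will differ from the real implementation.

```python
# Hypothetical sketch only: Token, tokenize, and Parser are illustrative names,
# not the actual API exposed by function.py.
import re
from collections import namedtuple

Token = namedtuple("Token", ["type", "value", "pos"])

# Token specification: order matters, earlier patterns win.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),           # integer literals
    ("IDENT",  r"[A-Za-z_]\w*"),  # identifiers and keywords
    ("OP",     r"[+\-*/=()]"),    # operators and parentheses
    ("SKIP",   r"\s+"),           # whitespace, discarded
]
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pattern})" for name, pattern in TOKEN_SPEC))


def tokenize(text):
    """Yield Token objects; raise ValueError with a position for invalid input."""
    pos = 0
    while pos < len(text):
        match = MASTER_RE.match(text, pos)
        if match is None:
            raise ValueError(f"Invalid token at position {pos}: {text[pos]!r}")
        if match.lastgroup != "SKIP":
            yield Token(match.lastgroup, match.group(), pos)
        pos = match.end()


class Parser:
    """Recursive-descent parser for a toy grammar:
    expr -> operand (OP operand)*   (left-associative, no precedence)."""

    def __init__(self, tokens):
        self.tokens = list(tokens)
        self.i = 0

    def peek(self):
        return self.tokens[self.i] if self.i < len(self.tokens) else None

    def parse_expr(self):
        node = self.parse_operand()
        while self.peek() is not None and self.peek().type == "OP":
            op = self.tokens[self.i]
            self.i += 1
            node = (op.value, node, self.parse_operand())
        return node

    def parse_operand(self):
        tok = self.peek()
        if tok is not None and tok.type in ("NUMBER", "IDENT"):
            self.i += 1
            return tok.value
        raise SyntaxError(f"Expected a number or identifier, got {tok!r}")


if __name__ == "__main__":
    ast = Parser(tokenize("1 + 2 * x")).parse_expr()
    print(ast)  # ('*', ('+', '1', '2'), 'x')
```

In this sketch, a malformed input such as `1 + $` raises the tokenizer's `ValueError`, and an incomplete expression such as `1 +` raises the parser's `SyntaxError`, mirroring the error reporting listed above.
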
To install the necessary dependencies, run `pip install -r requirements.txt`.
To run the lexical and syntax analyzer, execute `python function.py`.