This is a command-line tool to analyze and classify software requirements using a local LLM served via Ollama. It supports both structured files and free-text input, and classifies requirements into categories like:
- Functional
- Non-Functional
- Performance
- Design
- Maintenance
- Ambiguous, Incomplete, etc.
Ideal for embedded systems, product specs, and agile teams that want AI-powered requirements triage 🔍⚡
- 📂 Input via `.txt`, `.pdf`, or raw string
- 📤 Output to:
  - Console
  - JSON
- 🧠 Powered by local LLMs via `ollama`
- 💡 Built-in support for models like `codellama:instruct`
- 🐳 Docker-ready: prebuilt image with the model preloaded
- 🧪 Extensible and testable Python code
For developers who want to run the CLI directly:

```bash
git clone https://github.com/yourusername/software-requirements-classifier.git
cd software-requirements-classifier
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
pip install .
```

```bash
reqclass --input "The system must respond in under 2 seconds." --console
reqclass --input path/to/reqs.txt --output report.pdf --output-json results.json
```
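Once you have a `results.json`, a quick per-category tally is often useful. The sketch below assumes the JSON report is a list of objects with `requirement` and `category` keys — a hypothetical schema for illustration only; check the actual file your version of the tool produces before relying on it.

```python
import json
from collections import Counter
from pathlib import Path

def summarize(records: list[dict]) -> Counter:
    """Count how many classified requirements fall into each category."""
    return Counter(r["category"] for r in records)

# To load a real report (assumed schema):
# records = json.loads(Path("results.json").read_text())

# Inline example with the assumed schema:
sample = [
    {"requirement": "The system must respond in under 2 seconds.",
     "category": "Performance"},
    {"requirement": "The thermostat should log humidity every 10 minutes.",
     "category": "Functional"},
]
counts = summarize(sample)  # e.g. Counter({'Performance': 1, 'Functional': 1})
```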
This project uses a custom `ollama` Docker image with the LLM model preloaded.

```bash
docker compose build
```

Note: this step is optional. If you want to push the image, change `gajesh` to your Docker Hub username in the `docker-compose.yml` file.

Or pull from Docker Hub (if you did not build the image locally):

```bash
docker pull gajesh/ollama-requirements-classifier
```

```bash
docker compose up -d
```

This exposes the LLM at `http://localhost:11434`.
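Because the model is served over Ollama's standard REST API, you can also query the endpoint directly — handy for sanity-checking the container before running `reqclass`. This is a minimal sketch using Ollama's documented `/api/generate` endpoint; the classification prompt wording here is illustrative, not the tool's actual internal prompt.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

def build_payload(requirement: str, model: str = "codellama:instruct") -> dict:
    """Build a request body for Ollama's /api/generate endpoint."""
    prompt = (
        "Classify the following software requirement as Functional, "
        "Non-Functional, Performance, Design, Maintenance, Ambiguous, "
        f"or Incomplete:\n\n{requirement}"
    )
    # stream=False asks Ollama for one complete JSON response
    return {"model": model, "prompt": prompt, "stream": False}

def classify(requirement: str) -> str:
    """Send one requirement to the local LLM and return its raw reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(requirement)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Requires the container to be up:
# print(classify("The system must respond in under 2 seconds."))
```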
```bash
reqclass --input tests/test_0_thermostat.txt --output report.pdf --output-json results.json
reqclass --input "The thermostat should log humidity every 10 minutes." --console
```
- 🧠 This tool relies on open-source LLMs like `codellama:instruct`
- LLM output may sometimes be:
  - Vague
  - Over-generalized
  - Sensitive to phrasing
- Strict output formatting is enforced, but not 100% guaranteed (due to model limitations)
- Does not currently support batch processing of multiple documents
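Until batch processing lands, you can work around it by invoking `reqclass` once per file. This sketch builds one invocation per `.txt` file in a directory, using only the `--input` and `--output-json` flags shown above; the directory names are placeholders.

```python
import subprocess
from pathlib import Path

def batch_commands(input_dir: str, out_dir: str) -> list[list[str]]:
    """Build one reqclass invocation per .txt file (no batch mode yet)."""
    cmds = []
    for path in sorted(Path(input_dir).glob("*.txt")):
        json_out = Path(out_dir) / f"{path.stem}.json"
        cmds.append(["reqclass", "--input", str(path),
                     "--output-json", str(json_out)])
    return cmds

# Run them (requires reqclass on PATH):
# for cmd in batch_commands("specs/", "reports/"):
#     subprocess.run(cmd, check=True)
```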
Made with ❤️ by Gajesh Bhat
MIT License — free to use, fork, and build on.