Lightweight CLI assistant powered by DeepInfra's LLaMA-3.2-90B-Vision-Instruct model.
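Under the hood, each question becomes a chat-completion request. The sketch below shows roughly how such a request body could be built, assuming DeepInfra's OpenAI-compatible chat endpoint; the URL, model id, and function names here are illustrative, not the project's actual code.

```python
import json

# Assumed endpoint/model id for DeepInfra's OpenAI-compatible API (illustrative).
API_URL = "https://api.deepinfra.com/v1/openai/chat/completions"
MODEL = "meta-llama/Llama-3.2-90B-Vision-Instruct"

def build_payload(history, user_query):
    """Append the user's question to the running conversation and
    return the JSON body for a chat-completion request."""
    messages = history + [{"role": "user", "content": user_query}]
    return {"model": MODEL, "messages": messages}

history = [{"role": "system", "content": "Be a helpful assistant"}]
payload = build_payload(history, "What is a monad in functional programming?")
print(json.dumps(payload, indent=2))
```

The actual request would POST this payload (plus an API key header) with `requests`.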
Ask questions in the terminal like this:
```bash
llama "What is a monad in functional programming?"
```
Or enter an interactive REPL:
```bash
llama
```
To install, clone the repository and set it up:
```bash
git clone https://github.com/notsopreety/llama.git
cd llama
pip install -r requirements.txt
chmod +x llama.py
mv llama.py /data/data/com.termux/files/usr/bin/llama  # for Termux
# OR for Linux/macOS
sudo mv llama.py /usr/local/bin/llama
```
Now you can run it anywhere using:
```bash
llama "tell me a joke"
```
| Command | Description |
|---|---|
| `llama` | Start interactive REPL |
| `llama <query>` | Ask a one-off question |
| `llama clear` | Clear current conversation (preserves system prompt) |
| `llama context <prompt>` | Set custom system prompt |
| `llama reset` | Clear history and reset to default system prompt |
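The subcommand handling above can be sketched as a small dispatcher. This is a hypothetical illustration of the behavior the table describes, not the project's actual implementation:

```python
import sys

def dispatch(argv):
    """Map CLI arguments to an (action, argument) pair matching the
    commands table: no args -> REPL, known subcommands -> their action,
    anything else -> a one-off question."""
    if not argv:
        return ("repl", None)
    if argv[0] == "clear":
        return ("clear", None)
    if argv[0] == "reset":
        return ("reset", None)
    if argv[0] == "context":
        return ("context", " ".join(argv[1:]))
    return ("ask", " ".join(argv))

print(dispatch(sys.argv[1:]))
```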
If you're using Termux, follow these additional steps:
```bash
pkg install git python
pip install -r requirements.txt
chmod +x llama.py
mv llama.py /data/data/com.termux/files/usr/bin/llama
```
Now use `llama` anywhere from your Termux shell!
The only dependency is `requests`. The default system prompt is "Be a helpful assistant"; use `llama context "<your custom prompt>"` to customize it.
History is saved at `~/.llama_conversation.json`. You can inspect, edit, or delete it if needed.
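Inspecting the history file can be sketched as below. The exact schema is an assumption (a JSON list of `{"role", "content"}` messages); the demo writes a temporary file rather than touching your real `~/.llama_conversation.json`.

```python
import json
import os
import tempfile

def load_history(path):
    """Load the saved conversation, or return an empty list if none exists."""
    if not os.path.exists(path):
        return []
    with open(path) as f:
        return json.load(f)

# Demo with a temporary file standing in for the real history file.
sample = [{"role": "system", "content": "Be a helpful assistant"},
          {"role": "user", "content": "tell me a joke"}]
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(sample, f)
    path = f.name

for msg in load_history(path):
    print(f"{msg['role']}: {msg['content']}")
os.remove(path)
```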