
sai

A simple AI interface for chatting with your Ollama models from the terminal.

Features

  • Pretty-prints responses in real time as Markdown, using the rich library.
  • Keeps conversation context across messages.
  • Autodetects available models and lets you select one.
  • Supports custom prompts.
  • Supports custom roles (reusable prompts).
  • Preloads models to improve response time.
  • Persists conversations across sessions.
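Model preloading maps to a documented Ollama behavior: a generate request with an empty prompt loads the model into memory so the first real reply is faster. A minimal sketch of the idea, using only the standard library; the function name preload_model and the keep_alive value are illustrative, not sai's actual code:

```python
import json
import urllib.error
import urllib.request

def preload_model(model: str, base_url: str = "http://localhost:11434") -> bool:
    """Ask Ollama to load a model by sending an empty generate request.

    Returns True if the request succeeded, False if Ollama is unreachable.
    """
    payload = json.dumps({
        "model": model,
        "prompt": "",        # empty prompt: load the model, generate nothing
        "keep_alive": "5m",  # keep the model in memory for five minutes
    }).encode()
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=5):
            return True
    except (urllib.error.URLError, OSError):
        return False
```

With a model warmed up this way, the first chat turn skips the load delay.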

Requirements

A running Ollama instance is required to access local models. By default, sai connects to http://localhost:11434.
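Model autodetection against that URL can be done through Ollama's /api/tags endpoint, which lists the locally installed models. A minimal sketch with the standard library; the function name list_models is illustrative, not part of sai:

```python
import json
import urllib.error
import urllib.request

def list_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Return the names of locally installed Ollama models,
    or an empty list if Ollama is unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return []
```

An empty result is a quick way to tell the user that Ollama is not running before starting a chat.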

Install

You can install sai with any package manager you prefer, such as pip, but the recommended method is the uv tool.

Recommended

Using uv:

uv tool install sai-chat

Usage

Start it in your terminal by running the sai command:

luis@laptop:~ $ sai
╭───────────────────────────────────────────────────────╮
│ Welcome to Sai. Chat with your local LLM models.      │
│                                                       │
│ Available commands:                                   │
│                                                       │
│  • /setup : Setup Ollama URL and preferences          │
│  • /model : Select a model                            │
│  • /roles : List and select a role                    │
│  • /role add : Create a new custom role               │
│  • /role delete : Delete a custom role                │
│  • /help : Show this help message                     │
│  • /quit : Exit the application                       │
╰───────────────────────────────────────────────────────╯
> hi
╭───────────────────────────────── Virtual Assistant ✔ ─╮
│ Hi there! How can I help you today? 😊                │
╰────────────────────── gemma3:1b ──────────────────────╯
> 

Status

This project is under development. Feel free to contribute or provide feedback!