Installation Guide

Prerequisites

  • zsh 5.0+ (you probably already have this)
  • curl (already on macOS/Linux)
  • jq (optional, for more reliable JSON parsing)
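Before installing, you can confirm these tools are on your PATH. This is just a quick sanity check, not part of the official install:

    # Report which prerequisites are present; jq is optional.
    missing=0
    for cmd in zsh curl jq; do
      if command -v "$cmd" >/dev/null 2>&1; then
        echo "$cmd: found"
      else
        echo "$cmd: missing"
        missing=1
      fi
    done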

You will also need an API key for your chosen AI provider, or a local Ollama install — see Setup below.

Installation

Homebrew (Recommended)

  1. Tap and install the formula:

     brew tap matheusml/zsh-ai
     brew install zsh-ai

  2. Add this to your ~/.zshrc:

     source $(brew --prefix)/share/zsh-ai/zsh-ai.plugin.zsh

  3. Start a new terminal session.

Antigen

  1. Add the following to your .zshrc:

    antigen bundle matheusml/zsh-ai
  2. Start a new terminal session.

Oh My Zsh

  1. Clone it:

     git clone https://github.com/matheusml/zsh-ai ${ZSH_CUSTOM:-~/.oh-my-zsh/custom}/plugins/zsh-ai

  2. Add zsh-ai to your plugins list in ~/.zshrc:

     plugins=(
         # other plugins...
         zsh-ai
     )

  3. Start a new terminal session.

Manual Installation

  1. Clone it:

     git clone https://github.com/matheusml/zsh-ai ~/.zsh-ai

  2. Add it to your ~/.zshrc:

     echo "source ~/.zsh-ai/zsh-ai.plugin.zsh" >> ~/.zshrc

  3. Start a new terminal session.

Setup

Option 1: Anthropic Claude (default)

export ANTHROPIC_API_KEY="your-api-key-here"

Option 2: Google Gemini

export GEMINI_API_KEY="your-api-key-here"
export ZSH_AI_PROVIDER="gemini"

Option 3: OpenAI

export OPENAI_API_KEY="your-api-key-here"
export ZSH_AI_PROVIDER="openai"
# Optional: Change model (default is gpt-4o)
export ZSH_AI_OPENAI_MODEL="gpt-4o-mini"

Option 4: Ollama (local models)

# Run a local model (e.g., llama3.2)
ollama run llama3.2

# Configure zsh-ai to use Ollama
export ZSH_AI_PROVIDER="ollama"

Add the export lines for your provider to your ~/.zshrc to make them permanent.
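For example, to persist the provider setting (using the Ollama option here purely as an illustration — substitute the exports for your own provider and API key):

    # Append the provider setting to ~/.zshrc (honoring ZDOTDIR if set)
    rc="${ZDOTDIR:-$HOME}/.zshrc"
    printf 'export ZSH_AI_PROVIDER="ollama"\n' >> "$rc"

New terminal sessions will then pick up the setting automatically.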

Configuration

All configuration is done via environment variables with sensible defaults:

# Choose AI provider: "anthropic" (default), "gemini", "openai", or "ollama"
export ZSH_AI_PROVIDER="anthropic"

# Anthropic-specific settings
export ZSH_AI_ANTHROPIC_MODEL="claude-3-5-sonnet-20241022"  # (default)

# Gemini-specific settings
export ZSH_AI_GEMINI_MODEL="gemini-2.5-flash"  # (default)

# OpenAI-specific settings
export ZSH_AI_OPENAI_MODEL="gpt-4o"  # (default)

# Ollama-specific settings 
export ZSH_AI_OLLAMA_MODEL="llama3.2"  # (default)
export ZSH_AI_OLLAMA_URL="http://localhost:11434"  # (default)

That's it! Most users won't need to change anything.