
Installation

andershsueh edited this page Feb 10, 2026 · 1 revision

Installation Guide

This guide will help you install ALICE on your system.

Prerequisites

  • Node.js: Version 18.0.0 or higher
  • LLM Backend: One of the following:
    • LM Studio (Recommended for local use)
    • Ollama
    • OpenAI API account
    • Any OpenAI-compatible API
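You can confirm the Node.js requirement from a shell. The snippet below is a small sketch of the version check; the hard-coded version string stands in for the output of `node --version`:

```shell
# Check that a Node.js version string meets the 18.0.0 minimum.
# Replace the hard-coded value with: version=$(node --version)
version="v18.17.1"
major=${version#v}        # strip the leading "v"
major=${major%%.*}        # keep only the major number
if [ "$major" -ge 18 ]; then
  echo "Node.js version OK"
else
  echo "Node.js 18+ required, found $version"
fi
```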

Method 1: Download Pre-built Binary (Recommended)

Download

Go to the Releases page and download the appropriate version:

| Platform | File | Notes |
| --- | --- | --- |
| Windows x64 | alice-win-x64.zip | For 64-bit Windows |
| macOS Intel | alice-macos-x64.tar.gz | For Intel Macs |
| macOS Apple Silicon | alice-macos-arm64.tar.gz | For M1/M2/M3 Macs |
| Linux x64 | alice-linux-x64.tar.gz | For 64-bit Linux |

Installation Steps

Windows

# Extract the zip file
# Double-click alice.exe or run from PowerShell:
.\alice.exe

macOS / Linux

# Extract
tar -xzf alice-*.tar.gz

# Add execute permission (use the binary's exact filename if the
# alice-* glob would also match the downloaded archive)
chmod +x alice-*

# Optional: Move to system path
sudo mv alice-* /usr/local/bin/alice

# Run
alice
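After moving the binary, a quick sanity check confirms the shell can find it; this sketch prints a hint if the install directory is not on your PATH:

```shell
# Verify the alice binary is discoverable on PATH
if command -v alice > /dev/null 2>&1; then
  status="alice is on PATH: $(command -v alice)"
else
  status="alice not found - check that /usr/local/bin is on your PATH"
fi
echo "$status"
```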

Method 2: Install from Source

Step 1: Clone Repository

git clone https://github.com/AndersHsueh/Alice.git
cd Alice

Step 2: Install Dependencies

npm install

This will install all required packages including:

  • ink - CLI UI framework
  • axios - HTTP client for LLM APIs
  • ajv - JSON Schema validation for tools
  • glob - File searching
  • simple-git - Git operations

Step 3: Build

npm run build

This compiles TypeScript to JavaScript in the dist/ directory.

Step 4: Run

# Development mode (supports live editing)
npm run dev

# Production mode
npm start

# Or run directly
node dist/index.js

Method 3: Install via npm (Coming Soon)

npm install -g alice-cli
alice

⚠️ Note: npm package publication is planned for a future release.

Setting up LLM Backend

Option A: LM Studio (Recommended)

  1. Download LM Studio from https://lmstudio.ai/
  2. Install and launch LM Studio
  3. Download a model:
    • Click "Search" tab
    • Search for models (e.g., "qwen", "llama", "mistral")
    • Download your preferred model
  4. Start Local Server:
    • Click "Local Server" tab
    • Load your model
    • Click "Start Server"
    • Default URL: http://127.0.0.1:1234/v1
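Once the server is running, a quick reachability check can confirm it before launching ALICE. This sketch probes the OpenAI-compatible `/models` endpoint; the URL is LM Studio's default, and the same check works for any OpenAI-compatible backend:

```shell
# Probe the OpenAI-compatible /models endpoint to see if a server is up.
# Swap BASE_URL for Ollama (http://localhost:11434/v1) or another backend.
BASE_URL="http://127.0.0.1:1234/v1"
if curl -s --max-time 2 "$BASE_URL/models" > /dev/null 2>&1; then
  msg="Backend reachable at $BASE_URL"
else
  msg="No response from $BASE_URL - is the server started?"
fi
echo "$msg"
```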

Option B: Ollama

  1. Install Ollama from https://ollama.ai/
  2. Pull a model:
    ollama pull qwen2.5:7b
  3. Start Ollama (usually runs automatically)
    • Default URL: http://localhost:11434/v1

Option C: OpenAI API

  1. Get API Key from https://platform.openai.com/
  2. Set Environment Variable:
    # macOS / Linux
    export OPENAI_API_KEY="sk-xxxxx"
    
    # Windows (Command Prompt)
    set OPENAI_API_KEY=sk-xxxxx
    
    # Windows (PowerShell)
    $env:OPENAI_API_KEY="sk-xxxxx"
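A quick way to confirm the variable is visible to the shell that will launch ALICE; the key value here is only a placeholder:

```shell
# Export a placeholder key, then verify it is set before launching
export OPENAI_API_KEY="sk-xxxxx"
if [ -n "$OPENAI_API_KEY" ]; then
  echo "OPENAI_API_KEY is set"
else
  echo "OPENAI_API_KEY is missing"
fi
```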

Verify Installation

Run ALICE with the --help flag:

alice --help

If you see the help message, installation is successful!

First Run

On first run, ALICE will:

  1. Show the welcome banner (skip with --no-banner)
  2. Create configuration directory at ~/.alice/
  3. Generate a default settings.jsonc

# First run
alice

# Or skip banner
alice --no-banner
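The generated settings.jsonc typically points ALICE at your LLM backend. The exact schema is not documented here, so the field names below are purely illustrative assumptions, not the real file format; check the file ALICE generates in ~/.alice/ for the actual keys:

```jsonc
// Hypothetical example only - the real ~/.alice/settings.jsonc
// may use different field names.
{
  // Base URL of the OpenAI-compatible backend (LM Studio default shown)
  "baseUrl": "http://127.0.0.1:1234/v1",
  // Model identifier as reported by the backend
  "model": "qwen2.5-7b-instruct"
}
```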


Troubleshooting

"Cannot find module" error

Make sure you ran npm install first.

"Connection refused" error

Your LLM backend is not running, or it is listening on a different URL than ALICE expects. Start LM Studio or Ollama and confirm the server URL matches your configuration.

Permission denied on macOS/Linux

Run chmod +x alice to add execute permission.

For more issues, see Troubleshooting.
