
Commit 77d1702

feat: Apple Foundation Model with full tool calling support (v1.2.0)
- Add full tool calling support for Apple Intelligence on-device AI
- All built-in tools work with Apple Foundation Model (file ops, git, bash, web)
- Intelligent tool limit handling (max 10 tools for optimal Apple AI performance)
- Channel token parsing for clean output from thinking models
- Real-time streaming with channel token filtering
- Comprehensive Apple AI test suite (bun test:apple-ai)
- Update documentation to highlight multi-provider and local AI support
- Add Apple Intelligence feature card to documentation website
1 parent da2aa30 commit 77d1702
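The "channel token parsing" this commit adds could be sketched as below. The marker syntax (`<|channel|>…<|message|>…<|end|>`) is an assumption modeled on common thinking-model output formats, not necessarily what Clarissa parses; the channel names (analysis, commentary, final) come from the changelog in this commit.

```typescript
// Sketch: keep only the `final` channel from raw thinking-model output.
// The <|channel|>/<|message|>/<|end|> marker syntax is an assumption;
// the channel names (analysis, commentary, final) are from the changelog.
function extractFinal(raw: string): string {
  // split() with a capture group yields [before, channel, text, channel, text, ...]
  const parts = raw.split(/<\|channel\|>(analysis|commentary|final)<\|message\|>/);
  const finals: string[] = [];
  for (let i = 1; i < parts.length; i += 2) {
    if (parts[i] === "final") {
      finals.push((parts[i + 1] ?? "").replace(/<\|end\|>/g, "").trim());
    }
  }
  return finals.join("\n");
}
```

A streaming variant would apply the same state machine per chunk, buffering until a channel marker is complete before deciding whether to emit.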


46 files changed: +7404 −557 lines

.env.example

Lines changed: 11 additions & 12 deletions
```diff
@@ -1,17 +1,16 @@
-# OpenRouter API Key (required)
-# Get your key at https://openrouter.ai/keys
-OPENROUTER_API_KEY=your_api_key_here
+# API Keys (backup - config file takes precedence)
+# Recommended: use ~/.clarissa/config.json instead
+# Run 'clarissa init' to set up config file
 
-# Model to use (optional, defaults to anthropic/claude-sonnet-4)
-# See available models at https://openrouter.ai/models
-OPENROUTER_MODEL=anthropic/claude-sonnet-4
+# OpenRouter - https://openrouter.ai/keys
+OPENROUTER_API_KEY=
 
-# App name (optional)
-APP_NAME=Clarissa
+# OpenAI - https://platform.openai.com/api-keys
+OPENAI_API_KEY=
 
-# Maximum agent loop iterations (optional, default: 10)
-MAX_ITERATIONS=10
+# Anthropic - https://console.anthropic.com/settings/keys
+ANTHROPIC_API_KEY=
 
-# Debug mode (optional, default: false)
+# Optional settings
+MAX_ITERATIONS=10
 DEBUG=false
-
```
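The new `.env.example` header says the config file takes precedence over environment variables. A minimal sketch of that resolution rule, assuming the key names shown in the README's configuration table (the helper itself is hypothetical, not Clarissa's actual code):

```typescript
// Sketch of the "config file takes precedence over env" rule from .env.example.
// Key names (apiKey / OPENROUTER_API_KEY) follow the README's configuration table;
// this resolution helper is hypothetical, not Clarissa's actual code.
interface FileConfig { apiKey?: string; openaiApiKey?: string; anthropicApiKey?: string }
type Env = Record<string, string | undefined>;

function resolveOpenRouterKey(config: FileConfig, env: Env): string | undefined {
  return config.apiKey ?? env.OPENROUTER_API_KEY; // file config wins when both are set
}
```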

CHANGELOG.md

Lines changed: 49 additions & 0 deletions
```diff
@@ -5,6 +5,55 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+## [1.2.0] - 2025-12-09
+
+### Added
+
+- **Apple Foundation Model with Tools** - Full tool calling support for Apple Intelligence
+  - Apple on-device AI now supports all built-in tools (file operations, git, bash, web fetch)
+  - Intelligent tool limit handling (max 10 tools for optimal performance)
+  - Channel token parsing for clean output from thinking models
+  - Automatic retry without tools when model returns null responses
+- **Enhanced Apple AI streaming** - Real-time streaming with channel token filtering
+- **Comprehensive test suite** - New test script for Apple AI provider (`bun test:apple-ai`)
+
+### Fixed
+
+- Apple AI responses now properly parse channel tokens (analysis, commentary, final)
+- Tool calls from Apple Foundation Model now correctly extract function arguments
+- Streaming mode properly filters internal model tokens for clean user output
+
+## [1.1.0] - 2025-12-08
+
+### Added
+
+- **Multi-provider LLM support** - Switch between different LLM providers at runtime
+  - OpenRouter (cloud) - Access to 100+ models
+  - OpenAI (cloud) - Direct GPT API access
+  - Anthropic (cloud) - Direct Claude API access
+  - Apple Intelligence (local) - On-device AI for macOS 26+
+  - LM Studio (local) - Local inference via LM Studio desktop app
+  - Local Llama (local) - Direct GGUF model inference via node-llama-cpp
+- **Provider registry** - Automatic provider detection and priority-based selection
+- **Model download system** - Download GGUF models from Hugging Face
+  - `clarissa download` command with recommended models list
+  - `clarissa models` command to list downloaded models
+  - Progress tracking during downloads
+  - Curated list of best models (Qwen 3, Gemma 3, Llama 4, DeepSeek R1, etc.)
+- **Preferences persistence** - Remember last used provider and model across sessions
+- **Auto-update system** - Check for updates and upgrade easily
+  - `clarissa upgrade` command to update to latest version
+  - Background update checking with notifications
+  - Package manager detection (bun, pnpm, npm)
+- **`/provider` command** - Switch LLM providers during a session
+- **Retry logic** - Exponential backoff with jitter for API rate limits
+
+### Changed
+
+- Refactored LLM client to use provider abstraction layer
+- Updated architecture diagram to show multi-provider support
+- Enhanced configuration options for provider-specific settings
+
 ## [1.0.2] - 2025-12-07
 
 ### Fixed
```
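The changelog's "exponential backoff with jitter" retry logic for rate limits can be sketched as follows. The base delay, cap, and retry budget are illustrative assumptions, not Clarissa's actual values, and the full-jitter variant is one common choice among several.

```typescript
// Full-jitter exponential backoff: delay is uniform in [0, min(cap, base * 2^attempt)).
// Constants here are illustrative assumptions, not Clarissa's actual values.
function backoffDelay(
  attempt: number,
  baseMs = 500,
  maxMs = 30_000,
  rand: () => number = Math.random,
): number {
  const cap = Math.min(maxMs, baseMs * 2 ** attempt);
  return rand() * cap;
}

async function withRetry<T>(fn: () => Promise<T>, retries = 5): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= retries) throw err; // retry budget exhausted: surface the error
      await new Promise((r) => setTimeout(r, backoffDelay(attempt)));
    }
  }
}
```

Full jitter (random in the whole window rather than a fixed delay plus noise) spreads retries from many clients evenly, which is why it is a common default for rate-limit handling.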

README.md

Lines changed: 50 additions & 20 deletions
````diff
@@ -17,20 +17,22 @@
 
 ---
 
-Clarissa is a command-line AI agent built with [Bun](https://bun.sh) and [Ink](https://github.com/vadimdemedes/ink). It provides a conversational interface powered by [OpenRouter](https://openrouter.ai), enabling access to various LLMs like Claude, GPT-4, Gemini, and more. The agent can execute tools, manage files, run shell commands, and integrate with external services via the Model Context Protocol (MCP).
+Clarissa is a command-line AI agent built with [Bun](https://bun.sh) and [Ink](https://github.com/vadimdemedes/ink). It supports multiple LLM providers including cloud services like [OpenRouter](https://openrouter.ai), OpenAI, and Anthropic, as well as local inference via Apple Intelligence, LM Studio, and local GGUF models. The agent can execute tools, manage files, run shell commands, and integrate with external services via the Model Context Protocol (MCP).
 
 ## Features
 
-- **Multi-model support** - Switch between Claude, GPT-4, Gemini, Llama, DeepSeek, and other models via OpenRouter
+- **Multi-provider support** - Use cloud providers (OpenRouter, OpenAI, Anthropic) or run completely offline with local models
+- **Apple Intelligence** - On-device AI using Apple Foundation Models with full tool calling support (macOS 26+)
+- **Local model inference** - Run GGUF models locally via LM Studio or node-llama-cpp with GPU acceleration
 - **Streaming responses** - Real-time token streaming for responsive conversations
 - **Built-in tools** - File operations, Git integration, shell commands, web fetching, and more
 - **MCP integration** - Connect to external MCP servers to extend functionality
-- **Session management** - Save and restore conversation history
+- **Session management** - Auto-save on exit, quick resume with `/last`, and named sessions
 - **Memory persistence** - Remember facts across sessions with `/remember` and `/memories`
 - **Context management** - Automatic token tracking and context truncation
 - **Tool confirmation** - Approve or reject potentially dangerous operations
-- **One-shot mode** - Run single commands directly from your shell
-- **Piped input** - Pipe content from other commands for processing
+- **One-shot mode** - Run single commands directly from your shell with query history
+- **Auto-updates** - Get notified of new versions and upgrade easily with `clarissa upgrade`
 
 ## How It Works
 
@@ -52,9 +54,13 @@ flowchart LR
     F[Context Manager]
   end
 
+  subgraph Providers
+    G[Cloud: OpenRouter / OpenAI / Anthropic]
+    H[Local: Apple AI / LM Studio / GGUF]
+  end
+
   subgraph External
-    G[OpenRouter API]
-    H[MCP Servers]
+    I[MCP Servers]
   end
 
   A --> C
@@ -63,10 +69,11 @@ flowchart LR
   C <--> E
   C <--> F
   D <--> G
-  E <-.-> H
+  D <--> H
+  E <-.-> I
 ```
 
-The system connects your terminal to various LLMs through OpenRouter. When you ask Clarissa to perform a task, it:
+The system connects your terminal to various LLM providers. When you ask Clarissa to perform a task, it:
 
 1. Sends your message to the LLM along with available tool definitions
 2. Receives a response that may include tool calls (e.g., read a file, run a command)
@@ -103,7 +110,9 @@ For detailed architecture documentation, see the [Architecture Guide](https://ca
 ## Requirements
 
 - [Bun](https://bun.sh) v1.0 or later (for running from source or npm install)
-- An [OpenRouter API key](https://openrouter.ai/keys)
+- For cloud providers: API key for [OpenRouter](https://openrouter.ai/keys), [OpenAI](https://platform.openai.com/api-keys), or [Anthropic](https://console.anthropic.com/)
+- For Apple Intelligence: macOS 26+ with Apple Silicon and Apple Intelligence enabled
+- For local models: [LM Studio](https://lmstudio.ai) or download GGUF models with `clarissa download`
 
 ## Installation
 
@@ -138,25 +147,44 @@ mv clarissa-macos-arm64 /usr/local/bin/clarissa
 
 ## Configuration
 
-Create a config file at `~/.clarissa/config.json`:
+Create a config file at `~/.clarissa/config.json` or run `clarissa init` for interactive setup.
+
+### API Keys (Cloud Providers)
+
+Set one or more API keys for cloud providers:
 
 ```bash
-mkdir -p ~/.clarissa
-echo '{"apiKey": "your_api_key_here"}' > ~/.clarissa/config.json
+# Environment variables
+export OPENROUTER_API_KEY=your_key_here
+export OPENAI_API_KEY=your_key_here
+export ANTHROPIC_API_KEY=your_key_here
 ```
 
-Or set your OpenRouter API key as an environment variable:
+Or in `~/.clarissa/config.json`:
 
-```bash
-export OPENROUTER_API_KEY=your_api_key_here
+```json
+{
+  "apiKey": "your_openrouter_key",
+  "openaiApiKey": "your_openai_key",
+  "anthropicApiKey": "your_anthropic_key"
+}
 ```
 
-Optional settings (in config.json or as environment variables):
+### Local Providers (No API Key Required)
+
+- **Apple Intelligence**: Automatically detected on macOS 26+ with Apple Intelligence enabled
+- **LM Studio**: Start LM Studio and load a model - Clarissa auto-detects the local server
+- **Local GGUF**: Download models with `clarissa download` and run offline
+
+### Configuration Options
 
 | Config Key | Env Variable | Default | Description |
 |------------|--------------|---------|-------------|
-| `apiKey` | `OPENROUTER_API_KEY` | (required) | Your OpenRouter API key |
-| `model` | `OPENROUTER_MODEL` | `anthropic/claude-sonnet-4` | Default model to use |
+| `apiKey` | `OPENROUTER_API_KEY` | - | OpenRouter API key |
+| `openaiApiKey` | `OPENAI_API_KEY` | - | OpenAI API key |
+| `anthropicApiKey` | `ANTHROPIC_API_KEY` | - | Anthropic API key |
+| `model` | - | (auto) | Preferred model |
+| `preferredProvider` | - | (auto) | Preferred provider ID |
 | `maxIterations` | `MAX_ITERATIONS` | `10` | Maximum tool execution iterations |
 | `debug` | `DEBUG` | `false` | Enable debug logging |
 | `mcpServers` | - | `{}` | MCP servers to auto-load (see below) |
@@ -230,12 +258,14 @@ git diff | clarissa "Write a commit message for these changes"
 | `/save [NAME]` | Save current session |
 | `/sessions` | List saved sessions |
 | `/load ID` | Load a saved session |
+| `/last` | Resume last session |
 | `/delete ID` | Delete a saved session |
 | `/remember <fact>` | Save a memory |
 | `/memories` | List saved memories |
 | `/forget <#\|ID>` | Forget a memory |
 | `/model [NAME]` | Show or switch the current model |
-| `/mcp` | Show MCP server status |
+| `/provider [ID]` | Show or switch the LLM provider |
+| `/mcp` | Show connected MCP servers |
 | `/tools` | List available tools |
 | `/context` | Show context window usage and breakdown |
 | `/yolo` | Toggle auto-approve mode (skip tool confirmations) |
````
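The README's "automatic provider detection and priority-based selection" could look something like the sketch below. The `Provider` shape, the "lower number = higher priority" rule, and the availability check are all assumptions about the registry's design, not Clarissa's actual interface.

```typescript
// Hypothetical shape of priority-based provider selection; the interface and
// the rule "lower number = higher priority" are assumptions, not Clarissa's code.
interface Provider {
  id: string;
  priority: number;  // lower number = preferred
  available: boolean; // e.g. API key set, local server detected
}

function selectProvider(providers: Provider[], preferredId?: string): Provider | undefined {
  const usable = providers.filter((p) => p.available);
  const preferred = usable.find((p) => p.id === preferredId);
  if (preferred) return preferred; // persisted preference wins when usable
  return [...usable].sort((a, b) => a.priority - b.priority)[0];
}
```

This also illustrates how the preferences-persistence feature would plug in: the last-used provider ID is tried first, with priority order as the fallback.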

ROADMAP.md

Lines changed: 47 additions & 0 deletions
```diff
@@ -4,6 +4,50 @@
 
 ---
 
+## Completed Features (v1.2.0)
+
+### Apple Foundation Model with Tools
+- [x] Full tool calling support for Apple Intelligence on-device AI
+- [x] All built-in tools work with Apple Foundation Model (file ops, git, bash, web fetch)
+- [x] Intelligent tool limit handling (max 10 tools for optimal Apple AI performance)
+- [x] Channel token parsing for clean output from thinking models
+- [x] Automatic retry without tools when model returns null responses
+- [x] Real-time streaming with channel token filtering
+- [x] Comprehensive Apple AI test suite (`bun test:apple-ai`)
+
+## Completed Features (v1.1.0)
+
+### Multi-Provider LLM Support
+- [x] Provider abstraction layer with unified interface
+- [x] OpenRouter provider (cloud) - 100+ models
+- [x] OpenAI provider (cloud) - Direct GPT API
+- [x] Anthropic provider (cloud) - Direct Claude API
+- [x] Apple Intelligence provider (local) - macOS 26+ on-device AI
+- [x] LM Studio provider (local) - Desktop app integration
+- [x] Local Llama provider (local) - Direct GGUF inference via node-llama-cpp
+- [x] Provider switching with `/provider` command
+- [x] Automatic provider detection and priority selection
+- [x] Preferences persistence for last used provider/model
+
+### Local Model Support
+- [x] GGUF model download from Hugging Face (`clarissa download`)
+- [x] Curated recommended models list (Qwen 3, Gemma 3, Llama 4, DeepSeek R1, etc.)
+- [x] Download progress tracking
+- [x] Model listing (`clarissa models`)
+- [x] GPU layer configuration for local inference
+- [x] Flash attention support
+
+### Auto-Update System
+- [x] Version checking against npm registry
+- [x] `clarissa upgrade` command
+- [x] Package manager detection (bun, pnpm, npm)
+- [x] Background update notifications
+
+### API Improvements
+- [x] Retry logic with exponential backoff and jitter
+- [x] Rate limit handling for all providers
+- [x] Streaming support for all providers
+
 ## Completed Features (v1.0.0)
 
 ### Core Operations
@@ -63,17 +107,20 @@
 - [ ] HTTP/SSE MCP server transport
 - [ ] File context references
 - [ ] Image/vision analysis
+- [ ] Model delete command for local models
 
 ### Medium Term
 - [ ] Codebase indexing with embeddings
 - [ ] Semantic search across codebase
 - [ ] Model comparison mode
 - [ ] Fallback model configuration
+- [ ] Provider-specific model recommendations
 
 ### Long Term
 - [ ] Integrated linting with auto-fix
 - [ ] Test runner integration
 - [ ] Project scaffolding templates
+- [ ] Multi-agent collaboration
 
 ---
 
```
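The "max 10 tools" limit for Apple AI that this commit mentions amounts to capping the tool list before sending definitions to the model. In the sketch below the cap value comes from the changelog, while the idea of front-loading a preferred core set before truncating is an assumed heuristic, not Clarissa's actual logic.

```typescript
// Cap the tool list for Apple AI at 10 (the limit the changelog cites).
// Front-loading a preferred core set is an assumed heuristic, not Clarissa's code.
const APPLE_AI_MAX_TOOLS = 10;

function limitTools<T extends { name: string }>(
  tools: T[],
  preferredNames: string[],
  max: number = APPLE_AI_MAX_TOOLS,
): T[] {
  const preferred = tools.filter((t) => preferredNames.includes(t.name));
  const rest = tools.filter((t) => !preferredNames.includes(t.name));
  return [...preferred, ...rest].slice(0, max);
}
```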
