feat: Apple Foundation Model with full tool calling support (v1.2.0)
- Add full tool calling support for Apple Intelligence on-device AI
- All built-in tools work with Apple Foundation Model (file ops, git, bash, web)
- Intelligent tool limit handling (max 10 tools for optimal Apple AI performance)
- Channel token parsing for clean output from thinking models
- Real-time streaming with channel token filtering
- Comprehensive Apple AI test suite (bun test:apple-ai)
- Update documentation to highlight multi-provider and local AI support
- Add Apple Intelligence feature card to documentation website
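The "channel token parsing" bullet above can be sketched roughly as follows. This is a minimal illustration only: the marker strings, function name, and behavior on unterminated blocks are assumptions, not Clarissa's actual implementation.

```typescript
// Channel markers assumed to delimit a thinking model's hidden output;
// the real token strings Clarissa parses may differ.
const CHANNEL_OPEN = "<|channel|>";
const CHANNEL_END = "<|end|>";

// Removes every channel-delimited block from a buffered model response,
// keeping only the user-facing text.
function stripChannelBlocks(text: string): string {
  let out = "";
  let i = 0;
  while (i < text.length) {
    const open = text.indexOf(CHANNEL_OPEN, i);
    if (open === -1) {
      out += text.slice(i); // no more channel blocks: keep the rest
      break;
    }
    out += text.slice(i, open);
    const close = text.indexOf(CHANNEL_END, open);
    if (close === -1) break; // unterminated block: drop the trailing content
    i = close + CHANNEL_END.length;
  }
  return out;
}
```

For true real-time streaming (also mentioned in this commit), the same idea would need a stateful variant that buffers partial markers across chunk boundaries.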
`README.md` — 50 additions, 20 deletions
```diff
@@ -17,20 +17,22 @@

 ---

-Clarissa is a command-line AI agent built with [Bun](https://bun.sh) and [Ink](https://github.com/vadimdemedes/ink). It provides a conversational interface powered by [OpenRouter](https://openrouter.ai), enabling access to various LLMs like Claude, GPT-4, Gemini, and more. The agent can execute tools, manage files, run shell commands, and integrate with external services via the Model Context Protocol (MCP).
+Clarissa is a command-line AI agent built with [Bun](https://bun.sh) and [Ink](https://github.com/vadimdemedes/ink). It supports multiple LLM providers including cloud services like [OpenRouter](https://openrouter.ai), OpenAI, and Anthropic, as well as local inference via Apple Intelligence, LM Studio, and local GGUF models. The agent can execute tools, manage files, run shell commands, and integrate with external services via the Model Context Protocol (MCP).

 ## Features

-- **Multi-model support** - Switch between Claude, GPT-4, Gemini, Llama, DeepSeek, and other models via OpenRouter
+- **Multi-provider support** - Use cloud providers (OpenRouter, OpenAI, Anthropic) or run completely offline with local models
+- **Apple Intelligence** - On-device AI using Apple Foundation Models with full tool calling support (macOS 26+)
+- **Local model inference** - Run GGUF models locally via LM Studio or node-llama-cpp with GPU acceleration
 - **Streaming responses** - Real-time token streaming for responsive conversations
 - **Built-in tools** - File operations, Git integration, shell commands, web fetching, and more
 - **MCP integration** - Connect to external MCP servers to extend functionality
-- **Session management** - Save and restore conversation history
+- **Session management** - Auto-save on exit, quick resume with `/last`, and named sessions
 - **Memory persistence** - Remember facts across sessions with `/remember` and `/memories`
 - **Context management** - Automatic token tracking and context truncation
 - **Tool confirmation** - Approve or reject potentially dangerous operations
-- **One-shot mode** - Run single commands directly from your shell
-- **Piped input** - Pipe content from other commands for processing
+- **One-shot mode** - Run single commands directly from your shell with query history
+- **Auto-updates** - Get notified of new versions and upgrade easily with `clarissa upgrade`

 ## How It Works

```
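The Apple Intelligence feature above pairs with the commit's "max 10 tools" limit. A minimal sketch of how such a cap might be applied is shown below; the constant, the `priority` field, and the selection strategy are illustrative assumptions, not Clarissa's actual code.

```typescript
// Assumed cap from the commit message: the on-device Apple model performs
// best with at most 10 tool definitions.
const APPLE_AI_TOOL_LIMIT = 10;

interface ToolDef {
  name: string;
  priority: number; // hypothetical ranking field for choosing which tools to keep
}

// Keeps the highest-priority tools when a provider caps the tool count.
function limitTools(tools: ToolDef[], max = APPLE_AI_TOOL_LIMIT): ToolDef[] {
  return [...tools].sort((a, b) => b.priority - a.priority).slice(0, max);
}
```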
```diff
@@ -52,9 +54,13 @@ flowchart LR
         F[Context Manager]
     end

+    subgraph Providers
+        G[Cloud: OpenRouter / OpenAI / Anthropic]
+        H[Local: Apple AI / LM Studio / GGUF]
+    end
+
     subgraph External
-        G[OpenRouter API]
-        H[MCP Servers]
+        I[MCP Servers]
     end

     A --> C
```
````diff
@@ -63,10 +69,11 @@ flowchart LR
     C <--> E
     C <--> F
     D <--> G
-    E <-.-> H
+    D <--> H
+    E <-.-> I
 ```

-The system connects your terminal to various LLMs through OpenRouter. When you ask Clarissa to perform a task, it:
+The system connects your terminal to various LLM providers. When you ask Clarissa to perform a task, it:

 1. Sends your message to the LLM along with available tool definitions
 2. Receives a response that may include tool calls (e.g., read a file, run a command)
````
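The numbered steps above describe a standard agent loop. A self-contained sketch follows; the type shapes and interface here are assumed for illustration, since the diff does not show Clarissa's real provider or tool types.

```typescript
// Hypothetical shapes standing in for Clarissa's real types.
type ToolCall = { name: string; args: Record<string, unknown> };
type LlmResponse = { text: string; toolCalls: ToolCall[] };

interface Provider {
  chat(messages: string[], tools: string[]): Promise<LlmResponse>;
}

// Step 1: send the conversation plus tool definitions to the model.
// Step 2: if the response contains tool calls, execute them, append the
// results to the conversation, and loop until the model answers in text.
async function agentLoop(
  provider: Provider,
  messages: string[],
  tools: string[],
  runTool: (call: ToolCall) => Promise<string>,
): Promise<string> {
  while (true) {
    const res = await provider.chat(messages, tools);
    if (res.toolCalls.length === 0) return res.text; // no tool calls: done
    for (const call of res.toolCalls) {
      messages.push(await runTool(call)); // feed tool output back to the model
    }
  }
}
```

The same loop works for cloud and local providers alike, which is what lets the Providers subgraph slot in behind the existing agent core.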
```diff
@@ -103,7 +110,9 @@ For detailed architecture documentation, see the [Architecture Guide](https://ca
 ## Requirements

 - [Bun](https://bun.sh) v1.0 or later (for running from source or npm install)
-- An [OpenRouter API key](https://openrouter.ai/keys)
+- For cloud providers: API key for [OpenRouter](https://openrouter.ai/keys), [OpenAI](https://platform.openai.com/api-keys), or [Anthropic](https://console.anthropic.com/)
+- For Apple Intelligence: macOS 26+ with Apple Silicon and Apple Intelligence enabled
+- For local models: [LM Studio](https://lmstudio.ai) or download GGUF models with `clarissa download`
```
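Since a provider is now chosen per the requirements above, selection might look like the sketch below. The environment variable names and fallback order are assumptions for illustration; the diff does not show which configuration keys Clarissa actually reads.

```typescript
type ProviderKind = "openrouter" | "openai" | "anthropic" | "apple" | "local";

// Hypothetical selection: prefer whichever cloud key is configured, then
// fall back to on-device Apple Intelligence on macOS, then local models.
function pickProvider(env: Record<string, string | undefined>): ProviderKind {
  if (env.OPENROUTER_API_KEY) return "openrouter";
  if (env.OPENAI_API_KEY) return "openai";
  if (env.ANTHROPIC_API_KEY) return "anthropic";
  if (process.platform === "darwin") return "apple"; // requires macOS 26+ on Apple Silicon
  return "local"; // LM Studio or a GGUF model fetched with `clarissa download`
}
```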