Commit e62ce67

feat: change model notation from provider:model to provider/model
Switch the --model / -m flag format from colon-separated (provider:model) to slash-separated (provider/model), e.g. anthropic/claude-sonnet-4-5-20250929 or ollama/qwen3:8b. The slash separator is cleaner since model names can contain colons (ollama tags, bedrock ARNs). Add centralized ParseModelString() in internal/models/providers.go that all callers now use. The old colon format is still accepted with a deprecation warning to stderr for backward compatibility. Update default model to claude-sonnet-4-5-20250929.
1 parent be91626 commit e62ce67

20 files changed: +114 −90 lines
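The parsing behavior described in the commit message can be sketched as follows. This is a hypothetical reconstruction, not the actual code from `internal/models/providers.go` (the real `ParseModelString` may differ in signature and error handling): split on the first `/`, fall back to the first `:` with a deprecation warning on stderr, and leave the remainder of the string untouched so model names containing colons (ollama tags, bedrock ARNs) survive intact.

```go
package main

import (
	"errors"
	"fmt"
	"os"
	"strings"
)

// parseModelString is a sketch of the centralized helper described in the
// commit message (hypothetical name/signature). It splits on the FIRST
// separator only, so everything after it is taken verbatim as the model
// name -- e.g. "ollama/qwen3:8b" keeps its ":8b" tag.
func parseModelString(s string) (provider, model string, err error) {
	if i := strings.Index(s, "/"); i >= 0 {
		return s[:i], s[i+1:], nil
	}
	// Backward compatibility: accept the old colon form, but warn on stderr.
	if i := strings.Index(s, ":"); i >= 0 {
		fmt.Fprintf(os.Stderr,
			"warning: %q uses deprecated provider:model notation; use %s/%s\n",
			s, s[:i], s[i+1:])
		return s[:i], s[i+1:], nil
	}
	return "", "", errors.New("model must be in provider/model format")
}

func main() {
	for _, s := range []string{
		"anthropic/claude-sonnet-4-5-20250929",
		"ollama/qwen3:8b",   // colon inside the model tag is preserved
		"ollama:qwen2.5:3b", // legacy form: still parses, warns on stderr
	} {
		p, m, _ := parseModelString(s)
		fmt.Printf("provider=%s model=%s\n", p, m)
	}
}
```

Splitting on the first separator only is what makes `/` the safer delimiter: with the old colon form, `ollama:qwen2.5:3b` was only parseable because `SplitN(s, ":", 2)` happened to stop early, and any provider-side colon would have broken it.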

README.md

Lines changed: 13 additions & 13 deletions
````diff
@@ -204,7 +204,7 @@ mcpServers:
       DEBUG: "${env://DEBUG:-false}"
       LOG_LEVEL: "${env://LOG_LEVEL:-info}"
 
-model: "${env://MODEL:-anthropic:claude-sonnet-4-20250514}"
+model: "${env://MODEL:-anthropic/claude-sonnet-4-5-20250929}"
 provider-api-key: "${env://OPENAI_API_KEY}" # Required - will fail if not set
 ```
 
@@ -216,7 +216,7 @@ export OPENAI_API_KEY="your_openai_key"
 
 # Optionally override defaults
 export DEBUG="true"
-export MODEL="openai:gpt-4"
+export MODEL="openai/gpt-4"
 
 # Run mcphost
 mcphost
@@ -584,7 +584,7 @@ mcpServers:
     type: "local"
     command: ["npx", "-y", "@modelcontextprotocol/server-filesystem", "${env://WORK_DIR:-/tmp}"]
 
-model: "${env://MODEL:-anthropic:claude-sonnet-4-20250514}"
+model: "${env://MODEL:-anthropic/claude-sonnet-4-5-20250929}"
 ---
 Hello ${name:-World}! Please list ${repo_type:-public} repositories for user ${username}.
 Working directory is ${env://WORK_DIR:-/tmp}.
@@ -723,7 +723,7 @@ mcphost -p "What is the weather like today?"
 mcphost -p "What is 2+2?" --quiet
 
 # Use with different models
-mcphost -m ollama:qwen2.5:3b -p "Explain quantum computing" --quiet
+mcphost -m ollama/qwen2.5:3b -p "Explain quantum computing" --quiet
 ```
 
 ### Model Generation Parameters
@@ -751,24 +751,24 @@ These parameters work with all supported providers (OpenAI, Anthropic, Google, O
 
 ### Available Models
 Models can be specified using the `--model` (`-m`) flag:
-- **Anthropic Claude** (default): `anthropic:claude-sonnet-4-20250514`, `anthropic:claude-3-5-sonnet-latest`, `anthropic:claude-3-5-haiku-latest`
-- **OpenAI**: `openai:gpt-4`, `openai:gpt-4-turbo`, `openai:gpt-3.5-turbo`
-- **Google Gemini**: `google:gemini-2.0-flash`, `google:gemini-1.5-pro`
-- **Ollama models**: `ollama:llama3.2`, `ollama:qwen2.5:3b`, `ollama:mistral`
+- **Anthropic Claude** (default): `anthropic/claude-sonnet-4-5-20250929`, `anthropic/claude-3-5-sonnet-latest`, `anthropic/claude-3-5-haiku-latest`
+- **OpenAI**: `openai/gpt-4`, `openai/gpt-4-turbo`, `openai/gpt-3.5-turbo`
+- **Google Gemini**: `google/gemini-2.0-flash`, `google/gemini-1.5-pro`
+- **Ollama models**: `ollama/llama3.2`, `ollama/qwen2.5:3b`, `ollama/mistral`
 - **OpenAI-compatible**: Any model via custom endpoint with `--provider-url`
 
 ### Examples
 
 #### Interactive Mode
 ```bash
 # Use Ollama with Qwen model
-mcphost -m ollama:qwen2.5:3b
+mcphost -m ollama/qwen2.5:3b
 
 # Use OpenAI's GPT-4
-mcphost -m openai:gpt-4
+mcphost -m openai/gpt-4
 
 # Use OpenAI-compatible model with custom URL and API key
-mcphost --model openai:<your-model-name> \
+mcphost --model openai/<your-model-name> \
   --provider-url <your-base-url> \
   --provider-api-key <your-api-key>
 ```
@@ -800,7 +800,7 @@ mcphost -p "Generate a random UUID" --quiet | tr '[:lower:]' '[:upper:]'
 - `--system-prompt string`: system-prompt file location
 - `--debug`: Enable debug logging
 - `--max-steps int`: Maximum number of agent steps (0 for unlimited, default: 0)
-- `-m, --model string`: Model to use (format: provider:model) (default "anthropic:claude-sonnet-4-20250514")
+- `-m, --model string`: Model to use (format: provider/model) (default "anthropic/claude-sonnet-4-5-20250929")
 - `-p, --prompt string`: **Run in non-interactive mode with the given prompt**
 - `--quiet`: **Suppress all output except the AI response (only works with --prompt)**
 - `--compact`: **Enable compact output mode without fancy styling (ideal for scripting and automation)**
@@ -845,7 +845,7 @@ mcpServers:
     url: "https://api.example.com/mcp"
 
 # Application settings
-model: "anthropic:claude-sonnet-4-20250514"
+model: "anthropic/claude-sonnet-4-5-20250929"
 max-steps: 20
 debug: false
 system-prompt: "/path/to/system-prompt.txt"
````

cmd/root.go

Lines changed: 14 additions & 15 deletions
```diff
@@ -96,16 +96,16 @@ through a unified interface. It supports various tools through MCP servers
 and provides streaming responses.
 
 Available models can be specified using the --model flag:
-- Anthropic Claude (default): anthropic:claude-sonnet-4-20250514
-- OpenAI: openai:gpt-4
-- Ollama models: ollama:modelname
-- Google: google:modelname
+- Anthropic Claude (default): anthropic/claude-sonnet-4-5-20250929
+- OpenAI: openai/gpt-4
+- Ollama models: ollama/modelname
+- Google: google/modelname
 
 Examples:
   # Interactive mode
-  mcphost -m ollama:qwen2.5:3b
-  mcphost -m openai:gpt-4
-  mcphost -m google:gemini-2.0-flash
+  mcphost -m ollama/qwen2.5:3b
+  mcphost -m openai/gpt-4
+  mcphost -m google/gemini-2.0-flash
 
   # Non-interactive mode
   mcphost -p "What is the weather like today?"
@@ -284,8 +284,8 @@ func init() {
 		StringVar(&systemPromptFile, "system-prompt", "", "system prompt text or path to text file")
 
 	rootCmd.PersistentFlags().
-		StringVarP(&modelFlag, "model", "m", "anthropic:claude-sonnet-4-20250514",
-			"model to use (format: provider:model)")
+		StringVarP(&modelFlag, "model", "m", "anthropic/claude-sonnet-4-5-20250929",
+			"model to use (format: provider/model)")
 	rootCmd.PersistentFlags().
 		BoolVar(&debugMode, "debug", false, "enable debug logging")
 	rootCmd.PersistentFlags().
@@ -467,10 +467,9 @@ func runNormalMode(ctx context.Context) error {
 	// Initialize hook executor if hooks are configured
 	// Get model name for display
 	modelString := viper.GetString("model")
-	parts := strings.SplitN(modelString, ":", 2)
-	modelName := "Unknown"
-	if len(parts) == 2 {
-		modelName = parts[1]
+	parsedProvider, modelName, _ := models.ParseModelString(modelString)
+	if modelName == "" {
+		modelName = "Unknown"
 	}
 
 	var hookExecutor *hooks.Executor
@@ -533,7 +532,7 @@ func runNormalMode(ctx context.Context) error {
 	}
 
 	// Add Ollama-specific parameters if using Ollama
-	if strings.HasPrefix(viper.GetString("model"), "ollama:") {
+	if parsedProvider == "ollama" {
 		debugConfig["num-gpu-layers"] = viper.GetInt("num-gpu-layers")
 		debugConfig["main-gpu"] = viper.GetInt("main-gpu")
 	}
@@ -733,7 +732,7 @@ func runNormalMode(ctx context.Context) error {
 		// Set metadata
 		sessionManager.SetMetadata(session.Metadata{
 			MCPHostVersion: "dev", // TODO: Get actual version
-			Provider:       parts[0],
+			Provider:       parsedProvider,
 			Model:          modelName,
 		})
 	}
```

cmd/script.go

Lines changed: 6 additions & 6 deletions
```diff
@@ -17,6 +17,7 @@ import (
 	"github.com/mark3labs/mcphost/internal/models"
 	"github.com/mark3labs/mcphost/internal/tools"
 	"github.com/mark3labs/mcphost/internal/ui"
+
 	"github.com/spf13/cobra"
 	"github.com/spf13/viper"
 )
@@ -34,7 +35,7 @@ model settings, and other options.
 
 Example script file:
 ---
-model: "anthropic:claude-sonnet-4-20250514"
+model: "anthropic/claude-sonnet-4-5-20250929"
 max-steps: 10
 mcpServers:
   filesystem:
@@ -520,7 +521,7 @@ func runScriptMode(ctx context.Context, mcpConfig *config.Config, prompt string,
 		finalModel = mcpConfig.Model
 	}
 	if finalModel == "" {
-		finalModel = "anthropic:claude-sonnet-4-20250514" // default
+		finalModel = "anthropic/claude-sonnet-4-5-20250929" // default
 	}
 
 	finalSystemPrompt := viper.GetString("system-prompt")
@@ -626,10 +627,9 @@ func runScriptMode(ctx context.Context, mcpConfig *config.Config, prompt string,
 	defer mcpAgent.Close()
 
 	// Get model name for display
-	parts := strings.SplitN(finalModel, ":", 2)
-	modelName := "Unknown"
-	if len(parts) == 2 {
-		modelName = parts[1]
+	_, modelName, _ := models.ParseModelString(finalModel)
+	if modelName == "" {
+		modelName = "Unknown"
 	}
 
 	// Create an adapter for the agent to match the UI interface
```

cmd/script_deepseek_test.go

Lines changed: 8 additions & 8 deletions
```diff
@@ -20,7 +20,7 @@ func TestDeepSeekChatScriptMode(t *testing.T) {
 
 	scriptContent := `#!/usr/bin/env -S mcphost script
 ---
-model: "openai:deepseek-chat"
+model: "openai/deepseek-chat"
 provider-url: "https://api.deepseek.com/v1"
 provider-api-key: "${env://DEEPSEEK_API_KEY}"
 ---
@@ -44,8 +44,8 @@ Calculate 3 times 4 equal to?
 	}
 
 	// Verify the script config has the correct values
-	if scriptConfig.Model != "openai:deepseek-chat" {
-		t.Errorf("Expected model=openai:deepseek-chat, got %s", scriptConfig.Model)
+	if scriptConfig.Model != "openai/deepseek-chat" {
+		t.Errorf("Expected model=openai/deepseek-chat, got %s", scriptConfig.Model)
 	}
 	if scriptConfig.ProviderURL != "https://api.deepseek.com/v1" {
 		t.Errorf("Expected provider-url=https://api.deepseek.com/v1, got %s", scriptConfig.ProviderURL)
@@ -87,7 +87,7 @@ Calculate 3 times 4 equal to?
 func TestDeepSeekChatCLIMode(t *testing.T) {
 	// Test the CLI mode behavior - this should work
 	providerConfig := &models.ProviderConfig{
-		ModelString:    "openai:deepseek-chat",
+		ModelString:    "openai/deepseek-chat",
 		ProviderAPIKey: "sk-test-key",
 		ProviderURL:    "https://api.deepseek.com/v1", // This should skip validation
 		MaxTokens:      0,
@@ -119,25 +119,25 @@ func TestProviderURLValidationSkip(t *testing.T) {
 	}{
 		{
 			name:        "OpenAI with custom URL should skip validation",
-			model:       "openai:custom-model",
+			model:       "openai/custom-model",
 			providerURL: "https://api.custom.com/v1",
 			shouldSkip:  true,
 		},
 		{
 			name:        "OpenAI without custom URL should validate",
-			model:       "openai:custom-model",
+			model:       "openai/custom-model",
 			providerURL: "",
 			shouldSkip:  false,
 		},
 		{
 			name:        "Ollama should always skip validation",
-			model:       "ollama:custom-model",
+			model:       "ollama/custom-model",
 			providerURL: "",
 			shouldSkip:  true,
 		},
 		{
 			name:        "Anthropic with custom URL should skip validation",
-			model:       "anthropic:custom-model",
+			model:       "anthropic/custom-model",
 			providerURL: "https://api.custom.com/v1",
 			shouldSkip:  true,
 		},
```

cmd/script_integration_test.go

Lines changed: 5 additions & 5 deletions
```diff
@@ -28,7 +28,7 @@ mcpServers:
     options:
       allowed_directories: ["${env://WORK_DIR:-/tmp}"]
 
-model: "${env://MODEL:-anthropic:claude-sonnet-4-20250514}"
+model: "${env://MODEL:-anthropic/claude-sonnet-4-5-20250929}"
 debug: ${env://DEBUG:-false}
 ---
 List ${repo_type:-public} repositories for user ${username}.
@@ -91,8 +91,8 @@ Working directory is ${env://WORK_DIR:-/tmp}.
 	}
 
 	// Verify global config values
-	if scriptConfig.Model != "anthropic:claude-sonnet-4-20250514" {
-		t.Errorf("Expected model=anthropic:claude-sonnet-4-20250514, got %s", scriptConfig.Model)
+	if scriptConfig.Model != "anthropic/claude-sonnet-4-5-20250929" {
+		t.Errorf("Expected model=anthropic/claude-sonnet-4-5-20250929, got %s", scriptConfig.Model)
 	}
 	if !scriptConfig.Debug {
 		t.Error("Expected debug=true")
@@ -250,7 +250,7 @@ mcpServers:
     options:
       allowed_directories: ["/tmp"]
 
-model: "anthropic:claude-sonnet-4-20250514"
+model: "anthropic/claude-sonnet-4-5-20250929"
 ---
 List files in ${directory:-/tmp} for user ${username}.
 `
@@ -278,7 +278,7 @@ List files in ${directory:-/tmp} for user ${username}.
 	}
 
 	// Verify that config is unchanged
-	if scriptConfig.Model != "anthropic:claude-sonnet-4-20250514" {
+	if scriptConfig.Model != "anthropic/claude-sonnet-4-5-20250929" {
 		t.Errorf("Expected model unchanged, got %s", scriptConfig.Model)
 	}
 }
```

cmd/script_test.go

Lines changed: 4 additions & 4 deletions
```diff
@@ -244,7 +244,7 @@ func TestSubstituteVariables(t *testing.T) {
 func TestBackwardCompatibility(t *testing.T) {
 	// Test that existing scripts without default syntax continue to work
 	content := `---
-model: "anthropic:claude-sonnet-4-20250514"
+model: "anthropic/claude-sonnet-4-5-20250929"
 ---
 Hello ${name}! Please analyze ${directory}.`
 
@@ -262,7 +262,7 @@ Hello ${name}! Please analyze ${directory}.`
 	// Should substitute correctly
 	result := substituteVariables(content, variables)
 	expected := `---
-model: "anthropic:claude-sonnet-4-20250514"
+model: "anthropic/claude-sonnet-4-5-20250929"
 ---
 Hello John! Please analyze /tmp.`
 
@@ -302,7 +302,7 @@ Test prompt with compact mode`
 
 func TestParseScriptContentMCPServersNewFormat(t *testing.T) {
 	content := `---
-model: "anthropic:claude-sonnet-4-20250514"
+model: "anthropic/claude-sonnet-4-5-20250929"
 mcpServers:
   filesystem:
     type: "local"
@@ -379,7 +379,7 @@ Test prompt with new format MCP servers`
 
 func TestParseScriptContentMCPServersLegacyFormat(t *testing.T) {
 	content := `---
-model: "anthropic:claude-sonnet-4-20250514"
+model: "anthropic/claude-sonnet-4-5-20250929"
 mcpServers:
   legacy-stdio:
     transport: "stdio"
```

examples/scripts/default-values-demo.sh

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,7 +1,7 @@
 #!/usr/bin/env -S mcphost script
 ---
 # Demo script showcasing default values in MCPHost scripts
-model: "anthropic:claude-sonnet-4-20250514"
+model: "anthropic/claude-sonnet-4-5-20250929"
 mcpServers:
   filesystem:
     type: "builtin"
```

examples/scripts/env-substitution-script.sh

Lines changed: 1 addition & 1 deletion
```diff
@@ -17,7 +17,7 @@ mcpServers:
     options:
       allowed_directories: ["${env://WORK_DIR:-/tmp}"]
 
-model: "${env://MODEL:-anthropic:claude-sonnet-4-20250514}"
+model: "${env://MODEL:-anthropic/claude-sonnet-4-5-20250929}"
 debug: ${env://DEBUG:-false}
 ---
 List ${repo_type:-public} repositories for user ${username}.
```

examples/scripts/tls-test-script.sh

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,7 +1,7 @@
 #!/usr/bin/env -S mcphost script
 ---
 # Example script demonstrating TLS skip verify for self-signed certificates
-model: "ollama:llama3.2"
+model: "ollama/llama3.2"
 provider-url: "https://localhost:8443"
 tls-skip-verify: true
 max-tokens: 1000
```

internal/agent/agent.go

Lines changed: 2 additions & 4 deletions
```diff
@@ -4,7 +4,6 @@ import (
 	"context"
 	"encoding/json"
 	"fmt"
-	"strings"
 	"time"
 
 	tea "charm.land/bubbletea/v2"
@@ -115,9 +114,8 @@ func NewAgent(ctx context.Context, agentConfig *AgentConfig) (*Agent, error) {
 	// Determine provider type from model string
 	providerType := "default"
 	if agentConfig.ModelConfig != nil && agentConfig.ModelConfig.ModelString != "" {
-		parts := strings.SplitN(agentConfig.ModelConfig.ModelString, ":", 2)
-		if len(parts) >= 1 {
-			providerType = parts[0]
+		if p, _, err := models.ParseModelString(agentConfig.ModelConfig.ModelString); err == nil {
+			providerType = p
 		}
 	}
 
```

0 commit comments