
Commit e0b89e7

feat: upstream providers[] + convert_from_map (#131)
* feat: upstream providers[] + convert_from_map
  - Replace `upstream.provider` with `providers[]`
  - Add per-upstream `convert_from_map` for cross-format fallback
  - Add config migration + UI editor updates
* test(frontend): add Vitest to prevent config/dashboard regressions, so config normalization, upstream defaults, and dashboard helpers stay reliable as the UI evolves and CI runs
* perf: cut overhead when tracking or capture is disabled: avoid blocking Tokio threads on temp-file cleanup, reduce allocations and background tasks for no-op token tracking, and skip redundant log capture-state loads
1 parent b845899 commit e0b89e7

50 files changed: +2868 −1329 lines

README.md

Lines changed: 28 additions & 8 deletions
````diff
@@ -9,7 +9,7 @@ Local AI API gateway for OpenAI / Gemini / Anthropic. Runs on your machine, keep
 ---
 
 ## What you get
-- Multiple providers: `openai`, `openai-response`, `anthropic`, `gemini`, `kiro`
+- Multiple providers: `openai`, `openai-response`, `anthropic`, `gemini`, `kiro`, `codex`, `antigravity`
 - Built-in routing + optional format conversion (OpenAI Chat ⇄ Responses; Anthropic Messages ↔ OpenAI; Gemini ↔ OpenAI/Anthropic; SSE supported)
 - Per-upstream priority + two balancing strategies (fill-first / round-robin)
 - Model alias mapping (exact / prefix* / wildcard*) and response model rewrite
@@ -62,6 +62,26 @@ cargo run -p token_proxy_cli -- config init
 cargo run -p token_proxy_cli -- --config ./config.jsonc config path
 ```
 
+## Frontend tests
+```bash
+# watch mode
+pnpm test
+
+# run once (CI-friendly)
+pnpm test:run
+
+# coverage (optional)
+pnpm test:coverage
+
+# TypeScript typecheck
+pnpm exec tsc --noEmit
+```
+
+Notes:
+- Test files live in `src/**/*.test.{ts,tsx}`.
+- Global test setup (Tauri mocks + jsdom polyfills) is in `src/test/setup.ts`.
+- Vitest config is in `vitest.config.ts`.
+
 ## Configuration reference
 - File: `config.jsonc` (comments + trailing commas allowed)
 - Location:
@@ -79,33 +99,33 @@ cargo run -p token_proxy_cli -- --config ./config.jsonc config path
 | `max_request_body_bytes` | `20971520` (20 MiB) | 0 = fallback to default. Protects inbound body size. |
 | `tray_token_rate.enabled` | `true` | macOS tray live rate; harmless elsewhere. |
 | `tray_token_rate.format` | `split` | `combined` (`total`), `split` (`↑in ↓out`), `both` (`total | ↑in ↓out`). |
-| `enable_api_format_conversion` | `true` | Allow OpenAI/Anthropic/Gemini fallback via request/response body and SSE stream conversion. |
 | `upstream_strategy` | `priority_fill_first` | `priority_fill_first` (default) keeps trying the highest-priority group in list order; `priority_round_robin` rotates within each priority group. |
 
 ### Upstream entries (`upstreams[]`)
 | Field | Default | Notes |
 | --- | --- | --- |
 | `id` | required | Unique per upstream. |
-| `provider` | required | One of `openai`, `openai-response`, `anthropic`, `gemini`, `kiro`. |
-| `base_url` | required | Full base; overlapping path parts are de-duplicated. (`kiro` can be empty.) |
+| `providers` | required | One upstream can serve multiple providers. Special providers `kiro`/`codex`/`antigravity` cannot be mixed with others. |
+| `base_url` | required | Full base; overlapping path parts are de-duplicated. (`providers=["kiro"]` / `["codex"]` / `["antigravity"]` can be empty.) |
 | `api_key` | `null` | Provider-specific bearer/key; overrides request headers. |
-| `kiro_account_id` | `null` | Required when `provider=kiro`. |
-| `preferred_endpoint` | `null` | `kiro` only: `ide` or `cli`. |
+| `kiro_account_id` | `null` | Required when `providers=["kiro"]`. |
+| `preferred_endpoint` | `null` | `kiro` only (`providers=["kiro"]`): `ide` or `cli`. |
 | `proxy_url` | `null` | Per-upstream proxy; supports `http/https/socks5/socks5h`; default is **no system proxy**. `$app_proxy_url` placeholder allowed. |
 | `priority` | `0` | Higher = tried earlier. Grouped by priority then by order (or round-robin). |
 | `enabled` | `true` | Disabled upstreams are skipped. |
 | `model_mappings` | `{}` | Exact / `prefix*` / `*`. Priority: exact > longest prefix > wildcard. Response echoes original alias. |
+| `convert_from_map` | `{}` | Explicitly allow inbound format conversion per provider. Example: `{ "openai-response": ["openai_chat", "anthropic_messages"] }`. |
 | `overrides.header` | `{}` | Set/remove headers (null removes). Hop-by-hop/Host/Content-Length are always ignored. |
 
 ## Routing & format conversion
 - Gemini: `/v1beta/models/*:generateContent` and `*:streamGenerateContent` → `gemini` (SSE supported).
 - Anthropic: `/v1/messages` (and subpaths) and `/v1/complete` → `anthropic` (Kiro shares the same format).
 - OpenAI: `/v1/chat/completions` → `openai`; `/v1/responses` → `openai-response`.
 - Other paths: choose the provider with the highest configured priority; tie-break is `openai` > `openai-response` > `anthropic`.
-- If the preferred provider is missing but `enable_api_format_conversion=true`, the proxy auto-converts request/response bodies and streams between supported formats (including SSE).
+- Cross-format fallback/conversion is controlled by `upstreams[].convert_from_map` (no global switch). If a provider has no eligible upstream for the inbound format, it won't be selected.
 - If `openai` is missing for `/v1/chat/completions`: fallback can be `openai-response`, `anthropic`, or `gemini` (priority-based; tie-break prefers `openai-response`).
 - For `/v1/messages`: choose between `anthropic` and `kiro` by priority; tie-break uses upstream id. If the chosen provider returns a retryable error, the proxy will fall back to the other native provider (Anthropic ↔ Kiro) when configured.
-- If neither `anthropic` nor `kiro` exists for `/v1/messages` and `enable_api_format_conversion=true`: fallback can be `openai-response`, `openai`, or `gemini` (priority-based; tie-break prefers `openai-response`).
+- If neither `anthropic` nor `kiro` exists for `/v1/messages`: fallback can be `openai-response`, `openai`, or `gemini` when the target provider is allowed for `anthropic_messages` via `convert_from_map`.
 - If `openai-response` is missing for `/v1/responses`: fallback can be `openai`, `anthropic`, or `gemini` (priority-based; tie-break prefers `openai`).
 - If `gemini` is missing for `/v1beta/models/*:generateContent`: fallback can be `openai-response`, `openai`, or `anthropic` (priority-based; tie-break prefers `openai-response`).
````
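Under the new schema these tables describe, a migrated upstream entry in `config.jsonc` might look like this (an illustrative sketch: the id, URL, and key are placeholders, not values from this commit):

```jsonc
{
  "upstreams": [
    {
      "id": "main",                        // placeholder id
      "providers": ["openai", "openai-response"],
      "base_url": "https://api.example.com/v1",
      "api_key": "sk-placeholder",
      "priority": 10,
      // Explicitly allow this upstream's `openai-response` provider to also
      // serve requests that arrived as OpenAI Chat or Anthropic Messages:
      "convert_from_map": {
        "openai-response": ["openai_chat", "anthropic_messages"],
      },
    },
  ],
}
```

The trailing commas and comments are fine here because the file is JSONC, as noted in the configuration reference.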

README.zh-CN.md

Lines changed: 27 additions & 7 deletions
````diff
@@ -9,7 +9,7 @@
 ---
 
 ## What you get
-- Multiple providers: `openai`, `openai-response`, `anthropic`, `gemini`, `kiro`
+- Multiple providers: `openai`, `openai-response`, `anthropic`, `gemini`, `kiro`, `codex`, `antigravity`
 - Built-in routing with optional API format conversion (OpenAI Chat ⇄ Responses; Anthropic Messages ↔ OpenAI; Gemini ↔ OpenAI/Anthropic, SSE included)
 - Upstream priorities + two strategies (fill the priority group first / round-robin)
 - Model alias mapping (exact / prefix* / wildcard*); responses echo the original alias
@@ -62,6 +62,26 @@ cargo run -p token_proxy_cli -- config init
 cargo run -p token_proxy_cli -- --config ./config.jsonc config path
 ```
 
+## Frontend tests
+```bash
+# watch mode
+pnpm test
+
+# run once (CI-friendly)
+pnpm test:run
+
+# coverage (optional)
+pnpm test:coverage
+
+# TypeScript typecheck
+pnpm exec tsc --noEmit
+```
+
+Notes:
+- Test file convention: `src/**/*.test.{ts,tsx}`
+- Global test setup (Tauri mocks + jsdom polyfills): `src/test/setup.ts`
+- Vitest config: `vitest.config.ts`
+
 ## Configuration reference
 - File: `config.jsonc` (comments and trailing commas supported)
 - Location:
@@ -79,33 +99,33 @@ cargo run -p token_proxy_cli -- --config ./config.jsonc config path
 | `max_request_body_bytes` | `20971520` (20 MiB) | 0 means fall back to the default; protects inbound body size |
 | `tray_token_rate.enabled` | `true` | macOS tray live rate; harmless on other platforms |
 | `tray_token_rate.format` | `split` | `combined` (total) / `split` (↑in ↓out) / `both` (total | ↑in ↓out) |
-| `enable_api_format_conversion` | `true` | Allow automatic OpenAI/Anthropic/Gemini fallback (including request/response body conversion and SSE stream conversion) |
 | `upstream_strategy` | `priority_fill_first` | `priority_fill_first` (default) fills the highest-priority group first; `priority_round_robin` rotates within each group |
 
 ### Upstream entries (`upstreams[]`)
 | Field | Default | Notes |
 | --- | --- | --- |
 | `id` | required | Unique |
-| `provider` | required | `openai` / `openai-response` / `anthropic` / `gemini` / `kiro` |
-| `base_url` | required | Full base URL; duplicated path segments are de-duplicated (`kiro` may be empty) |
+| `providers` | required | One upstream can serve multiple providers at once. Special providers (`kiro`/`codex`/`antigravity`) cannot be mixed with others. |
+| `base_url` | required | Full base URL; duplicated path segments are de-duplicated (`providers=["kiro"]` / `["codex"]` / `["antigravity"]` may be empty) |
 | `api_key` | `null` | Key for this provider; takes precedence over request headers |
-| `kiro_account_id` | `null` | Required when `provider=kiro` |
+| `kiro_account_id` | `null` | Required when `providers=["kiro"]` |
 | `preferred_endpoint` | `null` | `kiro` only: `ide` or `cli` |
 | `proxy_url` | `null` | Independent proxy per upstream; supports `http/https/socks5/socks5h`; defaults to **not using the system proxy**; `$app_proxy_url` supported |
 | `priority` | `0` | Higher values are tried first; within a group, list order or round-robin |
 | `enabled` | `true` | Upstreams can be disabled temporarily |
 | `model_mappings` | `{}` | Exact / `prefix*` / `*`; precedence: exact > longest prefix > wildcard; responses echo the original model alias |
+| `convert_from_map` | `{}` | Explicitly declares which inbound formats may be converted to use this provider. Example: `{ "openai-response": ["openai_chat", "anthropic_messages"] }` |
 | `overrides.header` | `{}` | Set/remove headers (null means remove); hop-by-hop/Host/Content-Length are always ignored |
 
 ## Routing & format conversion
 - Gemini: `/v1beta/models/*:generateContent` and `*:streamGenerateContent` → `gemini` (SSE supported)
 - Anthropic: `/v1/messages` (including subpaths) and `/v1/complete` → `anthropic` (Kiro shares the same format)
 - OpenAI: `/v1/chat/completions` → `openai`; `/v1/responses` → `openai-response`
 - Other paths: pick the configured provider with the highest priority; ties break as `openai` > `openai-response` > `anthropic`
-- If the preferred provider is missing and `enable_api_format_conversion=true`, requests and responses are converted automatically between supported formats (including SSE streams)
+- Cross-format fallback/conversion is controlled by `upstreams[].convert_from_map` (there is no global switch anymore); a provider with no usable upstream for the inbound format will not be selected
 - If `openai` is missing for `/v1/chat/completions`: fallback to `openai-response` / `anthropic` / `gemini` (by priority; ties prefer `openai-response`)
 - For `/v1/messages`: choose between `anthropic` and `kiro` by priority; ties sort by upstream id. If the chosen provider returns a retryable error and the other native provider is configured, the proxy falls back automatically (Anthropic ↔ Kiro)
-- When `/v1/messages` has neither `anthropic` nor `kiro` and `enable_api_format_conversion=true`: fallback to `openai-response` / `openai` / `gemini` (by priority; ties prefer `openai-response`)
+- When `/v1/messages` has neither `anthropic` nor `kiro`: if the target provider allows `anthropic_messages` in `convert_from_map`, fallback to `openai-response` / `openai` / `gemini` (by priority; ties prefer `openai-response`)
 - If `openai-response` is missing for `/v1/responses`: fallback to `openai` / `anthropic` / `gemini` (by priority; ties prefer `openai`)
 - If `gemini` is missing for `/v1beta/models/*:generateContent`: fallback to `openai-response` / `openai` / `anthropic` (by priority; ties prefer `openai-response`)
````
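To make the `convert_from_map` gating concrete, here is a rough Rust sketch of the eligibility rule both READMEs describe. The types and the format names (`openai_responses` etc.) are illustrative assumptions, not the project's actual internals: an upstream may serve a provider natively, or cross-format only when `convert_from_map` explicitly allows the inbound format.

```rust
use std::collections::HashMap;

/// Hypothetical model of an upstream entry (field names mirror the config).
struct Upstream {
    providers: Vec<String>,
    /// provider -> inbound formats it may be converted from, e.g.
    /// { "openai-response": ["openai_chat", "anthropic_messages"] }
    convert_from_map: HashMap<String, Vec<String>>,
    enabled: bool,
}

/// Sketch (assumption): can this upstream serve `provider` for a request that
/// arrived in `inbound_format`? A native-format match needs no conversion
/// entry; cross-format use must be explicitly allowed in `convert_from_map`.
fn eligible(up: &Upstream, provider: &str, native_format: &str, inbound_format: &str) -> bool {
    if !up.enabled || !up.providers.iter().any(|p| p == provider) {
        return false;
    }
    if inbound_format == native_format {
        return true; // native, no conversion needed
    }
    up.convert_from_map
        .get(provider)
        .map_or(false, |formats| formats.iter().any(|f| f == inbound_format))
}
```

With this rule, a provider that has no eligible upstream for the inbound format simply drops out of the fallback candidate list, which matches the routing bullets above.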

crates/token_proxy_core/src/proxy/config/io.rs

Lines changed: 23 additions & 5 deletions
```diff
@@ -4,14 +4,23 @@ use std::time::Instant;
 use crate::paths::TokenProxyPaths;
 
 use super::ProxyConfigFile;
+use super::migrate::migrate_config_json;
 
 const DEFAULT_CONFIG_HEADER: &str = concat!(
     "// Token Proxy config (JSONC). Comments and trailing commas are supported.\n",
     "// log_level (optional): silent|error|warn|info|debug|trace. Default: silent.\n",
     "// app_proxy_url (optional): http(s)://... | socks5(h)://... (used for app updates and upstream proxy reuse).\n",
-    "// upstreams[].proxy_url (optional): empty => direct; \"$app_proxy_url\" => use app_proxy_url; or an explicit proxy URL.\n"
+    "// upstreams[].proxy_url (optional): empty => direct; \"$app_proxy_url\" => use app_proxy_url; or an explicit proxy URL.\n",
+    "// upstreams[].providers (required): one upstream can serve multiple providers. Example: [\"openai\", \"openai-response\"].\n",
+    "// upstreams[].convert_from_map (optional): explicitly allow inbound format conversion per provider.\n",
+    "// Example: { \"openai-response\": [\"openai_chat\", \"anthropic_messages\"] }\n"
 );
 
+struct ParsedConfigFile {
+    config: ProxyConfigFile,
+    migrated: bool,
+}
+
 pub(super) async fn load_config_file(paths: &TokenProxyPaths) -> Result<ProxyConfigFile, String> {
     let path = paths.config_file();
     tracing::debug!(path = %path.display(), "load_config_file start");
@@ -24,7 +33,12 @@ pub(super) async fn load_config_file(paths: &TokenProxyPaths) -> Result<ProxyCon
                 elapsed_ms = start.elapsed().as_millis(),
                 "load_config_file read"
             );
-            parse_config_file(&contents, &path)
+            let parsed = parse_config_file(&contents, &path)?;
+            if parsed.migrated {
+                tracing::info!(path = %path.display(), "config migrated, writing back");
+                save_config_file(paths, &parsed.config).await?;
+            }
+            Ok(parsed.config)
         }
         Err(err) if err.kind() == std::io::ErrorKind::NotFound => {
             tracing::debug!(
@@ -91,10 +105,14 @@ pub(super) async fn init_default_config_file(paths: &TokenProxyPaths) -> Result<
     save_config_file(paths, &ProxyConfigFile::default()).await
 }
 
-fn parse_config_file(contents: &str, path: &Path) -> Result<ProxyConfigFile, String> {
+fn parse_config_file(contents: &str, path: &Path) -> Result<ParsedConfigFile, String> {
     let sanitized = crate::jsonc::sanitize_jsonc(contents);
-    serde_json::from_str(&sanitized)
-        .map_err(|err| format!("Failed to parse config file {}: {err}", path.display()))
+    let mut value: serde_json::Value = serde_json::from_str(&sanitized)
+        .map_err(|err| format!("Failed to parse config file {}: {err}", path.display()))?;
+    let migrated = migrate_config_json(&mut value);
+    let config: ProxyConfigFile = serde_json::from_value(value)
+        .map_err(|err| format!("Failed to parse config file {}: {err}", path.display()))?;
+    Ok(ParsedConfigFile { config, migrated })
 }
 
 async fn read_existing_header(path: &Path) -> Option<String> {
```
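The migration wired in above lives in `super::migrate::migrate_config_json`, which this commit excerpt does not show. Conceptually it reduces to wrapping the legacy `provider` string into a one-element `providers` array before typed deserialization. A self-contained sketch of that idea, using a toy JSON type instead of `serde_json` (all names here are illustrative, not the project's actual code):

```rust
use std::collections::BTreeMap;

/// Toy JSON value, just enough to sketch the migration idea.
#[derive(Debug, Clone, PartialEq)]
enum Json {
    Str(String),
    Arr(Vec<Json>),
    Obj(BTreeMap<String, Json>),
}

/// Sketch (assumption): upgrade one upstream entry from the legacy `provider`
/// string to the new `providers` array. Returns true if the entry changed, so
/// the caller knows the file should be written back (as load_config_file does).
fn migrate_upstream(upstream: &mut Json) -> bool {
    let Json::Obj(obj) = upstream else { return false };
    if obj.contains_key("providers") {
        return false; // already on the new schema
    }
    match obj.remove("provider") {
        Some(Json::Str(p)) => {
            obj.insert("providers".to_string(), Json::Arr(vec![Json::Str(p)]));
            true
        }
        _ => false,
    }
}
```

Returning a "changed" flag rather than rewriting unconditionally matches the diff's behavior: the config file is only written back when `migrated` is true, so untouched configs keep their formatting and mtime.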
