```
In the above configuration, `priority` for the `deepseek` provider is set to `0`. This means that if the `openai` provider is unavailable, the `ai-proxy-multi` plugin will retry the request with `deepseek` on the second attempt.
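The failover behaviour described above can be sketched as follows. This is an illustrative model of priority-based retry, not the plugin's actual internals; the function and field names are hypothetical.

```python
def pick_order(providers):
    """Return providers sorted by priority, highest first."""
    return sorted(providers, key=lambda p: p["priority"], reverse=True)

def send_with_failover(providers, send):
    """Try each provider in priority order; on failure, retry with the next one."""
    last_error = None
    for provider in pick_order(providers):
        try:
            return send(provider)
        except ConnectionError as err:
            # Provider unavailable: remember the error and fall through to the next.
            last_error = err
    raise last_error

# Mirrors the configuration above: openai has the higher priority,
# deepseek (priority 0) is the fallback.
providers = [
    {"name": "openai", "priority": 1},
    {"name": "deepseek", "priority": 0},
]

def send(provider):
    # Simulate the primary provider being unavailable.
    if provider["name"] == "openai":
        raise ConnectionError("openai unavailable")
    return f"answer from {provider['name']}"

print(send_with_failover(providers, send))  # falls back to deepseek
```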
### Send request to an OpenAI-compatible LLM
Create a route with the `ai-proxy-multi` plugin, setting `provider.name` to `openai-compatible` and the endpoint of the model in `provider.override.endpoint`, like so:
```shell
# The request body below is illustrative; replace the model, endpoint,
# and API key with the values for your own OpenAI-compatible deployment.
curl "http://127.0.0.1:9180/apisix/admin/routes" -X PUT \
  -H "X-API-KEY: ${ADMIN_API_KEY}" \
  -d '{
    "id": "ai-proxy-multi-route",
    "uri": "/anything",
    "plugins": {
      "ai-proxy-multi": {
        "providers": [
          {
            "name": "openai-compatible",
            "model": "gpt-4",
            "weight": 1,
            "auth": {
              "header": {
                "Authorization": "Bearer '"$OPENAI_COMPATIBLE_API_KEY"'"
              }
            },
            "override": {
              "endpoint": "http://your-llm-host:8000/v1/chat/completions"
            }
          }
        ]
      }
    }
  }'
```