
Commit eae1eb2

Merge pull request #1046 from JGoutin/feature/bedrock
docs(AWS Bedrock): Add alternative integrations and remove outdated Bedrock models registration
2 parents d9ad1b4 + 4086ec1 commit eae1eb2

File tree

2 files changed (+180, -23 lines)

docs/getting-started/quick-start/connect-a-provider/starting-with-openai-compatible.mdx

Lines changed: 180 additions & 23 deletions
@@ -223,31 +223,40 @@ If running Open WebUI in Docker and your model server is on the host machine, re
 </TabItem>
 <TabItem value="bedrock" label="Amazon Bedrock">
 
-**Amazon Bedrock** is a fully managed AWS service that provides access to foundation models from leading AI companies (Anthropic, Meta, Mistral, Cohere, Stability AI, Amazon, and more) through a single API. Bedrock does **not** natively expose an OpenAI-compatible API, so you need to run the **Bedrock Access Gateway (BAG)** — a middleware proxy that translates OpenAI API calls to Bedrock SDK calls.
+**Amazon Bedrock** is a fully managed AWS service that provides access to foundation models from leading AI companies (Anthropic, Meta, Mistral, Cohere, Stability AI, Amazon, and more) through a single API.
 
-| Setting | Value |
-|---|---|
-| **URL** | `http://host.docker.internal:8000/api/v1` |
-| **API Key** | `bedrock` (default BAG key — change via `DEFAULT_API_KEYS` in BAG config) |
-| **Model IDs** | Auto-detected from your enabled Bedrock models |
+There are multiple OpenAI-compatible ways to connect Open WebUI to AWS Bedrock:
 
-**Prerequisites:**
-- An active AWS account
-- An active AWS Access Key and Secret Key
-- IAM permissions in AWS to enable Bedrock models (or already enabled models)
-- Docker installed on your system
+* **Bedrock Access Gateway** (BAG)
+* **stdapi.ai**
+* **LiteLLM** with its Bedrock provider (a general-purpose proxy, not dedicated to AWS; see the sketch after the comparison table)
+* **Bedrock Mantle** - an AWS-native solution, no installation required
 
-To learn more about Bedrock, visit the [Amazon Bedrock Official Page](https://aws.amazon.com/bedrock/).
+#### Feature Comparison
 
-**Step 1: Verify Access to Amazon Bedrock Base Models**
+| Capability | Bedrock Access Gateway (BAG) | stdapi.ai | LiteLLM (Bedrock provider) | AWS Bedrock Mantle |
+| --- | --- | --- | --- | --- |
+| Automatic models discovery | | | | |
+| Chat completion | | | | |
+| Embeddings | | | | |
+| Text to speech | | | | |
+| Speech to text | | | | |
+| Image generation | | | | |
+| Image editing | | | | |
+| Models from multiple regions | | | | |
+| No installation required | | | | |
+| License | MIT | AGPL or Commercial | MIT or Commercial | AWS Service |
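The LiteLLM option is not walked through further on this page. As a minimal sketch only (assuming the LiteLLM proxy CLI; the model ID and region are placeholders, not values from this page), it can expose Bedrock through an OpenAI-compatible endpoint like this:

```bash
# Hedged sketch: run a LiteLLM proxy in front of one Bedrock model.
# Placeholder model ID and region; AWS credentials are read from the
# environment (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY).
pip install 'litellm[proxy]'
export AWS_REGION_NAME=us-east-1
litellm --model bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0
# The proxy listens on http://localhost:4000 by default; use
# http://localhost:4000/v1 as the OpenAI-compatible base URL in Open WebUI.
```

See the [LiteLLM Bedrock provider documentation](https://docs.litellm.ai/docs/providers/bedrock) for full configuration options.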
 
-Before integrating, verify you have access to at least one base model. You'll know if you have access if it says "Access Granted" next to the model. If you don't have access to any models, the next steps will fail.
+#### Solution 1: Bedrock Access Gateway (BAG)
 
-AWS provides documentation for requesting model access in the [Amazon Bedrock Model Access docs](https://docs.aws.amazon.com/bedrock/latest/userguide/model-access-modify.html).
+**Prerequisites**
 
-![Amazon Bedrock Base Models](/images/tutorials/amazon-bedrock/amazon-bedrock-base-models.png)
+- An active AWS account
+- An active AWS Access Key and Secret Key
+- IAM permissions in AWS to enable Bedrock models (or already enabled models)
+- Docker installed on your system
 
-**Step 2: Configure the Bedrock Access Gateway**
+**Step 1: Configure the Bedrock Access Gateway**
 
 The BAG is a proxy developed by AWS that wraps around the native Bedrock SDK and exposes OpenAI-compatible endpoints. Here's the endpoint mapping:
 
@@ -281,7 +290,7 @@ If running Open WebUI in Docker and your model server is on the host machine, re
 
 ![Bedrock Access Gateway Swagger](/images/tutorials/amazon-bedrock/amazon-bedrock-proxy-api.png)
 
-**Step 3: Add Connection in Open WebUI**
+**Step 2: Add Connection in Open WebUI**
 
 1. Under the **Admin Panel**, go to **Settings** → **Connections**.
 2. Use the **+** button to add a new connection under OpenAI.
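For reference, a minimal environment-variable sketch of the same connection, using the default BAG values (base URL `http://host.docker.internal:8000/api/v1`, API key `bedrock`); adjust the host, port, and key to match your BAG deployment:

```bash
# Hedged sketch: Open WebUI connection to the Bedrock Access Gateway via
# environment variables, equivalent to the admin-panel steps above.
OPENAI_API_BASE_URL=http://host.docker.internal:8000/api/v1
OPENAI_API_KEY=bedrock
```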
@@ -291,15 +300,163 @@ If running Open WebUI in Docker and your model server is on the host machine, re
 
 ![Add New Connection](/images/tutorials/amazon-bedrock/amazon-bedrock-proxy-connection.png)
 
-**Step 4: Start Using Bedrock Models**
+**Other Helpful Tutorials**
+
+- [Connecting Open WebUI to AWS Bedrock](https://gauravve.medium.com/connecting-open-webui-to-aws-bedrock-a1f0082c8cb2)
+- [Using Amazon Bedrock with Open WebUI for Sensitive Data](https://jrpospos.blog/posts/2024/08/using-amazon-bedrock-with-openwebui-when-working-with-sensitive-data/)
+
+#### Solution 2: stdapi.ai
+
+[stdapi.ai](https://stdapi.ai/) is an OpenAI-compatible API gateway that you deploy in your AWS account or run locally with Docker.
+
+Open WebUI connects to it as if it were OpenAI, and stdapi.ai routes requests to Bedrock and other AWS AI services such as Amazon Polly and Transcribe. It also supports multi-region access to Bedrock, making it easier to reach models that are only available in specific AWS regions.
+
+**Deploying on AWS**
+
+stdapi.ai provides a full Terraform sample that provisions Open WebUI on ECS Fargate, connects it to stdapi.ai, and includes supporting services such as ElastiCache Valkey, Aurora PostgreSQL with the vector extension, SearXNG, and Playwright.
+This method handles both the stdapi.ai and the Open WebUI configuration:
+
+- [stdapi.ai Documentation - Open WebUI integration](https://stdapi.ai/use_cases_openwebui/)
+- [stdapi-ai GitHub - Open WebUI Terraform sample](https://github.com/stdapi-ai/samples/tree/main/getting_started_openwebui)
+
+stdapi.ai also provides documentation and Terraform samples to deploy it independently if you prefer to connect it to an existing Open WebUI instance:
+
+- [stdapi.ai Documentation - Getting started](https://stdapi.ai/operations_getting_started/)
+
+**Deploying Locally**
+
+stdapi.ai also provides a Docker image for local usage.
+
+Here is a minimal command to run it using your AWS access key:
+```bash
+docker run \
+  -e AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \
+  -e AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY \
+  -e AWS_SESSION_TOKEN=$AWS_SESSION_TOKEN \
+  -e AWS_BEDROCK_REGIONS=us-east-1,us-west-2 \
+  -e ENABLE_DOCS=true \
+  --rm \
+  -p 8000:8000 \
+  ghcr.io/stdapi-ai/stdapi.ai-community:latest
+```
+The application is now available at http://localhost:8000 (use it as `YOUR_STDAPI_URL` in the Open WebUI configuration below).
+
+The `AWS_BEDROCK_REGIONS` variable selects the regions from which models are loaded, in this case `us-east-1` and `us-west-2`.
+
+If you pass the `ENABLE_DOCS=true` variable, an interactive Swagger documentation page is available at http://localhost:8000/docs.
+
+`API_KEY=my_secret_password` can also be used to set a custom API key for the application (by default, no API key is required). This is highly recommended if the server is reachable from other machines. Use this API key as `YOUR_STDAPI_KEY` in the Open WebUI configuration below.
+
+Many other configuration options are available; see [the documentation](https://stdapi.ai/operations_configuration/) for more information.
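As a quick check, you can list the models the gateway discovered (a sketch, assuming stdapi.ai exposes the standard OpenAI-style `/v1/models` route that Open WebUI's model auto-discovery relies on):

```bash
# List the models discovered by the local stdapi.ai gateway
# (assumes the OpenAI-style /v1/models route). If you set API_KEY, add:
#   -H "Authorization: Bearer my_secret_password"
curl http://localhost:8000/v1/models
```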
+
+**Open WebUI Configuration**
+
+Open WebUI is configured via environment variables, and you can also set the same values from the Open WebUI admin panel.
+
+Use the same stdapi.ai key for all `*_OPENAI_API_KEY` entries.
+
+Core connection (chat + background tasks):
+
+```bash
+OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
+OPENAI_API_KEY=YOUR_STDAPI_KEY
+# Use a fast, low-cost chat model for `TASK_MODEL_EXTERNAL`.
+TASK_MODEL_EXTERNAL=amazon.nova-micro-v1:0
+```
+
+RAG embeddings:
+
+```bash
+RAG_EMBEDDING_ENGINE=openai
+RAG_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
+RAG_OPENAI_API_KEY=YOUR_STDAPI_KEY
+RAG_EMBEDDING_MODEL=cohere.embed-v4:0
+```
+
+Image generation:
+
+```bash
+ENABLE_IMAGE_GENERATION=true
+IMAGE_GENERATION_ENGINE=openai
+IMAGES_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
+IMAGES_OPENAI_API_KEY=YOUR_STDAPI_KEY
+IMAGE_GENERATION_MODEL=stability.stable-image-core-v1:1
+```
+
+Image editing:
+
+```bash
+ENABLE_IMAGE_EDIT=true
+IMAGE_EDIT_ENGINE=openai
+IMAGES_EDIT_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
+IMAGES_EDIT_OPENAI_API_KEY=YOUR_STDAPI_KEY
+IMAGE_EDIT_MODEL=stability.stable-image-control-structure-v1:0
+```
+
+Speech to text (STT):
+
+```bash
+AUDIO_STT_ENGINE=openai
+AUDIO_STT_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
+AUDIO_STT_OPENAI_API_KEY=YOUR_STDAPI_KEY
+AUDIO_STT_MODEL=amazon.transcribe
+```
+
+Text to speech (TTS):
+
+```bash
+AUDIO_TTS_ENGINE=openai
+AUDIO_TTS_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
+AUDIO_TTS_OPENAI_API_KEY=YOUR_STDAPI_KEY
+AUDIO_TTS_MODEL=amazon.polly-neural
+```
+
+If you see inconsistent auto-detection of TTS languages, set a fixed language in stdapi.ai (for example, `DEFAULT_TTS_LANGUAGE=en-US`).
+
+#### Solution 3: AWS Bedrock Mantle
 
-You should now see all your enabled Bedrock models available in the model selector.
+[Bedrock Mantle](https://docs.aws.amazon.com/bedrock/latest/userguide/bedrock-mantle.html) is an AWS-native solution that provides an OpenAI-compatible API endpoint for Amazon Bedrock without requiring any additional infrastructure or installation. This makes it the simplest integration option for accessing Bedrock models.
+
+**Key Advantages**
+
+- No installation required - uses AWS-managed endpoints directly
+- Simple configuration - just requires an API key
+- Native AWS integration - fully managed by AWS
+
+**Limitations**
+
+- Chat completion only - does not support embeddings, image generation, or other features
+- Subset of models - only provides access to a limited selection of Bedrock models (open-weight models)
+- Single region - does not support multi-region access
+
+**Prerequisites**
+
+- An active AWS account
+- An [Amazon Bedrock API key](https://docs.aws.amazon.com/bedrock/latest/userguide/api-keys.html) (create one from the AWS console)
+- IAM permissions to use Bedrock models (recommended: the `AmazonBedrockMantleInferenceAccess` IAM policy)
+
+**Configuration**
+
+Configure Open WebUI using environment variables:
+
+```bash
+OPENAI_API_BASE_URL=https://bedrock.us-east-1.api.aws/v1
+OPENAI_API_KEY=your_bedrock_api_key
+```
+
+Replace `your_bedrock_api_key` with the [Amazon Bedrock API key](https://docs.aws.amazon.com/bedrock/latest/userguide/api-keys.html) you created.
+
+Replace `us-east-1` in the URL with your preferred AWS region (e.g., `us-west-2` or `eu-west-1`).
+
+You can also set the same values from the Open WebUI admin panel.
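To verify the key and endpoint before wiring up Open WebUI, here is a minimal sketch of a direct chat completion call, assuming the standard OpenAI `chat/completions` route and bearer authentication; the model ID below is a placeholder, not a value from this page:

```bash
# Hedged sketch: direct chat completion against the Bedrock Mantle endpoint.
# Replace the region, API key, and model ID with values valid for your account.
curl https://bedrock.us-east-1.api.aws/v1/chat/completions \
  -H "Authorization: Bearer your_bedrock_api_key" \
  -H "Content-Type: application/json" \
  -d '{"model": "YOUR_MODEL_ID", "messages": [{"role": "user", "content": "Hello"}]}'
```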
+
+For more information, see the [Bedrock Mantle documentation](https://docs.aws.amazon.com/bedrock/latest/userguide/bedrock-mantle.html).
+
+#### Start using Bedrock Base Models
 
 ![Use Bedrock Models](/images/tutorials/amazon-bedrock/amazon-bedrock-models-in-oui.png)
 
-**Other helpful tutorials:**
-- [Connecting Open WebUI to AWS Bedrock](https://gauravve.medium.com/connecting-open-webui-to-aws-bedrock-a1f0082c8cb2)
-- [Using Amazon Bedrock with Open WebUI for Sensitive Data](https://jrpospos.blog/posts/2024/08/using-amazon-bedrock-with-openwebui-when-working-with-sensitive-data/)
+You should now see all your Bedrock models available!
 
 </TabItem>
 <TabItem value="azure" label="Azure OpenAI">
Binary file not shown.
