This tutorial is a community contribution and is not supported by the Open WebUI team.
# Integrating Open WebUI with Amazon Bedrock
In this tutorial, we'll explore the most common and popular approaches to integrate Open WebUI with Amazon Bedrock.
## What is Amazon Bedrock
Direct from AWS' website:
"Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Luma, Meta, Mistral AI, poolside (coming soon), Stability AI, and Amazon through a single API, along with a broad set of capabilities you need to build generative AI applications with security, privacy, and responsible AI. Using Amazon Bedrock, you can easily experiment with and evaluate top FMs for your use case, privately customize them with your data using techniques such as fine-tuning and Retrieval Augmented Generation (RAG), and build agents that execute tasks using your enterprise systems and data sources. Since Amazon Bedrock is serverless, you don't have to manage any infrastructure, and you can securely integrate and deploy generative AI capabilities into your applications using the AWS services you are already familiar with."
To learn more about Bedrock, visit the [Amazon Bedrock official page](https://aws.amazon.com/bedrock/).
# Integration Options
There are multiple OpenAI-compatible ways to connect Open WebUI to AWS Bedrock:
- **Bedrock Access Gateway** (BAG)
- **stdapi.ai**
- **LiteLLM** with its Bedrock provider (LiteLLM is not dedicated to AWS)
- **Bedrock Mantle** - an AWS-native solution, no installation required

|         | Bedrock Access Gateway | stdapi.ai          | LiteLLM           | Bedrock Mantle |
| ------- | ---------------------- | ------------------ | ----------------- | -------------- |
| License | MIT                    | AGPL or Commercial | MIT or Commercial | AWS Service    |
# Integration Steps
## Solution 1: Bedrock Access Gateway (BAG)

### Prerequisites

In order to follow this tutorial, you must have the following:

- An active AWS account
- An active AWS Access Key and Secret Key
- IAM permissions in AWS to enable Bedrock models, or models that are already enabled
- Docker installed on your system

Before integrating with Bedrock, verify that you have access to at least one, but preferably several, of the available base models. At the time of this writing (February 2025), there were 47 base models available. You can see in the screenshot below that I have access to multiple models; you'll know you have access to a model when it shows "✅ Access Granted" next to it. If you don't have access to any models, you will get an error in the next step.

AWS provides good documentation for requesting access to and enabling these models in the [Amazon Bedrock Model Access docs](https://docs.aws.amazon.com/bedrock/latest/userguide/model-access-modify.html).


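Optionally, you can also confirm from the command line that Bedrock is reachable with your credentials by listing the foundation models in your region. This is only a sanity check and assumes the AWS CLI is installed and configured; it is not required for the rest of the tutorial:

```bash
# List the Bedrock foundation models visible to your credentials in us-east-1.
# Models still need "Access Granted" in the console before you can invoke them.
aws bedrock list-foundation-models \
  --region us-east-1 \
  --query "modelSummaries[].modelId" \
  --output table
```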
### Step 1: Configure the Bedrock Access Gateway

Now that you have access to at least one Bedrock base model, the next step is to configure the Bedrock Access Gateway, or BAG. You can think of the BAG as a kind of proxy or middleware, developed by AWS, that wraps the AWS-native endpoints/SDK for Bedrock and, in turn, exposes endpoints that are compatible with OpenAI's schema, which is what Open WebUI requires.
For reference, here is a simple mapping between the endpoints:
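The sketch below illustrates that mapping, assuming the BAG's default `/api/v1` base path and a local deployment listening on port 8000; your host, API key, and enabled model IDs will differ:

```bash
# Assumed mapping (check your BAG deployment's documentation):
#   GET  /api/v1/models            ->  bedrock:ListFoundationModels
#   POST /api/v1/chat/completions  ->  bedrock-runtime:Converse / InvokeModel
#   POST /api/v1/embeddings        ->  bedrock-runtime:InvokeModel (embedding models)

# Example chat completion against a locally running gateway. BAG_API_KEY and the
# model ID are placeholders; use the key your gateway is configured with and a
# Bedrock model you have access to.
curl http://localhost:8000/api/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $BAG_API_KEY" \
  -d '{
        "model": "anthropic.claude-3-5-sonnet-20240620-v1:0",
        "messages": [{"role": "user", "content": "Hello from Open WebUI!"}]
      }'
```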
## Solution 2: stdapi.ai
Open WebUI connects to stdapi.ai as if it were OpenAI, and stdapi.ai routes requests to Bedrock and other AWS AI services such as Amazon Polly and Transcribe. It also supports multi-region access to Bedrock, making it easier to reach more models that may only be available in specific AWS regions.
### stdapi.ai Deployment
#### Deploying on AWS
stdapi.ai provides a full Terraform sample that provisions Open WebUI on ECS Fargate, connects it to stdapi.ai, and includes supporting services like ElastiCache Valkey, Aurora PostgreSQL with the vector extension, SearXNG, and Playwright.
This method handles both the stdapi.ai and Open WebUI configuration:
- [stdapi.ai Documentation - Open WebUI integration](https://stdapi.ai/use_cases_openwebui/)
- [stdapi-ai GitHub - Open WebUI Terraform sample](https://github.com/stdapi-ai/samples/tree/main/getting_started_openwebui)
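For reference, deploying the linked sample follows the standard Terraform workflow. This is only a sketch; the sample's README describes the actual prerequisites and variables:

```bash
# Clone the samples repository and deploy the Open WebUI getting-started stack.
# Required inputs (region, credentials, domain, etc.) are documented in the sample.
git clone https://github.com/stdapi-ai/samples.git
cd samples/getting_started_openwebui
terraform init
terraform apply
```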
stdapi.ai also provides documentation and Terraform samples to deploy it independently if you prefer to connect it to an existing Open WebUI instance.
#### Deploying locally with Docker

stdapi.ai also provides a Docker image for local usage.
Here is a minimal command to run it using your AWS access key:
```bash
docker run \
  -e AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \
  -e AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY \
  -e AWS_SESSION_TOKEN=$AWS_SESSION_TOKEN \
  -e AWS_BEDROCK_REGIONS=us-east-1,us-west-2 \
  -e ENABLE_DOCS=true \
  --rm \
  -p 8000:8000 \
  ghcr.io/stdapi-ai/stdapi.ai-community:latest
```
The application is now available at http://localhost:8000 (use it as `YOUR_STDAPI_URL` in the Open WebUI configuration below).
The `AWS_BEDROCK_REGIONS` variable lets you select regions where you want to load models, in this case `us-east-1` and `us-west-2`.
If you pass the `ENABLE_DOCS=true` variable, an interactive Swagger documentation page is available at http://localhost:8000/docs.
The `API_KEY=my_secret_password` variable can also be used to set a custom API key for the application (by default, no API key is required). Setting one is highly recommended if the server is reachable from elsewhere. Use this API key as `YOUR_STDAPI_KEY` in the Open WebUI configuration below.
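As a quick sanity check of the local deployment above, you can list the models the gateway exposes through its OpenAI-compatible API (omit the `Authorization` header if you did not set `API_KEY`):

```bash
# List the Bedrock models exposed through the OpenAI-compatible /v1 API.
curl http://localhost:8000/v1/models \
  -H "Authorization: Bearer my_secret_password"
```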
Many other configuration options are available; see [the documentation](https://stdapi.ai/operations_configuration/) for more information.
### Open WebUI Configuration
Open WebUI is configured via environment variables, and you can also set the same values from the Open WebUI admin panel.
Use the same stdapi.ai key for all `*_OPENAI_API_KEY` entries.
#### Core connection (chat + background tasks)
```bash
OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
OPENAI_API_KEY=YOUR_STDAPI_KEY
TASK_MODEL_EXTERNAL=amazon.nova-micro-v1:0
```
Use a fast, low-cost chat model for `TASK_MODEL_EXTERNAL`.
#### Image editing

Pick any image-editing model that supports edits without a mask.
#### Speech to text (STT)
```bash
AUDIO_STT_ENGINE=openai
AUDIO_STT_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
AUDIO_STT_OPENAI_API_KEY=YOUR_STDAPI_KEY
AUDIO_STT_MODEL=amazon.transcribe
```
#### Text to speech (TTS)
```bash
AUDIO_TTS_ENGINE=openai
AUDIO_TTS_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
AUDIO_TTS_OPENAI_API_KEY=YOUR_STDAPI_KEY
AUDIO_TTS_MODEL=amazon.polly-neural
```
If you see inconsistent auto-detection for TTS languages, set a fixed language in stdapi.ai (for example, `DEFAULT_TTS_LANGUAGE=en-US`).
## Solution 3: AWS Bedrock Mantle
[Bedrock Mantle](https://docs.aws.amazon.com/bedrock/latest/userguide/bedrock-mantle.html) is an AWS-native solution that provides an OpenAI-compatible API endpoint for Amazon Bedrock without requiring any additional infrastructure or installation. This makes it the simplest integration option for accessing Bedrock models.
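Because Mantle exposes an OpenAI-compatible endpoint, connecting Open WebUI to it follows the same pattern as the other solutions. The values below are placeholders only; obtain the actual endpoint URL and API key from the Bedrock Mantle documentation and console linked above:

```bash
# Placeholders: substitute the OpenAI-compatible endpoint URL and API key that
# the Bedrock Mantle documentation provides for your account and region.
OPENAI_API_BASE_URL=YOUR_BEDROCK_MANTLE_ENDPOINT
OPENAI_API_KEY=YOUR_BEDROCK_API_KEY
```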