</TabItem>
<TabItem value="bedrock" label="Amazon Bedrock">
**Amazon Bedrock** is a fully managed AWS service that provides access to foundation models from leading AI companies (Anthropic, Meta, Mistral, Cohere, Stability AI, Amazon, and more) through a single API.
| License | MIT | AGPL or Commercial | MIT or Commercial | AWS Service |
#### Solution 1: Bedrock Access Gateway (BAG)
**Prerequisites**
- An active AWS account
- An active AWS Access Key and Secret Key
- IAM permissions in AWS to enable Bedrock models (or already enabled models)
- Docker installed on your system

**Step 1: Configure the Bedrock Access Gateway**
The BAG is a proxy developed by AWS that wraps around the native Bedrock SDK and exposes OpenAI-compatible endpoints. Here's the endpoint mapping:
1. Under the **Admin Panel**, go to **Settings** → **Connections**.
2. Use the **+** button to add a new connection under OpenAI.
**Other Helpful Tutorials**

- [Connecting Open WebUI to AWS Bedrock](https://gauravve.medium.com/connecting-open-webui-to-aws-bedrock-a1f0082c8cb2)
- [Using Amazon Bedrock with Open WebUI for Sensitive Data](https://jrpospos.blog/posts/2024/08/using-amazon-bedrock-with-openwebui-when-working-with-sensitive-data/)

#### Solution 2: stdapi.ai
[stdapi.ai](https://stdapi.ai/) is an OpenAI-compatible API gateway you deploy in your AWS account, or run locally using Docker.
Open WebUI connects to it as if it were OpenAI, and stdapi.ai routes requests to Bedrock and other AWS AI services such as Amazon Polly and Transcribe. It also supports multi-region access to Bedrock, making it easier to reach more models that may only be available in specific AWS regions.
**Deploying on AWS**
stdapi.ai provides a full Terraform sample that provisions Open WebUI on ECS Fargate, connects it to stdapi.ai, and includes supporting services like Elasticache Valkey, Aurora PostgreSQL with vector extension, SearXNG, and Playwright.
This method handles both the stdapi.ai and Open WebUI configuration:

- [stdapi.ai Documentation - Open WebUI integration](https://stdapi.ai/use_cases_openwebui/)
- [stdapi-ai GitHub - Open WebUI Terraform sample](https://github.com/stdapi-ai/samples/tree/main/getting_started_openwebui)

stdapi.ai also provides documentation and Terraform samples to deploy it independently if you prefer to connect it to an existing Open WebUI instance.
stdapi.ai also provides a Docker image for local usage.
Here is a minimal command to run it using your AWS access key:

```bash
docker run \
  -e AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \
  -e AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY \
  -e AWS_SESSION_TOKEN=$AWS_SESSION_TOKEN \
  -e AWS_BEDROCK_REGIONS=us-east-1,us-west-2 \
  -e ENABLE_DOCS=true \
  --rm \
  -p 8000:8000 \
  ghcr.io/stdapi-ai/stdapi.ai-community:latest
```

The application is now available at http://localhost:8000 (use it as `YOUR_STDAPI_URL` in the Open WebUI configuration below).
The `AWS_BEDROCK_REGIONS` variable lets you select regions where you want to load models, in this case `us-east-1` and `us-west-2`.
If you pass the `ENABLE_DOCS=true` variable, an interactive Swagger documentation page is available at http://localhost:8000/docs.
You can also set `API_KEY=my_secret_password` to require a custom API key (by default, no API key is required). This is highly recommended if the server is reachable from elsewhere. Use this key as `YOUR_STDAPI_KEY` in the Open WebUI configuration below.
Many other configuration options are available; see [the documentation](https://stdapi.ai/operations_configuration/) for more information.
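Once the container is up, you can sanity-check the gateway by listing the models it exposes. This is a sketch assuming default settings and a standard OpenAI-compatible `/v1/models` route:

```shell
# List the models exposed by the local stdapi.ai gateway.
# If you set API_KEY, add: -H "Authorization: Bearer YOUR_STDAPI_KEY"
curl http://localhost:8000/v1/models
```

A JSON response listing model IDs confirms the gateway can reach Bedrock in your configured regions.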
**Open WebUI Configuration**
Open WebUI is configured via environment variables, and you can also set the same values from the Open WebUI admin panel.
Use the same stdapi.ai key for all `*_OPENAI_API_KEY` entries.
Core connection (chat + background tasks):
```bash
OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
OPENAI_API_KEY=YOUR_STDAPI_KEY
# Use a fast, low-cost chat model for `TASK_MODEL_EXTERNAL`.
```

If you see inconsistent auto-detection for TTS languages, set a fixed language in stdapi.ai (for example, `DEFAULT_TTS_LANGUAGE=en-US`).
#### Solution 3: AWS Bedrock Mantle
[Bedrock Mantle](https://docs.aws.amazon.com/bedrock/latest/userguide/bedrock-mantle.html) is an AWS-native solution that provides an OpenAI-compatible API endpoint for Amazon Bedrock without requiring any additional infrastructure or installation. This makes it the simplest integration option for accessing Bedrock models.
**Key Advantages**

- No installation required - Uses AWS-managed endpoints directly
- Simple configuration - Just requires an API key
- Native AWS integration - Fully managed by AWS

**Limitations**

- Chat completion only - Does not support embeddings, image generation, or other features
- Subset of models - Only provides access to a limited selection of Bedrock models (open-weight models)
- Single region - Does not support multi-region access

**Prerequisites**

- An active AWS account
- An [Amazon Bedrock API key](https://docs.aws.amazon.com/bedrock/latest/userguide/api-keys.html) (create one from the AWS console)
- IAM permissions to use Bedrock models (recommended: `AmazonBedrockMantleInferenceAccess` IAM policy)
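Connecting Open WebUI to Mantle follows the same pattern as the stdapi.ai configuration above. A minimal sketch, where `YOUR_MANTLE_ENDPOINT` is a placeholder for the region-specific endpoint URL shown in your AWS console and `YOUR_BEDROCK_API_KEY` is the Bedrock API key you created:

```shell
# Placeholder values - substitute the Mantle endpoint for your region
# and the Amazon Bedrock API key created in the AWS console.
OPENAI_API_BASE_URL=YOUR_MANTLE_ENDPOINT/v1
OPENAI_API_KEY=YOUR_BEDROCK_API_KEY
```

These can equally be set from **Admin Panel** → **Settings** → **Connections** instead of environment variables.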
You should now see all your Bedrock models available!