---
sidebar_position: 31
title: "Integrate with Amazon Bedrock"
---

:::warning

This tutorial is a community contribution and is not supported by the Open WebUI team. It serves only as a demonstration of how to customize Open WebUI for your specific use case. Want to contribute? Check out the contributing tutorial.

:::

---

# Integrating Open WebUI with Amazon Bedrock

In this tutorial, we'll explore the most common and popular approaches to integrating Open WebUI with Amazon Bedrock.

## What is Amazon Bedrock

Direct from AWS' website:

"Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Luma, Meta, Mistral AI, poolside (coming soon), Stability AI, and Amazon through a single API, along with a broad set of capabilities you need to build generative AI applications with security, privacy, and responsible AI. Using Amazon Bedrock, you can easily experiment with and evaluate top FMs for your use case, privately customize them with your data using techniques such as fine-tuning and Retrieval Augmented Generation (RAG), and build agents that execute tasks using your enterprise systems and data sources. Since Amazon Bedrock is serverless, you don't have to manage any infrastructure, and you can securely integrate and deploy generative AI capabilities into your applications using the AWS services you are already familiar with."

To learn more about Bedrock, visit the [Amazon Bedrock official page](https://aws.amazon.com/bedrock/).

# Integration Options

There are multiple OpenAI-compatible ways to connect Open WebUI to AWS Bedrock:

* **Bedrock Access Gateway** (BAG)
* **stdapi.ai**
* **LiteLLM** with its Bedrock provider (LiteLLM is a general-purpose LLM gateway, not dedicated to AWS)
* **Bedrock Mantle** - AWS-native solution, no installation required

## Feature Comparison

| Capability | Bedrock Access Gateway (BAG) | stdapi.ai | LiteLLM (Bedrock provider) | AWS Bedrock Mantle |
|------------------------------| --- | --- | --- | --- |
| Automatic model discovery | ✅ | ✅ | — | ✅ |
| Chat completion | ✅ | ✅ | ✅ | ✅ |
| Embeddings | ✅ | ✅ | ✅ | — |
| Text to speech | — | ✅ | — | — |
| Speech to text | — | ✅ | — | — |
| Image generation | — | ✅ | ✅ | — |
| Image editing | — | ✅ | — | — |
| Models from multiple regions | — | ✅ | ✅ | — |
| No installation required | — | — | — | ✅ |
| License | MIT | AGPL or Commercial | MIT or Commercial | AWS Service |

# Integration Steps

## Solution 1: Bedrock Access Gateway (BAG)

### Prerequisites

To follow this tutorial, you need the following:

- An active AWS account
- An active AWS Access Key and Secret Key
- IAM permissions in AWS to enable Bedrock models, or models that are already enabled (see the CLI check after this list)
- Docker installed on your system
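
If you want to confirm which Bedrock foundation models your credentials can see before going further, a quick AWS CLI check (assuming the AWS CLI is installed and configured with the same access key) looks like this:

```bash
# List the foundation models visible to your credentials in a given region.
# Models still need to be enabled in the Bedrock console before they can be invoked.
aws bedrock list-foundation-models \
  --region us-east-1 \
  --query 'modelSummaries[].modelId' \
  --output table
```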

### Step 1: Configure the Bedrock Access Gateway

First, we need to configure the Bedrock Access Gateway (BAG). You can think of the BAG as a kind of proxy or middleware, developed by AWS, that wraps the native AWS endpoints/SDK for Bedrock and exposes endpoints compatible with OpenAI's schema, which is what Open WebUI requires.

For reference, here is a simple mapping between the endpoints:

| OpenAI Endpoint | Bedrock Method |
|-----------------------|------------------------|
| `/models` | `list_inference_profiles` |
| `/models/{model_id}` | `list_inference_profiles` |
| `/chat/completions` | `converse` or `converse_stream` |
| `/embeddings` | `invoke_model` |

The BAG repo can be found at the [Bedrock Access Gateway Repo](https://github.com/aws-samples/bedrock-access-gateway).

To set up the BAG, follow the steps below (a shell sketch of the same steps follows the list):

- Clone the BAG repo
- Remove the default `Dockerfile`
- Rename `Dockerfile_ecs` to `Dockerfile`
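
A minimal shell sketch of those steps (assuming the Dockerfiles live in the repo's `src/` directory; adjust the path if the repo layout differs):

```bash
# Clone the BAG repo and swap in the ECS Dockerfile as the default one.
git clone https://github.com/aws-samples/bedrock-access-gateway.git
cd bedrock-access-gateway/src   # directory containing the Dockerfiles (verify in the repo)
rm Dockerfile                   # remove the default Dockerfile
mv Dockerfile_ecs Dockerfile    # rename Dockerfile_ecs so the build below picks it up
```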

We're now ready to build and launch the docker container using:

```bash
docker build . -f Dockerfile -t bedrock-gateway

docker run \
  -e AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \
  -e AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY \
  -e AWS_SESSION_TOKEN=$AWS_SESSION_TOKEN \
  -e AWS_REGION=us-east-1 \
  -d -p 8000:80 \
  bedrock-gateway
```

You should now be able to access the BAG's Swagger page at http://localhost:8000/docs.
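
As a quick sanity check that the OpenAI-compatible API is responding, you can list the exposed models with curl (this assumes the container above is running on port 8000 and the default `bedrock` API key; see Step 2 below for where that key comes from):

```bash
# List the models the gateway exposes via its OpenAI-compatible /models endpoint.
curl -s http://localhost:8000/api/v1/models \
  -H "Authorization: Bearer bedrock"
```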

:::warning Troubleshooting: Container Exits Immediately

If the Bedrock Gateway container starts and immediately exits (especially on Windows), check the logs with `docker logs <container_id>`. If you see Python/Uvicorn errors, this is likely a **Python 3.13 compatibility issue** with the BAG's Dockerfile.

**Workaround:** Edit the `Dockerfile` before building and change the Python version from 3.13 to 3.12:

```dockerfile
# Change this line:
FROM python:3.13-slim
# To:
FROM python:3.12-slim
```

Then rebuild with `docker build . -f Dockerfile -t bedrock-gateway`.

:::

### Step 2: Add Connection in Open WebUI

Now that you have the BAG up and running, it's time to add it as a new connection in Open WebUI (an environment-variable equivalent is sketched after these steps):

- Under the Admin Panel, go to Settings -> Connections.
- Use the "+" (plus) button to add a new connection under the OpenAI section.
- For the URL, use `http://host.docker.internal:8000/api/v1`.
- For the key, use the default API key defined in BAG, which is `bedrock`. You can always change this key in the BAG settings (see `DEFAULT_API_KEYS`).
- Click the "Verify Connection" button and you should see a "Server connection verified" alert in the top-right.
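
If you prefer configuring the connection through environment variables rather than the admin panel (Open WebUI supports both, as shown in the stdapi.ai section below), the equivalent settings would look like this:

```bash
# Same connection as the UI steps above, expressed as Open WebUI environment variables.
# The key is the BAG default ("bedrock"); change it via DEFAULT_API_KEYS in the BAG settings.
OPENAI_API_BASE_URL=http://host.docker.internal:8000/api/v1
OPENAI_API_KEY=bedrock
```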

### Other Helpful Tutorials

Here are a few other helpful tutorials for integrating Open WebUI with Amazon Bedrock using the Bedrock Access Gateway:

- https://gauravve.medium.com/connecting-open-webui-to-aws-bedrock-a1f0082c8cb2
- https://jrpospos.blog/posts/2024/08/using-amazon-bedrock-with-openwebui-when-working-with-sensitive-data/

## Solution 2: stdapi.ai

[stdapi.ai](https://stdapi.ai/) is an OpenAI-compatible API gateway you deploy in your AWS account, or run locally using Docker.

Open WebUI connects to it as if it were OpenAI, and stdapi.ai routes requests to Bedrock and other AWS AI services such as Amazon Polly and Transcribe. It also supports multi-region access to Bedrock, making it easier to reach more models that may only be available in specific AWS regions.

### stdapi.ai Deployment

#### Deploying on AWS

stdapi.ai provides a full Terraform sample that provisions Open WebUI on ECS Fargate, connects it to stdapi.ai, and includes supporting services such as ElastiCache Valkey, Aurora PostgreSQL with the vector extension, SearXNG, and Playwright.
This method handles both the stdapi.ai and Open WebUI configuration:

- [stdapi.ai Documentation - Open WebUI integration](https://stdapi.ai/use_cases_openwebui/)
- [stdapi-ai GitHub - Open WebUI Terraform sample](https://github.com/stdapi-ai/samples/tree/main/getting_started_openwebui)

stdapi.ai also provides documentation and Terraform samples to deploy it independently if you prefer to connect it to an existing Open WebUI instance.

- [stdapi.ai Documentation - Getting started](https://stdapi.ai/operations_getting_started/)

#### Deploying Locally

stdapi.ai also provides a Docker image for local usage.

Here is a minimal command to run it using your AWS access key:
```bash
docker run \
  -e AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \
  -e AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY \
  -e AWS_SESSION_TOKEN=$AWS_SESSION_TOKEN \
  -e AWS_BEDROCK_REGIONS=us-east-1,us-west-2 \
  -e ENABLE_DOCS=true \
  --rm \
  -p 8000:8000 \
  ghcr.io/stdapi-ai/stdapi.ai-community:latest
```
The application is now available at http://localhost:8000 (use it as `YOUR_STDAPI_URL` in the Open WebUI configuration below).

The `AWS_BEDROCK_REGIONS` variable lets you select regions where you want to load models, in this case `us-east-1` and `us-west-2`.

If you pass the `ENABLE_DOCS=true` variable, an interactive Swagger documentation page is available at http://localhost:8000/docs.

`API_KEY=my_secret_password` can also be used to set a custom API key for the application (defaults to no API key required). This is highly recommended if the server is reachable from elsewhere. Use this API key as `YOUR_STDAPI_KEY` in the Open WebUI configuration below.

Many other configuration options are available; see [the documentation](https://stdapi.ai/operations_configuration/) for more information.
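
Once the container is running, a quick smoke test (assuming no `API_KEY` was set) is to list the models it discovered from Bedrock:

```bash
# List the models stdapi.ai exposes through its OpenAI-compatible endpoint.
curl -s http://localhost:8000/v1/models
```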

### Open WebUI Configuration

Open WebUI is configured via environment variables, and you can also set the same values from the Open WebUI admin panel.

Use the same stdapi.ai key for all `*_OPENAI_API_KEY` entries.

#### Core connection (chat + background tasks)

```bash
OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
OPENAI_API_KEY=YOUR_STDAPI_KEY
TASK_MODEL_EXTERNAL=amazon.nova-micro-v1:0
```

Use a fast, low-cost chat model for `TASK_MODEL_EXTERNAL`.

#### RAG embeddings

```bash
RAG_EMBEDDING_ENGINE=openai
RAG_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
RAG_OPENAI_API_KEY=YOUR_STDAPI_KEY
RAG_EMBEDDING_MODEL=cohere.embed-v4:0
```

Pick any embedding model you prefer.

#### Image generation

```bash
ENABLE_IMAGE_GENERATION=true
IMAGE_GENERATION_ENGINE=openai
IMAGES_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
IMAGES_OPENAI_API_KEY=YOUR_STDAPI_KEY
IMAGE_GENERATION_MODEL=stability.stable-image-core-v1:1
```

Choose any image generation model you prefer.

#### Image editing

```bash
ENABLE_IMAGE_EDIT=true
IMAGE_EDIT_ENGINE=openai
IMAGES_EDIT_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
IMAGES_EDIT_OPENAI_API_KEY=YOUR_STDAPI_KEY
IMAGE_EDIT_MODEL=stability.stable-image-control-structure-v1:0
```

Pick any image-editing model that supports edits without a mask.

#### Speech to text (STT)

```bash
AUDIO_STT_ENGINE=openai
AUDIO_STT_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
AUDIO_STT_OPENAI_API_KEY=YOUR_STDAPI_KEY
AUDIO_STT_MODEL=amazon.transcribe
```

#### Text to speech (TTS)

```bash
AUDIO_TTS_ENGINE=openai
AUDIO_TTS_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
AUDIO_TTS_OPENAI_API_KEY=YOUR_STDAPI_KEY
AUDIO_TTS_MODEL=amazon.polly-neural
```

If you see inconsistent auto-detection for TTS languages, set a fixed language in stdapi.ai (for example, `DEFAULT_TTS_LANGUAGE=en-US`).

## Solution 3: AWS Bedrock Mantle

[Bedrock Mantle](https://docs.aws.amazon.com/bedrock/latest/userguide/bedrock-mantle.html) is an AWS-native solution that provides an OpenAI-compatible API endpoint for Amazon Bedrock without requiring any additional infrastructure or installation. This makes it the simplest integration option for accessing Bedrock models.

### Key Advantages

- **No installation required** - Uses AWS-managed endpoints directly
- **Simple configuration** - Just requires an API key
- **Native AWS integration** - Fully managed by AWS

### Limitations

- **Chat completion only** - Does not support embeddings, image generation, or other features
- **Subset of models** - Only provides access to a limited selection of Bedrock models (open-weight models)
- **Single region** - Does not support multi-region access

### Prerequisites

- An active AWS account
- An [Amazon Bedrock API key](https://docs.aws.amazon.com/bedrock/latest/userguide/api-keys.html) (create one from the AWS console)
- IAM permissions to use Bedrock models (recommended: `AmazonBedrockMantleInferenceAccess` IAM policy)

### Configuration

Configure Open WebUI using environment variables:

```bash
OPENAI_API_BASE_URL=https://bedrock.us-east-1.api.aws/v1
OPENAI_API_KEY=your_bedrock_api_key
```

Replace `your_bedrock_api_key` with the [Amazon Bedrock API key](https://docs.aws.amazon.com/bedrock/latest/userguide/api-keys.html) you created.

Replace `us-east-1` in the URL with your preferred AWS region (e.g., `us-west-2`, `eu-west-1`, etc.).
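
Before wiring the endpoint into Open WebUI, you can optionally check it from the command line. This sketch assumes the endpoint follows the usual OpenAI-compatible convention of listing models at `/v1/models` with bearer-token authentication; substitute your own region and key:

```bash
# Hypothetical sanity check against the Bedrock OpenAI-compatible endpoint.
curl -s https://bedrock.us-east-1.api.aws/v1/models \
  -H "Authorization: Bearer $YOUR_BEDROCK_API_KEY"
```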

You can also set the same values from the Open WebUI admin panel.

For more information, see the [Bedrock Mantle documentation](https://docs.aws.amazon.com/bedrock/latest/userguide/bedrock-mantle.html).

# Start using Bedrock Base Models

You should now see all your Bedrock models available!