
Commit e8e42f9

lenoxys and Copilot authored
Docker Improvements: Streamlined Configuration and Enhanced OpenWebUI Integration (#14)
* feat: update version to 0.12.0 in Cargo.toml and bump dependencies in Cargo.lock

* refactor: move Docker files to their own directory

  This commit reorganizes the Docker setup for improved organization and maintainability:

  - Moved the Dockerfile and docker-compose files to a dedicated `docker` directory
  - Updated README.md to reference the Docker documentation in the new location
  - Simplified the main documentation to focus on non-Docker installation
  - Added custom-config.json to .gitignore
  - Streamlined setup instructions in the main README
  - Added an explicit step to download a model before running
  - Improved OpenWebUI configuration instructions for non-Docker installations

  The Docker setup now lives in its own dedicated directory with its own README, making it easier to maintain the containerized deployment separately from the main application code.

* Add configuration examples for OpenWebUI integration

  - Added a Configuration Examples section to README.md
  - Created a `config-examples` directory with sample configurations:
    - An OpenWebUI global configuration showing both secure and direct connections
    - Model configuration examples demonstrating security-proxy usage (PANW prefix)
    - Model configuration examples demonstrating direct usage (NOPAWN prefix)
  - Updated .gitignore to remove custom-config.json

  These examples help users understand how to configure OpenWebUI to work with the security proxy and demonstrate side-by-side comparisons of protected vs. unprotected model responses.

* refactor(docker): update configuration and use named volumes for Open WebUI

  - Add an explicit Compose project name, `panw-api-ollama`, to all docker-compose files
  - Replace the static Open WebUI configuration with a Docker volume:
    - Remove the hardcoded config files (config.json and config.json.example)
    - Switch from a bind mount to a named Docker volume, `open-webui`, for persistent data
    - Remove the custom-config.json mount point
  - Update environment variables for the Open WebUI container:
    - Change OLLAMA_BASE_URL to OLLAMA_BASE_URLS (plural form)
    - Add the ENABLE_OPENAI_API environment variable (defaults to 'false')
  - Add a dedicated `volumes` section to all docker-compose files

  This change improves configuration management by leveraging Open WebUI's built-in configuration capabilities rather than static config files, enhancing portability and simplifying setup across different environments.

* Update OpenWebUI configuration and documentation

  - Remove the OpenWebUI custom-configuration instructions and file setup
  - Simplify the Docker README by removing the reference to a custom OpenWebUI config
  - Update the environment variable name from OLLAMA_BASE_URLS to OLLAMA_BASE_URL in docker-compose.apple.yaml
  - Add the ENABLE_OPENAI_API environment variable to docker-compose.win.yaml and docker-compose.yaml
  - Standardize the OpenWebUI configuration across all Docker environments

  This commit streamlines the OpenWebUI integration by using environment variables instead of custom configuration files, ensuring consistent behavior across all deployment environments.

* refactor(docker): revert ollama volume configuration to use named volume

* feat(docker): add Dockerfile for building and running panw-api-ollama

* Update config-examples/PANW.llama2-uncensored_latest-1747909321539.json

* Update README.md

* Update README.md

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
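The Open WebUI container changes this message describes can be sketched as a compose fragment. This is an illustrative reconstruction from the commit text, not a file from the commit; the image tag, data path, and service wiring are assumptions:

```yaml
name: panw-api-ollama  # explicit Compose project name

services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      # Route Open WebUI through the security broker instead of Ollama directly.
      - OLLAMA_BASE_URL=http://panw-api-ollama:11435
      # The OpenAI-compatible API stays off unless explicitly enabled.
      - ENABLE_OPENAI_API=${ENABLE_OPENAI_API:-false}
    volumes:
      # Named volume replaces the old bind-mounted config files.
      - open-webui:/app/backend/data

volumes:
  open-webui:
```

Using a named volume lets Open WebUI keep its own configuration state, which is what makes the static config files removable.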
1 parent d49173e commit e8e42f9

10 files changed: +598 −297 lines

Cargo.lock

Lines changed: 108 additions & 132 deletions
Some generated files are not rendered by default.

Cargo.toml

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,6 +1,6 @@
 [package]
 name = "panw-api-ollama"
-version = "0.11.0"
+version = "0.12.0"
 edition = "2021"
 
 [dependencies]
```

README.md

Lines changed: 67 additions & 155 deletions
````diff
@@ -30,6 +30,12 @@ The best part? It's completely transparent to your existing setup - [Ollama](htt
 - **Protect adversarial input**: Safeguard AI agents from malicious inputs and outputs while maintaining workflow flexibility.
 - **Prevent sensitive data leakage**: Use API-based threat detection to block sensitive data leaks during AI interactions.
 
+## Docker Setup
+
+For Docker-based deployment, please refer to the instructions in the [Docker Setup README](docker/README.md).
+
+The Docker setup provides a complete stack with Ollama, panw-api-ollama, and OpenWebUI in a pre-configured environment.
+
 ## Quick Start
 
 ### Step 1: Install
@@ -61,174 +67,80 @@ pan_api:
 
 ### Step 4: Update OpenWebUI
 
-Change the Ollama port in OpenWebUI from 11434 to 11435 by updating your environment settings:
-[OpenWebUI Environment Configuration](https://docs.openwebui.com/getting-started/env-configuration#ollama_base_urls)
-
-### Step 5: Run
-
-```
-./target/release/panw-api-ollama
-```
-
-You're all set! You can now use OpenWebUI as normal, but with enterprise security scanning all interactions.
-
-## Docker Setup
-
-You can easily run this entire stack (Ollama, panw-api-ollama, and OpenWebUI) using Docker Compose:
-
-### Step 1: Configure your environment variables
-
-Create a `.env` file in the root directory with your configuration:
-
-```bash
-# Required for security
-SECURITY_API_KEY=your_panw_api_key_here
-SECURITY_PROFILE_NAME=your_profile_name
-
-# Optional configuration (defaults shown)
-SERVER_HOST=0.0.0.0
-SERVER_PORT=11435
-SERVER_DEBUG_LEVEL=INFO
-OLLAMA_BASE_URL=http://ollama:11434
-SECURITY_BASE_URL=https://service.api.aisecurity.paloaltonetworks.com
-SECURITY_APP_NAME=panw-api-ollama
-SECURITY_APP_USER=docker
-RUST_LOG=info
-
-# OpenWebUI and Ollama settings
-OPEN_WEBUI_PORT=3000
-OLLAMA_DOCKER_TAG=latest
-WEBUI_DOCKER_TAG=main
-```
-
-### Step 2: Start the Docker stack
+For non-Docker installations, you need to change the Ollama port in OpenWebUI from 11434 to 11435:
 
-```bash
-docker-compose up -d
-```
-
-This will start three containers:
-- **ollama**: The Ollama service on port 11434 (internal only, not exposed to host)
-- **panw-api-ollama**: The security broker service on port 11435 (internal only, not exposed to host)
-- **open-webui**: The UI running on port 3000, connected to your security broker and exposed to the host system
-
-### Platform-Specific Docker Configurations
+1. Go to Settings > Server Management in the OpenWebUI interface
+2. Add a new Ollama server with URL: `http://localhost:11435`
+3. Save your configuration
 
-The project includes optimized Docker Compose configurations for different platforms:
-
-#### Standard Configuration (All Platforms)
-```bash
-docker-compose up -d
-```
-
-#### Windows with NVIDIA GPU
-For Windows users with NVIDIA GPUs:
-```bash
-docker-compose -f docker-compose.win.yaml up -d
-```
-
-### Apple Silicon Native Installation
-
-For optimal performance on Apple Silicon Macs (M1/M2/M3/M4), using native Ollama installation is recommended:
-
-#### Step 1: Install Ollama natively
-Download and install Ollama from [ollama.com/download](https://ollama.com/download)
-
-#### Step 2: Start native Ollama
-Launch the Ollama app on your Mac or start it from terminal:
-```bash
-ollama serve
-```
-
-#### Step 3: Run Docker components with native Ollama
-Use the special Docker Compose file that connects to your native Ollama instance:
-```bash
-docker-compose -f docker-compose.apple.yaml up -d
-```
-
-This configuration:
-- Uses your natively installed Ollama with full Apple Silicon hardware acceleration
-- Runs panw-api-ollama and OpenWebUI in containers
-- Connects the containerized components to your native Ollama instance
-
-### Step 3: Access OpenWebUI
-
-Open your browser and navigate to:
-```
-http://localhost:3000
-```
-
-OpenWebUI will automatically connect to your panw-api-ollama broker, which then securely connects to Ollama.
-
-### Environment Variables
-
-You can customize your Docker deployment using these environment variables:
-
-#### Required Environment Variables:
-- `SECURITY_API_KEY`: Your Palo Alto Networks API key
-- `SECURITY_PROFILE_NAME`: Your security profile name
-
-#### Optional Environment Variables:
-- **Server Configuration**:
-  - `SERVER_HOST`: Host to bind the server to (default: 0.0.0.0)
-  - `SERVER_PORT`: Port to listen on (default: 11435)
-  - `SERVER_DEBUG_LEVEL`: Logging level: INFO, DEBUG, ERROR (default: INFO)
-
-- **Ollama Configuration**:
-  - `OLLAMA_BASE_URL`: URL to connect to Ollama (default: http://ollama:11434)
-
-- **Security Configuration**:
-  - `SECURITY_BASE_URL`: Base URL for the security API (default: https://service.api.aisecurity.paloaltonetworks.com)
-  - `SECURITY_APP_NAME`: Application name (default: panw-api-ollama)
-  - `SECURITY_APP_USER`: Application user identifier (default: docker)
-
-- **Docker Image Tags**:
-  - `OLLAMA_DOCKER_TAG`: Specify the Ollama image version (default: latest)
-  - `WEBUI_DOCKER_TAG`: Specify the OpenWebUI image version (default: main)
-
-- **Port Mappings**:
-  - `OPEN_WEBUI_PORT`: Change the port for OpenWebUI (default: 3000)
-  - `PANW_API_PORT`: Change the port for panw-api-ollama (default: 11435)
-
-- **Logging**:
-  - `RUST_LOG`: Set the logging level for panw-api-ollama (default: info)
-
-Example with custom settings:
-```bash
-OPEN_WEBUI_PORT=8080 RUST_LOG=debug SECURITY_APP_USER=custom-user docker-compose up -d
-```
-
-## GitHub Container Registry
-
-This project publishes Docker images to the GitHub Container Registry (ghcr.io), making it easy to deploy without building the image yourself.
+Alternatively, update your OpenWebUI environment settings:
+[OpenWebUI Environment Configuration](https://docs.openwebui.com/getting-started/env-configuration#ollama_base_urls)
 
-### Using the Pre-built Image
+### Step 5: Download a model
 
-You can use the pre-built Docker image from GitHub Container Registry in your docker-compose.yaml:
+Before using the service, make sure you have a model available:
 
 ```bash
-# Pull and run using the latest image
-docker-compose up -d
+ollama pull llama2-uncensored:latest
 ```
 
-By default, docker-compose will use the latest image from `ghcr.io/paloaltonetworks/panw-api-ollama`. You can specify a different version tag using the `PANW_API_IMAGE` environment variable:
+### Step 6: Run
 
 ```bash
-# Use a specific version
-PANW_API_IMAGE=ghcr.io/paloaltonetworks/panw-api-ollama:v0.9.0 docker-compose up -d
-
-# Or build from local source instead of using the registry
-PANW_API_IMAGE='' docker-compose up -d
+./target/release/panw-api-ollama
 ```
 
-### Container Image Release Tags
-
-The following tags are available for the Docker image:
+You're all set! You can now use OpenWebUI as normal, but with enterprise security scanning all interactions.
 
-- `latest`: Points to the most recent release
-- `vX.Y.Z`: Specific version (e.g., `v0.9.0`)
-- `vX.Y`: Minor version release (e.g., `v0.9`)
-- `vX`: Major version release (e.g., `v0`)
+## Configuration Examples
+
+The project includes example configuration files in the `config-examples` directory that demonstrate different setup options:
+
+### OpenWebUI Global Configuration
+
+The `config-1747909231428.json` file shows how to set up OpenWebUI with both secured and unsecured Ollama connections:
+
+```json
+{
+  "ollama": {
+    "enable": true,
+    "base_urls": [
+      "http://panw-api-ollama:11435", // Secure connection through panw-api-ollama
+      "http://host.docker.internal:11434" // Direct connection to Ollama
+    ],
+    "api_configs": {
+      "0": {
+        "enable": true,
+        "tags": [],
+        "prefix_id": "PANW", // Models with this prefix use the security proxy
+        "model_ids": [
+          "llama2-uncensored:latest"
+        ],
+        "key": ""
+      },
+      "1": {
+        "enable": true,
+        "tags": [],
+        "prefix_id": "NOPAWN", // Models with this prefix bypass the security proxy
+        "model_ids": [
+          "nomic-embed-text:latest",
+          "llama2-uncensored:latest"
+        ],
+        "key": ""
+      }
+    }
+  }
+}
+```
+
+### Model Configurations
+
+Two example model configurations are included to demonstrate before/after comparisons:
+
+1. `PANW.llama2-uncensored_latest-1747909321539.json` - A model using the security proxy
+2. `NOPAWN.llama2-uncensored_latest-1747909327080.json` - The same model bypassing the security proxy
+
+These configurations allow you to perform side-by-side comparisons and demonstrations of how the Palo Alto Networks AI Runtime Security affects the model responses.
 
 ## Resources
````
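The PANW/NOPAWN prefix scheme in the diff above can be modeled in a few lines. This is a sketch of the routing behavior the config implies, not OpenWebUI's actual implementation; `resolve` is a hypothetical helper:

```python
# Sketch of how OpenWebUI's prefix_id settings map a prefixed model id to
# one of the configured Ollama base URLs (assumed behavior, for illustration).
BASE_URLS = [
    "http://panw-api-ollama:11435",       # "0": secured, via the proxy
    "http://host.docker.internal:11434",  # "1": direct to Ollama
]

API_CONFIGS = {
    "0": {"prefix_id": "PANW", "model_ids": ["llama2-uncensored:latest"]},
    "1": {"prefix_id": "NOPAWN", "model_ids": ["nomic-embed-text:latest",
                                               "llama2-uncensored:latest"]},
}

def resolve(model_id: str) -> tuple[str, str]:
    """Split 'PREFIX.model' and return (base_url, bare_model_id)."""
    prefix, _, bare = model_id.partition(".")
    for index, cfg in API_CONFIGS.items():
        if cfg["prefix_id"] == prefix and bare in cfg["model_ids"]:
            return BASE_URLS[int(index)], bare
    raise ValueError(f"no connection configured for {model_id!r}")

# The same underlying model, reached two ways:
print(resolve("PANW.llama2-uncensored:latest")[0])    # secured endpoint
print(resolve("NOPAWN.llama2-uncensored:latest")[0])  # direct endpoint
```

Sending the same prompt to both prefixed ids is what enables the before/after comparison the example configs are built for.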

config-examples/NOPAWN.llama2-uncensored_latest-1747909327080.json

Lines changed: 54 additions & 0 deletions
```diff
@@ -0,0 +1,54 @@
+[
+  {
+    "id": "NOPAWN.llama2-uncensored:latest",
+    "name": "NOPAWN llama2-uncensored",
+    "object": "model",
+    "created": 1747909318,
+    "owned_by": "ollama",
+    "ollama": {
+      "name": "llama2-uncensored:latest",
+      "model": "NOPAWN.llama2-uncensored:latest",
+      "modified_at": "2025-04-05T17:58:58.898718533+02:00",
+      "size": 3825819449,
+      "digest": "44040b9222331f7eacd27ec9254e42de585af28d2c5d1211cdaeb3ffa361fe3f",
+      "details": {
+        "parent_model": "",
+        "format": "gguf",
+        "family": "llama",
+        "families": null,
+        "parameter_size": "7B",
+        "quantization_level": "Q4_0"
+      },
+      "urls": [
+        1
+      ]
+    },
+    "tags": [],
+    "user_id": "6ee7d451-39ea-4255-b027-f445b5863604",
+    "base_model_id": null,
+    "params": {},
+    "meta": {
+      "profile_image_url": "/static/favicon.png",
+      "description": null,
+      "capabilities": {
+        "vision": true,
+        "citations": true
+      },
+      "suggestion_prompts": null,
+      "tags": []
+    },
+    "access_control": {
+      "read": {
+        "group_ids": [],
+        "user_ids": []
+      },
+      "write": {
+        "group_ids": [],
+        "user_ids": []
+      }
+    },
+    "is_active": true,
+    "updated_at": 1747909093,
+    "created_at": 1747909093
+  }
+]
```

config-examples/PANW.llama2-uncensored_latest-1747909321539.json

Lines changed: 54 additions & 0 deletions
Large diffs are not rendered by default.
config-examples/config-1747909231428.json

Lines changed: 46 additions & 0 deletions
```diff
@@ -0,0 +1,46 @@
+{
+  "version": 0,
+  "ui": {
+    "enable_signup": false
+  },
+  "openai": {
+    "enable": false,
+    "api_base_urls": [
+      "https://api.openai.com/v1"
+    ],
+    "api_keys": [
+      ""
+    ],
+    "api_configs": {
+      "0": {}
+    }
+  },
+  "ollama": {
+    "enable": true,
+    "base_urls": [
+      "http://panw-api-ollama:11435",
+      "http://host.docker.internal:11434"
+    ],
+    "api_configs": {
+      "0": {
+        "enable": true,
+        "tags": [],
+        "prefix_id": "PANW",
+        "model_ids": [
+          "llama2-uncensored:latest"
+        ],
+        "key": ""
+      },
+      "1": {
+        "enable": true,
+        "tags": [],
+        "prefix_id": "NOPAWN",
+        "model_ids": [
+          "nomic-embed-text:latest",
+          "llama2-uncensored:latest"
+        ],
+        "key": ""
+      }
+    }
+  }
+}
```
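As a sanity check, the invariants this global config relies on (Ollama enabled, OpenAI API off, one `api_configs` entry per base URL, distinct routing prefixes) can be asserted with a short script. Illustrative only; the trimmed config below copies just the fields being checked:

```python
import json

# Trimmed copy of the example OpenWebUI global config (only fields checked here).
CONFIG = json.loads("""
{
  "openai": {"enable": false},
  "ollama": {
    "enable": true,
    "base_urls": ["http://panw-api-ollama:11435",
                  "http://host.docker.internal:11434"],
    "api_configs": {
      "0": {"prefix_id": "PANW"},
      "1": {"prefix_id": "NOPAWN"}
    }
  }
}
""")

ollama = CONFIG["ollama"]
assert ollama["enable"] and not CONFIG["openai"]["enable"]
# One api_config entry per base URL...
assert len(ollama["api_configs"]) == len(ollama["base_urls"])
# ...each with a unique routing prefix.
prefixes = [c["prefix_id"] for c in ollama["api_configs"].values()]
assert len(set(prefixes)) == len(prefixes), "prefixes must be distinct"
print("config OK")
```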
