Commit 0cd322a

Merge pull request #258 from codelion/feat-ssl-verification

Add configurable SSL certificate verification support

2 parents: 6956fd1 + f82443b

File tree: 10 files changed, +561 −17 lines

.github/workflows/publish-docker-offline-amd64.yml (8 additions, 0 deletions)

```diff
@@ -13,6 +13,14 @@ jobs:
     steps:
       - uses: actions/checkout@v4
 
+      - name: Free up disk space
+        run: |
+          sudo rm -rf /usr/share/dotnet
+          sudo rm -rf /usr/local/lib/android
+          sudo rm -rf /opt/ghc
+          sudo rm -rf /opt/hostedtoolcache/CodeQL
+          docker system prune -af
+
       - name: Set up Docker Buildx
         uses: docker/setup-buildx-action@v3
```

.github/workflows/publish-docker-offline-arm64.yml (8 additions, 0 deletions)

```diff
@@ -13,6 +13,14 @@ jobs:
     steps:
       - uses: actions/checkout@v4
 
+      - name: Free up disk space
+        run: |
+          sudo rm -rf /usr/share/dotnet
+          sudo rm -rf /usr/local/lib/android
+          sudo rm -rf /opt/ghc
+          sudo rm -rf /opt/hostedtoolcache/CodeQL
+          docker system prune -af
+
       - name: Set up QEMU
         uses: docker/setup-qemu-action@v3
```

README.md (35 additions, 1 deletion)

````diff
@@ -102,10 +102,18 @@ docker run -p 8000:8000 ghcr.io/codelion/optillm:latest
 2024-10-22 07:45:06,293 - INFO - Starting server with approach: auto
 ```
 
-To use optillm without local inference and only as a proxy you can add the `-proxy` suffix.
+**Available Docker image variants:**
+
+- **Full image** (`latest`): Includes all dependencies for local inference and plugins
+- **Proxy-only** (`latest-proxy`): Lightweight image without local inference capabilities
+- **Offline** (`latest-offline`): Self-contained image with pre-downloaded models (spaCy) for fully offline operation
 
 ```bash
+# Proxy-only (smallest)
 docker pull ghcr.io/codelion/optillm:latest-proxy
+
+# Offline (largest, includes pre-downloaded models)
+docker pull ghcr.io/codelion/optillm:latest-offline
 ```
 
 ### Install from source
@@ -120,6 +128,32 @@ source .venv/bin/activate
 pip install -r requirements.txt
 ```
 
+## 🔒 SSL Configuration
+
+OptILLM supports SSL certificate verification configuration for working with self-signed certificates or corporate proxies.
+
+**Disable SSL verification (development only):**
+```bash
+# Command line
+optillm --no-ssl-verify
+
+# Environment variable
+export OPTILLM_SSL_VERIFY=false
+optillm
+```
+
+**Use custom CA certificate:**
+```bash
+# Command line
+optillm --ssl-cert-path /path/to/ca-bundle.crt
+
+# Environment variable
+export OPTILLM_SSL_CERT_PATH=/path/to/ca-bundle.crt
+optillm
+```
+
+⚠️ **Security Note**: Disabling SSL verification is insecure and should only be used in development. For production environments with custom CAs, use `--ssl-cert-path` instead. See [SSL_CONFIGURATION.md](SSL_CONFIGURATION.md) for details.
+
 ## Implemented techniques
 
 | Approach | Slug | Description |
````

SSL_CONFIGURATION.md (new file, 88 additions, 0 deletions)

````markdown
# SSL Certificate Configuration

OptILLM now supports SSL certificate verification configuration to work with self-signed certificates or corporate proxies.

## Usage

### Disable SSL Verification (Development Only)

**⚠️ WARNING: Only use this in development environments. Disabling SSL verification is insecure.**

#### Via Command Line
```bash
python optillm.py --no-ssl-verify
```

#### Via Environment Variable
```bash
export OPTILLM_SSL_VERIFY=false
python optillm.py
```

### Use Custom CA Certificate Bundle

For corporate environments with custom Certificate Authorities:

#### Via Command Line
```bash
python optillm.py --ssl-cert-path /path/to/ca-bundle.crt
```

#### Via Environment Variable
```bash
export OPTILLM_SSL_CERT_PATH=/path/to/ca-bundle.crt
python optillm.py
```

## Configuration Options

| Option | Environment Variable | Default | Description |
|--------|---------------------|---------|-------------|
| `--ssl-verify` / `--no-ssl-verify` | `OPTILLM_SSL_VERIFY` | `true` | Enable/disable SSL certificate verification |
| `--ssl-cert-path` | `OPTILLM_SSL_CERT_PATH` | `""` | Path to custom CA certificate bundle |

## Affected Components

SSL configuration applies to:
- **OpenAI API clients** (OpenAI, Azure, Cerebras)
- **HTTP plugins** (readurls, deep_research)
- **All external HTTPS connections**

## Examples

### Development with Self-Signed Certificate
```bash
# Disable SSL verification temporarily
python optillm.py --no-ssl-verify --base-url https://localhost:8443/v1
```

### Production with Corporate CA
```bash
# Use corporate certificate bundle
python optillm.py --ssl-cert-path /etc/ssl/certs/corporate-ca-bundle.crt
```

### Docker Environment
```bash
docker run -e OPTILLM_SSL_VERIFY=false optillm
```

## Security Notes

1. **Never disable SSL verification in production** - This makes your application vulnerable to man-in-the-middle attacks
2. **Use custom CA bundles instead** - For corporate environments, provide the proper CA certificate path
3. **Warning messages** - When SSL verification is disabled, OptILLM will log a warning message for security awareness

## Testing

Run the SSL configuration test suite:
```bash
python -m unittest tests.test_ssl_config -v
```

This validates:
- CLI argument parsing
- Environment variable configuration
- HTTP client SSL settings
- Plugin SSL propagation
- Warning messages
````
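The environment-variable handling documented in the table above can be sketched as a small resolver. `resolve_ssl_config` is a hypothetical helper (not part of this commit) that mirrors the documented defaults and the truthy-string parsing the server uses:

```python
import os

def resolve_ssl_config(env=None):
    """Hypothetical sketch: resolve SSL settings from the environment,
    mirroring the documented defaults (verification on, no CA bundle)."""
    env = os.environ if env is None else env
    # The server accepts the usual truthy spellings for OPTILLM_SSL_VERIFY
    ssl_verify = env.get("OPTILLM_SSL_VERIFY", "true").lower() in ("true", "1", "yes")
    ssl_cert_path = env.get("OPTILLM_SSL_CERT_PATH", "")
    return {"ssl_verify": ssl_verify, "ssl_cert_path": ssl_cert_path}

print(resolve_ssl_config({}))
print(resolve_ssl_config({"OPTILLM_SSL_VERIFY": "false"}))
```

Any unrecognized value (including an empty string) falls back to verification disabled only when it is an explicit falsy spelling; everything else other than `true`/`1`/`yes` also parses as `False`, which is why the documented default is the string `true`.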

optillm/__init__.py (1 addition, 1 deletion)

```diff
@@ -1,5 +1,5 @@
 # Version information
-__version__ = "0.3.1"
+__version__ = "0.3.2"
 
 # Import from server module
 from .server import (
```

optillm/plugins/deep_research_plugin.py (7 additions, 2 deletions)

```diff
@@ -66,6 +66,9 @@ def create(self, **kwargs):
             )
         else:
             # OpenAI or AzureOpenAI
+            # Get existing http_client to preserve SSL settings
+            existing_http_client = getattr(self.parent.client, '_client', None)
+
             if 'Azure' in self.parent.client.__class__.__name__:
                 from openai import AzureOpenAI
                 # AzureOpenAI has different parameters
@@ -75,15 +78,17 @@ def create(self, **kwargs):
                     azure_endpoint=getattr(self.parent.client, 'azure_endpoint', None),
                     azure_ad_token_provider=getattr(self.parent.client, 'azure_ad_token_provider', None),
                     timeout=self.parent.timeout,
-                    max_retries=self.parent.max_retries
+                    max_retries=self.parent.max_retries,
+                    http_client=existing_http_client
                 )
             else:
                 from openai import OpenAI
                 custom_client = OpenAI(
                     api_key=self.parent.client.api_key,
                     base_url=getattr(self.parent.client, 'base_url', None),
                     timeout=self.parent.timeout,
-                    max_retries=self.parent.max_retries
+                    max_retries=self.parent.max_retries,
+                    http_client=existing_http_client
                 )
         return custom_client.chat.completions.create(**kwargs)
     except Exception as e:
```
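The change above reuses the parent client's underlying HTTP transport so SSL settings survive when the plugin rebuilds a client. A minimal sketch of the pattern, using stand-in classes rather than the real OpenAI SDK (the real SDK stores its transport on a private `_client` attribute, which is why the diff reads it defensively with `getattr`):

```python
class FakeHTTPClient:
    """Stand-in for an httpx.Client carrying SSL settings."""
    def __init__(self, verify):
        self.verify = verify

class FakeSDKClient:
    """Stand-in for an SDK client that stores its transport on `_client`."""
    def __init__(self, http_client=None):
        self._client = http_client

parent = FakeSDKClient(http_client=FakeHTTPClient(verify="/etc/ssl/ca.crt"))

# Same pattern as the diff: fall back to None if the attribute is absent,
# so clients built without a custom transport still work
existing_http_client = getattr(parent, '_client', None)
rebuilt = FakeSDKClient(http_client=existing_http_client)

print(rebuilt._client.verify)  # '/etc/ssl/ca.crt'
```

Relying on a private attribute is fragile across SDK versions, which is why the `getattr` default keeps the rebuild working even if `_client` disappears.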

optillm/plugins/readurls_plugin.py (19 additions, 5 deletions)

```diff
@@ -1,10 +1,10 @@
 import re
-from typing import Tuple, List
+from typing import Tuple, List, Optional
 import requests
 import os
 from bs4 import BeautifulSoup
 from urllib.parse import urlparse
-from optillm import __version__
+from optillm import __version__, server_config
 
 SLUG = "readurls"
 
@@ -24,13 +24,27 @@ def extract_urls(text: str) -> List[str]:
 
     return cleaned_urls
 
-def fetch_webpage_content(url: str, max_length: int = 100000) -> str:
+def fetch_webpage_content(url: str, max_length: int = 100000, verify_ssl: Optional[bool] = None, cert_path: Optional[str] = None) -> str:
     try:
         headers = {
             'User-Agent': f'optillm/{__version__} (https://github.com/codelion/optillm)'
         }
-
-        response = requests.get(url, headers=headers, timeout=10)
+
+        # Use SSL configuration from server_config if not explicitly provided
+        if verify_ssl is None:
+            verify_ssl = server_config.get('ssl_verify', True)
+        if cert_path is None:
+            cert_path = server_config.get('ssl_cert_path', '')
+
+        # Determine verify parameter for requests
+        if not verify_ssl:
+            verify = False
+        elif cert_path:
+            verify = cert_path
+        else:
+            verify = True
+
+        response = requests.get(url, headers=headers, timeout=10, verify=verify)
         response.raise_for_status()
 
         # Make a soup
```
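The branching above maps the two settings onto the single `verify` argument that `requests.get` accepts, which may be a bool or a path to a CA bundle. Factored into a hypothetical standalone helper (not in the diff) for clarity:

```python
from typing import Union

def resolve_verify(verify_ssl: bool, cert_path: str) -> Union[bool, str]:
    """Map (verify_ssl, cert_path) onto requests' `verify` argument:
    False disables verification, a non-empty path selects a CA bundle,
    and True uses the default trust store."""
    if not verify_ssl:
        return False
    if cert_path:
        return cert_path
    return True

print(resolve_verify(False, "/ca.crt"))  # False: disabling wins over a bundle path
print(resolve_verify(True, "/ca.crt"))   # '/ca.crt'
print(resolve_verify(True, ""))          # True
```

Note the precedence: an explicit `--no-ssl-verify` overrides a configured certificate path, matching the order of the branches in the plugin.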

optillm/server.py (40 additions, 7 deletions)

```diff
@@ -58,7 +58,24 @@
 conversation_logger = None
 
 def get_config():
+    import httpx
+
     API_KEY = None
+
+    # Create httpx client with SSL configuration
+    ssl_verify = server_config.get('ssl_verify', True)
+    ssl_cert_path = server_config.get('ssl_cert_path', '')
+
+    # Determine SSL verification setting
+    if not ssl_verify:
+        logger.warning("SSL certificate verification is DISABLED. This is insecure and should only be used for development.")
+        http_client = httpx.Client(verify=False)
+    elif ssl_cert_path:
+        logger.info(f"Using custom CA certificate bundle: {ssl_cert_path}")
+        http_client = httpx.Client(verify=ssl_cert_path)
+    else:
+        http_client = httpx.Client(verify=True)
+
     if os.environ.get("OPTILLM_API_KEY"):
         # Use local inference engine
         from optillm.inference import create_inference_client
@@ -69,16 +86,16 @@ def get_config():
         API_KEY = os.environ.get("CEREBRAS_API_KEY")
         base_url = server_config['base_url']
         if base_url != "":
-            default_client = Cerebras(api_key=API_KEY, base_url=base_url)
+            default_client = Cerebras(api_key=API_KEY, base_url=base_url, http_client=http_client)
         else:
-            default_client = Cerebras(api_key=API_KEY)
+            default_client = Cerebras(api_key=API_KEY, http_client=http_client)
     elif os.environ.get("OPENAI_API_KEY"):
         API_KEY = os.environ.get("OPENAI_API_KEY")
         base_url = server_config['base_url']
         if base_url != "":
-            default_client = OpenAI(api_key=API_KEY, base_url=base_url)
+            default_client = OpenAI(api_key=API_KEY, base_url=base_url, http_client=http_client)
         else:
-            default_client = OpenAI(api_key=API_KEY)
+            default_client = OpenAI(api_key=API_KEY, http_client=http_client)
     elif os.environ.get("AZURE_OPENAI_API_KEY"):
         API_KEY = os.environ.get("AZURE_OPENAI_API_KEY")
         API_VERSION = os.environ.get("AZURE_API_VERSION")
@@ -88,6 +105,7 @@ def get_config():
                 api_key=API_KEY,
                 api_version=API_VERSION,
                 azure_endpoint=AZURE_ENDPOINT,
+                http_client=http_client
             )
         else:
             from azure.identity import DefaultAzureCredential, get_bearer_token_provider
@@ -96,7 +114,8 @@ def get_config():
             default_client = AzureOpenAI(
                 api_version=API_VERSION,
                 azure_endpoint=AZURE_ENDPOINT,
-                azure_ad_token_provider=token_provider
+                azure_ad_token_provider=token_provider,
+                http_client=http_client
             )
     else:
         # Import the LiteLLM wrapper
@@ -152,7 +171,7 @@ def count_reasoning_tokens(text: str, tokenizer=None) -> int:
 
 # Server configuration
 server_config = {
-    'approach': 'none',
+    'approach': 'none',
     'mcts_simulations': 2,
     'mcts_exploration': 0.2,
     'mcts_depth': 1,
@@ -167,6 +186,8 @@ def count_reasoning_tokens(text: str, tokenizer=None) -> int:
     'return_full_response': False,
     'port': 8000,
     'log': 'info',
+    'ssl_verify': True,
+    'ssl_cert_path': '',
 }
 
 # List of known approaches
@@ -977,7 +998,19 @@ def parse_args():
     base_url_default = os.environ.get("OPTILLM_BASE_URL", "")
     parser.add_argument("--base-url", "--base_url", dest="base_url", type=str, default=base_url_default,
                         help="Base url for OpenAI compatible endpoint")
-
+
+    # SSL configuration arguments
+    ssl_verify_default = os.environ.get("OPTILLM_SSL_VERIFY", "true").lower() in ("true", "1", "yes")
+    parser.add_argument("--ssl-verify", dest="ssl_verify", action="store_true" if ssl_verify_default else "store_false",
+                        default=ssl_verify_default,
+                        help="Enable SSL certificate verification (default: True)")
+    parser.add_argument("--no-ssl-verify", dest="ssl_verify", action="store_false",
+                        help="Disable SSL certificate verification")
+
+    ssl_cert_path_default = os.environ.get("OPTILLM_SSL_CERT_PATH", "")
+    parser.add_argument("--ssl-cert-path", dest="ssl_cert_path", type=str, default=ssl_cert_path_default,
+                        help="Path to custom CA certificate bundle for SSL verification")
+
     # Use the function to get the default path
     default_config_path = get_config_path()
```
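The paired `--ssl-verify` / `--no-ssl-verify` flags added in `parse_args` can be sketched in isolation. This simplified version hardcodes the default to `True` rather than reading `OPTILLM_SSL_VERIFY` as the diff does; `build_parser` is a hypothetical helper, not the project's function:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Sketch of the flag pair: two options sharing one dest,
    so whichever flag appears last on the command line wins."""
    parser = argparse.ArgumentParser()
    parser.add_argument("--ssl-verify", dest="ssl_verify", action="store_true",
                        default=True,
                        help="Enable SSL certificate verification (default: True)")
    parser.add_argument("--no-ssl-verify", dest="ssl_verify", action="store_false",
                        help="Disable SSL certificate verification")
    parser.add_argument("--ssl-cert-path", dest="ssl_cert_path", type=str, default="",
                        help="Path to custom CA certificate bundle")
    return parser

args = build_parser().parse_args(["--no-ssl-verify"])
print(args.ssl_verify)  # False
```

The diff's version additionally flips the `--ssl-verify` action based on the env-derived default, so the environment variable and the CLI flags stay consistent with each other.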

pyproject.toml (1 addition, 1 deletion)

```diff
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
 
 [project]
 name = "optillm"
-version = "0.3.1"
+version = "0.3.2"
 description = "An optimizing inference proxy for LLMs."
 readme = "README.md"
 license = "Apache-2.0"
```
