
Commit ac610c4

Merge branch 'main' into REL-10773
2 parents: d048a37 + ad3444a

File tree: 8 files changed (+196 -27 lines)

.release-please-manifest.json

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,4 +1,4 @@
 {
-  "packages/sdk/server-ai": "0.10.1",
+  "packages/sdk/server-ai": "0.11.0",
   "packages/ai-providers/server-ai-langchain": "0.1.0"
 }
```

README.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -24,7 +24,7 @@ We encourage pull requests and other contributions from the community. Check out
 
 ## Verifying library build provenance with the SLSA framework
 
-LaunchDarkly uses the [SLSA framework](https://slsa.dev/spec/v1.0/about) (Supply-chain Levels for Software Artifacts) to help developers make their supply chain more secure by ensuring the authenticity and build integrity of our published library packages. To learn more, see the [provenance guide](PROVENANCE.md).
+LaunchDarkly uses the [SLSA framework](https://slsa.dev/spec/v1.0/about) (Supply-chain Levels for Software Artifacts) to help developers make their supply chain more secure by ensuring the authenticity and build integrity of our published library packages. To learn more, see the [provenance guide](packages/sdk/server-ai/PROVENANCE.md).
 
 ## About LaunchDarkly
```

packages/sdk/server-ai/CHANGELOG.md

Lines changed: 8 additions & 0 deletions

```diff
@@ -0,0 +1,8 @@
+# Changelog
+
+## [0.11.0](https://github.com/launchdarkly/python-server-sdk-ai/compare/launchdarkly-server-sdk-ai-0.10.1...launchdarkly-server-sdk-ai-0.11.0) (2025-12-19)
+
+
+### Features
+
+* Add Chat and Judge supporting methods ([#73](https://github.com/launchdarkly/python-server-sdk-ai/issues/73)) ([62cb2aa](https://github.com/launchdarkly/python-server-sdk-ai/commit/62cb2aa7f4641777c61fdc2cc6013357ef7289be))
```
packages/sdk/server-ai/PROVENANCE.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -10,7 +10,7 @@ To verify SLSA provenance attestations, we recommend using [slsa-verifier](https
 
 ```
 # Set the version of the library to verify
-VERSION=0.10.1
+VERSION=0.11.0
 ```
 
 <!-- x-release-please-end -->
````
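For context on the `VERSION` bump above: once `VERSION` is set, a verification run with `slsa-verifier` typically looks like the sketch below. The artifact and attestation filenames are assumptions based on common Python wheel naming, not taken from this repository; check the GitHub release page for the actual names.

```shell
# Hypothetical slsa-verifier invocation; assumes the wheel and its
# .intoto.jsonl provenance attestation were downloaded from the release.
VERSION=0.11.0
ARTIFACT="launchdarkly_server_sdk_ai-${VERSION}-py3-none-any.whl"

if command -v slsa-verifier >/dev/null 2>&1; then
  slsa-verifier verify-artifact "$ARTIFACT" \
    --provenance-path "${ARTIFACT}.intoto.jsonl" \
    --source-uri github.com/launchdarkly/python-server-sdk-ai
else
  # Tool not installed: just show the command that would run.
  echo "would run: slsa-verifier verify-artifact $ARTIFACT"
fi
```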

packages/sdk/server-ai/README.md

Lines changed: 181 additions & 21 deletions

````diff
@@ -2,48 +2,208 @@
 
 This package contains the LaunchDarkly Server-Side AI SDK for Python (`launchdarkly-server-sdk-ai`).
 
-## Installation
+# ⛔️⛔️⛔️⛔️
+
+> [!CAUTION]
+> This library is an alpha version and should not be considered ready for production use while this message is visible.
+
+# ☝️☝️☝️☝️☝️☝️
+
+## LaunchDarkly overview
+
+[LaunchDarkly](https://www.launchdarkly.com) is a feature management platform that serves over 100 billion feature flags daily to help teams build better software, faster. [Get started](https://docs.launchdarkly.com/home/getting-started) using LaunchDarkly today!
+
+[![Twitter Follow](https://img.shields.io/twitter/follow/launchdarkly.svg?style=social&label=Follow&maxAge=2592000)](https://twitter.com/intent/follow?screen_name=launchdarkly)
+
+## Quick Setup
+
+This assumes that you have already installed the LaunchDarkly Python (server-side) SDK.
+
+1. Install this package with `pip`:
 
 ```bash
 pip install launchdarkly-server-sdk-ai
 ```
 
-## Quick Start
+2. Create an AI SDK instance:
 
 ```python
 from ldclient import LDClient, Config, Context
-from ldai import LDAIClient, AICompletionConfigDefault, ModelConfig
+from ldai import LDAIClient
 
-# Initialize LaunchDarkly client
+# The ld_client instance should be created based on the instructions in the relevant SDK.
 ld_client = LDClient(Config("your-sdk-key"))
-
-# Create AI client
 ai_client = LDAIClient(ld_client)
+```
+
+## Setting Default AI Configurations
+
+When retrieving AI configurations, you need to provide default values that will be used if the configuration is not available from LaunchDarkly:
+
+### Fully Configured Default
+
+```python
+from ldai import AICompletionConfigDefault, ModelConfig, LDMessage
+
+default_config = AICompletionConfigDefault(
+    enabled=True,
+    model=ModelConfig(
+        name='gpt-4',
+        parameters={'temperature': 0.7, 'maxTokens': 1000}
+    ),
+    messages=[
+        LDMessage(role='system', content='You are a helpful assistant.')
+    ]
+)
+```
+
+### Disabled Default
+
+```python
+from ldai import AICompletionConfigDefault
+
+default_config = AICompletionConfigDefault(
+    enabled=False
+)
+```
+
+## Retrieving AI Configurations
+
+The `completion_config` method retrieves AI configurations from LaunchDarkly with support for dynamic variables and fallback values:
+
+```python
+from ldclient import Context
+from ldai import LDAIClient, AICompletionConfigDefault, ModelConfig
 
-# Get AI configuration
 context = Context.create("user-123")
-config = ai_client.completion_config(
-    "my-ai-config",
+ai_config = ai_client.completion_config(
+    ai_config_key,
     context,
-    AICompletionConfigDefault(
-        enabled=True,
-        model=ModelConfig("gpt-4")
-    )
+    default_config,
+    variables={'myVariable': 'My User Defined Variable'}  # Variables for template interpolation
 )
 
-# Use the configuration with your AI provider
-if config.enabled:
-    # Your AI implementation here
-    pass
+# Ensure configuration is enabled
+if ai_config.enabled:
+    messages = ai_config.messages
+    model = ai_config.model
+    tracker = ai_config.tracker
+    # Use with your AI provider
 ```
 
-## Documentation
+## Chat for Conversational AI
 
-For full documentation, please refer to the [LaunchDarkly AI SDK documentation](https://docs.launchdarkly.com/sdk/ai/python).
+`Chat` provides a high-level interface for conversational AI with automatic conversation management and metrics tracking:
+
+- Automatically configures models based on AI configuration
+- Maintains conversation history across multiple interactions
+- Automatically tracks token usage, latency, and success rates
+- Works with any supported AI provider (see [AI Providers](https://github.com/launchdarkly/python-server-sdk-ai#ai-providers) for available packages)
+
+### Using Chat
+
+```python
+import asyncio
+from ldclient import Context
+from ldai import LDAIClient, AICompletionConfigDefault, ModelConfig, LDMessage
+
+# Use the same default_config from the retrieval section above
+async def main():
+    context = Context.create("user-123")
+    chat = await ai_client.create_chat(
+        'customer-support-chat',
+        context,
+        default_config,
+        variables={'customerName': 'John'}
+    )
+
+    if chat:
+        # Simple conversation flow - metrics are automatically tracked by invoke()
+        response1 = await chat.invoke('I need help with my order')
+        print(response1.message.content)
+
+        response2 = await chat.invoke("What's the status?")
+        print(response2.message.content)
+
+        # Access conversation history
+        messages = chat.get_messages()
+        print(f'Conversation has {len(messages)} messages')
+
+asyncio.run(main())
+```
+
+## Advanced Usage with Providers
+
+For more control, you can use the configuration directly with AI providers. We recommend using [LaunchDarkly AI Provider packages](https://github.com/launchdarkly/python-server-sdk-ai#ai-providers) when available:
+
+### Using AI Provider Packages
+
+```python
+import asyncio
+from ldai import LDAIClient, AICompletionConfigDefault, ModelConfig
+from ldai.providers.types import LDAIMetrics, TokenUsage
 
-## Contributing
+from ldai_langchain import LangChainProvider
 
-See [CONTRIBUTING.md](../../../CONTRIBUTING.md) in the repository root.
+async def main():
+    ai_config = ai_client.completion_config(ai_config_key, context, default_value)
+
+    # Create LangChain model from configuration
+    llm = await LangChainProvider.create_langchain_model(ai_config)
+
+    # Use with tracking
+    response = await ai_config.tracker.track_metrics_of(
+        lambda: llm.invoke(messages),
+        lambda result: LangChainProvider.get_ai_metrics_from_response(result)
+    )
+
+    print('AI Response:', response.content)
+
+asyncio.run(main())
+```
+
+### Using Custom Providers
+
+```python
+import asyncio
+from ldai import LDAIClient, AICompletionConfigDefault, ModelConfig
+from ldai.providers.types import LDAIMetrics, TokenUsage
+
+async def main():
+    ai_config = ai_client.completion_config(ai_config_key, context, default_value)
+
+    # Define custom metrics mapping for your provider
+    def map_custom_provider_metrics(response):
+        return LDAIMetrics(
+            success=True,
+            usage=TokenUsage(
+                total=response.usage.get('total_tokens', 0) if response.usage else 0,
+                input=response.usage.get('prompt_tokens', 0) if response.usage else 0,
+                output=response.usage.get('completion_tokens', 0) if response.usage else 0,
+            )
+        )
+
+    # Use with custom provider and tracking
+    async def call_custom_provider():
+        return await custom_provider.generate(
+            messages=ai_config.messages or [],
+            model=ai_config.model.name if ai_config.model else 'custom-model',
+            temperature=ai_config.model.get_parameter('temperature') if ai_config.model else 0.5,
+        )
+
+    result = await ai_config.tracker.track_metrics_of(
+        call_custom_provider,
+        map_custom_provider_metrics
+    )
+
+    print('AI Response:', result.content)
+
+asyncio.run(main())
+```
+
+## Documentation
+
+For full documentation, please refer to the [LaunchDarkly AI SDK documentation](https://docs.launchdarkly.com/sdk/ai/python).
 
 ## License
 
````

packages/sdk/server-ai/pyproject.toml

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,6 +1,6 @@
 [tool.poetry]
 name = "launchdarkly-server-sdk-ai"
-version = "0.10.1"
+version = "0.11.0"
 description = "LaunchDarkly SDK for AI"
 authors = ["LaunchDarkly <[email protected]>"]
 license = "Apache-2.0"
```

packages/sdk/server-ai/src/ldai/__init__.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,4 +1,4 @@
-__version__ = "0.10.1" # x-release-please-version
+__version__ = "0.11.0" # x-release-please-version
 
 # Export main client
 # Export chat
```

release-please-config.json

Lines changed: 2 additions & 1 deletion

```diff
@@ -1,12 +1,13 @@
 {
   "include-component-in-tag": true,
+  "bootstrap-sha": "b63dbb5e3ccd6d2024d5c2b2761e04fcb7e86079",
   "packages": {
     "packages/sdk/server-ai": {
       "release-type": "python",
       "versioning": "default",
       "bump-minor-pre-major": true,
       "include-v-in-tag": false,
-      "extra-files": ["src/ldai/__init__.py", "../../../PROVENANCE.md"],
+      "extra-files": ["src/ldai/__init__.py", "PROVENANCE.md"],
       "component": "launchdarkly-server-sdk-ai"
     },
     "packages/ai-providers/server-ai-langchain": {
```
