
Commit 8b78222

feat: Add Chat and Judge supporting methods

1 parent 96c9936 commit 8b78222

File tree

1 file changed: packages/sdk/server-ai/README.md (+181, -21)
@@ -2,48 +2,208 @@
This package contains the LaunchDarkly Server-Side AI SDK for Python (`launchdarkly-server-sdk-ai`).

# ⛔️⛔️⛔️⛔️

> [!CAUTION]
> This library is an alpha version and should not be considered ready for production use while this message is visible.

# ☝️☝️☝️☝️☝️☝️

## LaunchDarkly overview

[LaunchDarkly](https://www.launchdarkly.com) is a feature management platform that serves over 100 billion feature flags daily to help teams build better software, faster. [Get started](https://docs.launchdarkly.com/home/getting-started) using LaunchDarkly today!

[![Twitter Follow](https://img.shields.io/twitter/follow/launchdarkly.svg?style=social&label=Follow&maxAge=2592000)](https://twitter.com/intent/follow?screen_name=launchdarkly)

## Quick Setup

This assumes that you have already installed the LaunchDarkly Python (server-side) SDK.

1. Install this package with `pip`:

   ```bash
   pip install launchdarkly-server-sdk-ai
   ```

2. Create an AI SDK instance:

   ```python
   from ldclient import LDClient, Config, Context
   from ldai import LDAIClient

   # The ld_client instance should be created based on the instructions in the relevant SDK.
   ld_client = LDClient(Config("your-sdk-key"))
   ai_client = LDAIClient(ld_client)
   ```

## Setting Default AI Configurations

When retrieving AI configurations, you need to provide default values that will be used if the configuration is not available from LaunchDarkly:

### Fully Configured Default

```python
from ldai import AICompletionConfigDefault, ModelConfig, LDMessage

default_config = AICompletionConfigDefault(
    enabled=True,
    model=ModelConfig(
        name='gpt-4',
        parameters={'temperature': 0.7, 'maxTokens': 1000}
    ),
    messages=[
        LDMessage(role='system', content='You are a helpful assistant.')
    ]
)
```

### Disabled Default

```python
from ldai import AICompletionConfigDefault

default_config = AICompletionConfigDefault(
    enabled=False
)
```

## Retrieving AI Configurations

The `completion_config` method retrieves AI configurations from LaunchDarkly with support for dynamic variables and fallback values:

```python
from ldclient import Context
from ldai import LDAIClient, AICompletionConfigDefault, ModelConfig

context = Context.create("user-123")
ai_config = ai_client.completion_config(
    ai_config_key,
    context,
    default_config,
    variables={'myVariable': 'My User Defined Variable'}  # Variables for template interpolation
)

# Ensure configuration is enabled
if ai_config.enabled:
    messages = ai_config.messages
    model = ai_config.model
    tracker = ai_config.tracker
    # Use with your AI provider
```
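
The `variables` argument feeds template interpolation: messages defined in your AI configuration can reference these values with `{{myVariable}}`-style placeholders, which the SDK substitutes before the messages reach the model. A minimal standalone sketch of that substitution (illustration only; the SDK performs this internally, and this helper is not its implementation):

```python
# Conceptual sketch: substitute {{name}} placeholders with supplied values,
# the way the `variables` dict is applied to configured message templates.
def interpolate(template: str, variables: dict) -> str:
    result = template
    for name, value in variables.items():
        result = result.replace('{{' + name + '}}', str(value))
    return result

print(interpolate(
    'Here is my variable: {{myVariable}}',
    {'myVariable': 'My User Defined Variable'},
))  # -> Here is my variable: My User Defined Variable
```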

## Chat for Conversational AI

`Chat` provides a high-level interface for conversational AI with automatic conversation management and metrics tracking:

- Automatically configures models based on AI configuration
- Maintains conversation history across multiple interactions
- Automatically tracks token usage, latency, and success rates
- Works with any supported AI provider (see [AI Providers](https://github.com/launchdarkly/python-server-sdk-ai#ai-providers) for available packages)

### Using Chat

```python
import asyncio
from ldclient import Context
from ldai import LDAIClient, AICompletionConfigDefault, ModelConfig, LDMessage

# Use the same default_config from the retrieval section above
async def main():
    context = Context.create("user-123")
    chat = await ai_client.create_chat(
        'customer-support-chat',
        context,
        default_config,
        variables={'customerName': 'John'}
    )

    if chat:
        # Simple conversation flow - metrics are automatically tracked by invoke()
        response1 = await chat.invoke('I need help with my order')
        print(response1.message.content)

        response2 = await chat.invoke("What's the status?")
        print(response2.message.content)

        # Access conversation history
        messages = chat.get_messages()
        print(f'Conversation has {len(messages)} messages')

asyncio.run(main())
```

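
Conceptually, each `invoke()` call appends the user message and the assistant's reply to the history that `get_messages()` returns, on top of any configured system message. A self-contained sketch of that bookkeeping (illustration only, not the SDK's implementation; the real `invoke()` calls the model rather than taking the reply as a parameter):

```python
class ChatHistorySketch:
    """Illustrates how a chat wrapper accumulates history across turns."""

    def __init__(self, system_prompt: str):
        self._messages = [{'role': 'system', 'content': system_prompt}]

    def invoke(self, user_text: str, assistant_reply: str) -> str:
        # A real chat calls the model here; we take the reply as input.
        self._messages.append({'role': 'user', 'content': user_text})
        self._messages.append({'role': 'assistant', 'content': assistant_reply})
        return assistant_reply

    def get_messages(self) -> list:
        return list(self._messages)

chat = ChatHistorySketch('You are a helpful assistant.')
chat.invoke('I need help with my order', 'Sure, what is your order number?')
chat.invoke("What's the status?", 'Let me check that for you.')
print(f'Conversation has {len(chat.get_messages())} messages')  # -> 5
```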
## Advanced Usage with Providers

For more control, you can use the configuration directly with AI providers. We recommend using [LaunchDarkly AI Provider packages](https://github.com/launchdarkly/python-server-sdk-ai#ai-providers) when available:

### Using AI Provider Packages

```python
import asyncio
from ldai import LDAIClient, AICompletionConfigDefault, ModelConfig
from ldai.providers.types import LDAIMetrics, TokenUsage

from ldai_langchain import LangChainProvider

async def main():
    ai_config = ai_client.completion_config(ai_config_key, context, default_value)

    # Create LangChain model from configuration
    llm = await LangChainProvider.create_langchain_model(ai_config)

    # Use with tracking
    response = await ai_config.tracker.track_metrics_of(
        lambda: llm.invoke(messages),
        lambda result: LangChainProvider.get_ai_metrics_from_response(result)
    )

    print('AI Response:', response.content)

asyncio.run(main())
```

### Using Custom Providers

```python
import asyncio
from ldai import LDAIClient, AICompletionConfigDefault, ModelConfig
from ldai.providers.types import LDAIMetrics, TokenUsage

async def main():
    ai_config = ai_client.completion_config(ai_config_key, context, default_value)

    # Define custom metrics mapping for your provider
    def map_custom_provider_metrics(response):
        return LDAIMetrics(
            success=True,
            usage=TokenUsage(
                total=response.usage.get('total_tokens', 0) if response.usage else 0,
                input=response.usage.get('prompt_tokens', 0) if response.usage else 0,
                output=response.usage.get('completion_tokens', 0) if response.usage else 0,
            )
        )

    # Use with custom provider and tracking
    async def call_custom_provider():
        return await custom_provider.generate(
            messages=ai_config.messages or [],
            model=ai_config.model.name if ai_config.model else 'custom-model',
            temperature=ai_config.model.get_parameter('temperature') if ai_config.model else 0.5,
        )

    result = await ai_config.tracker.track_metrics_of(
        call_custom_provider,
        map_custom_provider_metrics
    )

    print('AI Response:', result.content)

asyncio.run(main())
```

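
Conceptually, `track_metrics_of` runs the provider call, measures it, applies your mapping function to derive metrics from the raw result, and returns that result unchanged. A simplified standalone sketch of the control flow (illustration only, not the SDK's actual tracker; the hypothetical `fake_provider_call` stands in for a real provider):

```python
import asyncio
import time

async def track_metrics_of_sketch(call, map_metrics):
    # Time the awaited provider call...
    start = time.monotonic()
    result = await call()
    duration_ms = (time.monotonic() - start) * 1000.0
    # ...derive metrics from the raw result via the caller-supplied mapper...
    metrics = map_metrics(result)
    # ...and report both (a real tracker sends these to LaunchDarkly).
    print(f'success={metrics["success"]} duration_ms={duration_ms:.0f}')
    return result

async def fake_provider_call():
    # Hypothetical provider result shaped like the custom-provider example above.
    return {'content': 'hello', 'total_tokens': 12}

result = asyncio.run(track_metrics_of_sketch(
    fake_provider_call,
    lambda r: {'success': True, 'total': r['total_tokens']},
))
print(result['content'])  # -> hello
```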
## Documentation

For full documentation, please refer to the [LaunchDarkly AI SDK documentation](https://docs.launchdarkly.com/sdk/ai/python).

## License