Commit 3bec515: Add langchain provider package

1 parent daf4813

5 files changed, +739 -12 lines

packages/ai-providers/server-ai-langchain/README.md

Lines changed: 177 additions & 5 deletions
# LaunchDarkly AI SDK - LangChain Provider

[![PyPI](https://img.shields.io/pypi/v/launchdarkly-server-sdk-ai-langchain.svg)](https://pypi.org/project/launchdarkly-server-sdk-ai-langchain/)

This package provides LangChain integration for the LaunchDarkly Server-Side AI SDK, allowing you to use LangChain models and chains with LaunchDarkly's tracking and configuration capabilities.

## Installation

```bash
pip install launchdarkly-server-sdk-ai-langchain
```

You'll also need to install the LangChain provider packages for the models you want to use:

```bash
# For OpenAI
pip install langchain-openai

# For Anthropic
pip install langchain-anthropic

# For Google
pip install langchain-google-genai
```

## Quick Start

```python
import asyncio

from ldclient import LDClient, Config, Context
from ldai import init
from ldai.models import LDMessage
from ldai_langchain import LangChainProvider

# Initialize the LaunchDarkly client
ld_client = LDClient(Config("your-sdk-key"))
ai_client = init(ld_client)

# Get the AI configuration
context = Context.builder("user-123").build()
config = ai_client.config("ai-config-key", context, {})


async def main():
    # Create a LangChain provider from the AI configuration
    provider = await LangChainProvider.create(config)

    # Use the provider to invoke the model
    messages = [
        LDMessage(role="system", content="You are a helpful assistant."),
        LDMessage(role="user", content="Hello, how are you?"),
    ]

    response = await provider.invoke_model(messages)
    print(response.message.content)


asyncio.run(main())
```
## Usage

### Using LangChainProvider with the `create` Factory

The simplest way to use the LangChain provider is the static `create` factory method, which automatically creates the appropriate LangChain model based on your LaunchDarkly AI configuration:

```python
from ldai_langchain import LangChainProvider

# Create a provider from the AI configuration
provider = await LangChainProvider.create(ai_config)

# Invoke the model
response = await provider.invoke_model(messages)
```

### Using an Existing LangChain Model

If you already have a LangChain model configured, you can use it directly:

```python
from langchain_openai import ChatOpenAI
from ldai_langchain import LangChainProvider

# Create your own LangChain model
llm = ChatOpenAI(model="gpt-4", temperature=0.7)

# Wrap it with LangChainProvider
provider = LangChainProvider(llm)

# Use it with LaunchDarkly tracking
response = await provider.invoke_model(messages)
```

### Structured Output

The provider supports structured output using LangChain's `with_structured_output`:

```python
response_structure = {
    "type": "object",
    "properties": {
        "sentiment": {"type": "string", "enum": ["positive", "negative", "neutral"]},
        "confidence": {"type": "number"},
    },
    "required": ["sentiment", "confidence"],
}

result = await provider.invoke_structured_model(messages, response_structure)
print(result.data)  # e.g. {"sentiment": "positive", "confidence": 0.95}
```

### Tracking Metrics

Use the provider with LaunchDarkly's tracking capabilities:

```python
# Get the AI config with its tracker
config = ai_client.config("ai-config-key", context, {})

# Create the provider
provider = await LangChainProvider.create(config)


# Track metrics automatically
async def invoke():
    return await provider.invoke_model(messages)


response = await config.tracker.track_metrics_of(
    invoke,
    lambda r: r.metrics,
)
```

### Static Utility Methods

The `LangChainProvider` class provides several utility methods.

#### Converting Messages

```python
from ldai.models import LDMessage
from ldai_langchain import LangChainProvider

messages = [
    LDMessage(role="system", content="You are helpful."),
    LDMessage(role="user", content="Hello!"),
]

# Convert to LangChain messages
langchain_messages = LangChainProvider.convert_messages_to_langchain(messages)
```
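
Conceptually, this conversion maps each LaunchDarkly message role onto the corresponding LangChain message class. The following self-contained sketch illustrates the idea only; the stand-in classes, the role set, and the dict-based message shape are assumptions, while the real implementation works with `LDMessage` objects and `langchain_core.messages`:

```python
# Illustrative sketch of a role-to-class conversion; NOT the actual
# implementation. The real code uses langchain_core.messages classes.
from dataclasses import dataclass


@dataclass
class SystemMessage:
    content: str


@dataclass
class HumanMessage:
    content: str


@dataclass
class AIMessage:
    content: str


_ROLE_TO_CLASS = {
    "system": SystemMessage,
    "user": HumanMessage,
    "assistant": AIMessage,
}


def convert_messages(messages):
    # Each (role, content) pair becomes the corresponding message type.
    return [_ROLE_TO_CLASS[m["role"]](content=m["content"]) for m in messages]
```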

#### Extracting Metrics

```python
from ldai_langchain import LangChainProvider

# After getting a response from LangChain
metrics = LangChainProvider.get_ai_metrics_from_response(ai_message)
print(f"Success: {metrics.success}")
print(f"Tokens used: {metrics.usage.total if metrics.usage else 'N/A'}")
```

#### Provider Name Mapping

```python
# Map LaunchDarkly provider names to LangChain provider names
langchain_provider = LangChainProvider.map_provider("gemini")  # Returns "google-genai"
```
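
A mapping like this is plausibly a small lookup table. The sketch below is hypothetical: only the "gemini" to "google-genai" pair is documented above, and the table name and pass-through fallback are assumptions, not the package's actual code:

```python
# Hypothetical sketch of the lookup map_provider might perform.
# Only the "gemini" -> "google-genai" mapping is documented; the
# fallback of returning the name unchanged is an assumption.
_LD_TO_LANGCHAIN = {
    "gemini": "google-genai",
}


def map_provider(ld_provider_name: str) -> str:
    # Unmapped names pass through unchanged (assumed behavior).
    return _LD_TO_LANGCHAIN.get(ld_provider_name.lower(), ld_provider_name)
```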

## API Reference

### LangChainProvider

#### Constructor

```python
LangChainProvider(llm: BaseChatModel, logger: Optional[Any] = None)
```

#### Static Methods

- `create(ai_config: AIConfigKind, logger: Optional[Any] = None) -> LangChainProvider` - Factory method to create a provider from an AI configuration
- `convert_messages_to_langchain(messages: List[LDMessage]) -> List[BaseMessage]` - Convert LaunchDarkly messages to LangChain messages
- `get_ai_metrics_from_response(response: AIMessage) -> LDAIMetrics` - Extract metrics from a LangChain response
- `map_provider(ld_provider_name: str) -> str` - Map LaunchDarkly provider names to LangChain names
- `create_langchain_model(ai_config: AIConfigKind) -> BaseChatModel` - Create a LangChain model from an AI configuration

#### Instance Methods

- `invoke_model(messages: List[LDMessage]) -> ChatResponse` - Invoke the model with messages
- `invoke_structured_model(messages: List[LDMessage], response_structure: Dict[str, Any]) -> StructuredResponse` - Invoke with structured output
- `get_chat_model() -> BaseChatModel` - Get the underlying LangChain model

## Documentation

For full documentation, please refer to the [LaunchDarkly AI SDK documentation](https://docs.launchdarkly.com/sdk/ai/python).

packages/ai-providers/server-ai-langchain/pyproject.toml

Lines changed: 3 additions & 2 deletions
```diff
@@ -25,8 +25,8 @@ packages = [{ include = "ldai_langchain", from = "src" }]
 [tool.poetry.dependencies]
 python = ">=3.9,<4"
 launchdarkly-server-sdk-ai = ">=0.10.1"
-# langchain-core = ">=0.1.0" # Uncomment when implementing
-
+langchain-core = ">=0.2.0"
+langchain = ">=0.2.0"

 [tool.poetry.group.dev.dependencies]
 pytest = ">=2.8"
@@ -44,6 +44,7 @@ non_interactive = true
 [tool.pytest.ini_options]
 addopts = ["-ra"]
 testpaths = ["tests"]
+asyncio_mode = "auto"

 [build-system]
```
Lines changed: 6 additions & 5 deletions

```diff
@@ -1,14 +1,15 @@
 """LaunchDarkly AI SDK - LangChain Provider.

-This package provides LangChain integration for the LaunchDarkly Server-Side AI SDK.
+This package provides LangChain integration for the LaunchDarkly Server-Side AI SDK,
+allowing you to use LangChain models and chains with LaunchDarkly's tracking and
+configuration capabilities.
 """

-__version__ = "0.1.0"
+from ldai_langchain.langchain_provider import LangChainProvider

-# Placeholder for future LangChain provider implementation
-# from ldai_langchain.langchain_provider import LangChainProvider
+__version__ = "0.1.0"

 __all__ = [
     '__version__',
-    # 'LangChainProvider', # Uncomment when implemented
+    'LangChainProvider',
 ]
```
