
Commit 70c794f

feat: Add langchain provider package (#70)
Tracking as: REL-10773

- Add LangChain provider implementation for Python AI SDK
- Implements LangChainProvider class with invoke_model(), invoke_structured_model(), and static utility methods for message conversion and metrics extraction

Co-authored-by: Jason Bailey <[email protected]>
1 parent ad3444a commit 70c794f

File tree

8 files changed: +749 −13 lines changed


.github/workflows/ci.yml

Lines changed: 53 additions & 0 deletions

```diff
@@ -58,3 +58,56 @@ jobs:
 
       - name: Run tests
         run: make -C packages/sdk/server-ai test
+
+  server-ai-langchain-linux:
+    runs-on: ubuntu-latest
+    strategy:
+      matrix:
+        python-version: ["3.9", "3.10", "3.11", "3.12", "3.13"]
+
+    steps:
+      - uses: actions/checkout@v4
+
+      - uses: ./.github/actions/ci
+        with:
+          workspace_path: packages/ai-providers/server-ai-langchain
+          python_version: ${{ matrix.python-version }}
+
+      - uses: ./.github/actions/build
+        with:
+          workspace_path: packages/ai-providers/server-ai-langchain
+
+  server-ai-langchain-windows:
+    runs-on: windows-latest
+    defaults:
+      run:
+        shell: powershell
+
+    strategy:
+      matrix:
+        python-version: ["3.9", "3.10", "3.11", "3.12", "3.13"]
+
+    steps:
+      - uses: actions/checkout@v4
+
+      - name: Set up Python ${{ matrix.python-version }}
+        uses: actions/setup-python@v5
+        with:
+          python-version: ${{ matrix.python-version }}
+
+      - name: Install poetry
+        uses: abatilo/actions-poetry@7b6d33e44b4f08d7021a1dee3c044e9c253d6439
+
+      - name: Configure poetry for local virtualenvs
+        run: poetry config virtualenvs.in-project true
+
+      - name: Install server-ai dependency first
+        working-directory: packages/sdk/server-ai
+        run: poetry install
+
+      - name: Install requirements
+        working-directory: packages/ai-providers/server-ai-langchain
+        run: poetry install
+
+      - name: Run tests
+        run: make -C packages/ai-providers/server-ai-langchain test
```
packages/ai-providers/server-ai-langchain/Makefile

Lines changed: 1 addition & 0 deletions

```diff
@@ -20,6 +20,7 @@ test: install
 lint: #! Run type analysis and linting checks
 lint: install
 	poetry run mypy src/ldai_langchain
+	poetry run isort --check --atomic src/ldai_langchain
 	poetry run pycodestyle src/ldai_langchain
 
 .PHONY: build
```

packages/ai-providers/server-ai-langchain/README.md

Lines changed: 177 additions & 5 deletions

````diff
@@ -1,23 +1,195 @@
 # LaunchDarkly AI SDK - LangChain Provider
 
-This package provides LangChain integration for the LaunchDarkly Server-Side AI SDK.
+[![PyPI](https://img.shields.io/pypi/v/launchdarkly-server-sdk-ai-langchain.svg)](https://pypi.org/project/launchdarkly-server-sdk-ai-langchain/)
 
-## Status
-
-🚧 **Coming Soon** - This package is a placeholder for future LangChain integration.
+This package provides LangChain integration for the LaunchDarkly Server-Side AI SDK, allowing you to use LangChain models and chains with LaunchDarkly's tracking and configuration capabilities.
 
 ## Installation
 
 ```bash
 pip install launchdarkly-server-sdk-ai-langchain
 ```
 
+You'll also need to install the LangChain provider packages for the models you want to use:
+
+```bash
+# For OpenAI
+pip install langchain-openai
+
+# For Anthropic
+pip install langchain-anthropic
+
+# For Google
+pip install langchain-google-genai
+```
+
+## Quick Start
+
+```python
+import asyncio
+from ldclient import LDClient, Config, Context
+from ldai import init
+from ldai_langchain import LangChainProvider
+
+# Initialize LaunchDarkly client
+ld_client = LDClient(Config("your-sdk-key"))
+ai_client = init(ld_client)
+
+# Get AI configuration
+context = Context.builder("user-123").build()
+config = ai_client.config("ai-config-key", context, {})
+
+async def main():
+    # Create a LangChain provider from the AI configuration
+    provider = await LangChainProvider.create(config)
+
+    # Use the provider to invoke the model
+    from ldai.models import LDMessage
+    messages = [
+        LDMessage(role="system", content="You are a helpful assistant."),
+        LDMessage(role="user", content="Hello, how are you?"),
+    ]
+
+    response = await provider.invoke_model(messages)
+    print(response.message.content)
+
+asyncio.run(main())
+```
+
 ## Usage
 
+### Using LangChainProvider with the Create Factory
+
+The simplest way to use the LangChain provider is with the static `create` factory method, which automatically creates the appropriate LangChain model based on your LaunchDarkly AI configuration:
+
 ```python
-# Coming soon
+from ldai_langchain import LangChainProvider
+
+# Create provider from AI configuration
+provider = await LangChainProvider.create(ai_config)
+
+# Invoke the model
+response = await provider.invoke_model(messages)
 ```
 
+### Using an Existing LangChain Model
+
+If you already have a LangChain model configured, you can use it directly:
+
+```python
+from langchain_openai import ChatOpenAI
+from ldai_langchain import LangChainProvider
+
+# Create your own LangChain model
+llm = ChatOpenAI(model="gpt-4", temperature=0.7)
+
+# Wrap it with LangChainProvider
+provider = LangChainProvider(llm)
+
+# Use with LaunchDarkly tracking
+response = await provider.invoke_model(messages)
+```
+
+### Structured Output
+
+The provider supports structured output using LangChain's `with_structured_output`:
+
+```python
+response_structure = {
+    "type": "object",
+    "properties": {
+        "sentiment": {"type": "string", "enum": ["positive", "negative", "neutral"]},
+        "confidence": {"type": "number"},
+    },
+    "required": ["sentiment", "confidence"],
+}
+
+result = await provider.invoke_structured_model(messages, response_structure)
+print(result.data)  # {"sentiment": "positive", "confidence": 0.95}
+```
+
+### Tracking Metrics
+
+Use the provider with LaunchDarkly's tracking capabilities:
+
+```python
+# Get the AI config with tracker
+config = ai_client.config("ai-config-key", context, {})
+
+# Create provider
+provider = await LangChainProvider.create(config)
+
+# Track metrics automatically
+async def invoke():
+    return await provider.invoke_model(messages)
+
+response = await config.tracker.track_metrics_of(
+    invoke,
+    lambda r: r.metrics
+)
+```
+
+### Static Utility Methods
+
+The `LangChainProvider` class provides several utility methods:
+
+#### Converting Messages
+
+```python
+from ldai.models import LDMessage
+from ldai_langchain import LangChainProvider
+
+messages = [
+    LDMessage(role="system", content="You are helpful."),
+    LDMessage(role="user", content="Hello!"),
+]
+
+# Convert to LangChain messages
+langchain_messages = LangChainProvider.convert_messages_to_langchain(messages)
+```
+
+#### Extracting Metrics
+
+```python
+from ldai_langchain import LangChainProvider
+
+# After getting a response from LangChain
+metrics = LangChainProvider.get_ai_metrics_from_response(ai_message)
+print(f"Success: {metrics.success}")
+print(f"Tokens used: {metrics.usage.total if metrics.usage else 'N/A'}")
+```
+
+#### Provider Name Mapping
+
+```python
+# Map LaunchDarkly provider names to LangChain provider names
+langchain_provider = LangChainProvider.map_provider("gemini")  # Returns "google-genai"
+```
+
+## API Reference
+
+### LangChainProvider
+
+#### Constructor
+
+```python
+LangChainProvider(llm: BaseChatModel, logger: Optional[Any] = None)
+```
+
+#### Static Methods
+
+- `create(ai_config: AIConfigKind, logger: Optional[Any] = None) -> LangChainProvider` - Factory method to create a provider from AI configuration
+- `convert_messages_to_langchain(messages: List[LDMessage]) -> List[BaseMessage]` - Convert LaunchDarkly messages to LangChain messages
+- `get_ai_metrics_from_response(response: AIMessage) -> LDAIMetrics` - Extract metrics from a LangChain response
+- `map_provider(ld_provider_name: str) -> str` - Map LaunchDarkly provider names to LangChain names
+- `create_langchain_model(ai_config: AIConfigKind) -> BaseChatModel` - Create a LangChain model from AI configuration
+
+#### Instance Methods
+
+- `invoke_model(messages: List[LDMessage]) -> ChatResponse` - Invoke the model with messages
+- `invoke_structured_model(messages: List[LDMessage], response_structure: Dict[str, Any]) -> StructuredResponse` - Invoke with structured output
+- `get_chat_model() -> BaseChatModel` - Get the underlying LangChain model
+
 ## Documentation
 
 For full documentation, please refer to the [LaunchDarkly AI SDK documentation](https://docs.launchdarkly.com/sdk/ai/python).
````
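The `map_provider` helper documented in the README above reduces to a small name-lookup table. A minimal, self-contained sketch of that idea follows; note that only the `"gemini"` → `"google-genai"` pair is confirmed by the README, and the pass-through fallback for unknown names is an assumption, not the package's actual implementation:

```python
# Hypothetical sketch of the behavior of LangChainProvider.map_provider.
# Only the "gemini" -> "google-genai" mapping is documented above; the
# pass-through fallback for unrecognized names is an assumption.
_LD_TO_LANGCHAIN_PROVIDER = {
    "gemini": "google-genai",
}


def map_provider(ld_provider_name: str) -> str:
    # Pass unknown names through unchanged so providers whose LaunchDarkly
    # and LangChain names already agree need no table entry.
    return _LD_TO_LANGCHAIN_PROVIDER.get(ld_provider_name, ld_provider_name)
```

A lookup-with-fallback like this keeps the mapping table short: only names that differ between the two ecosystems need an entry.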

packages/ai-providers/server-ai-langchain/pyproject.toml

Lines changed: 11 additions & 3 deletions

```diff
@@ -24,26 +24,34 @@ packages = [{ include = "ldai_langchain", from = "src" }]
 
 [tool.poetry.dependencies]
 python = ">=3.9,<4"
-launchdarkly-server-sdk-ai = ">=0.10.1"
-# langchain-core = ">=0.1.0"  # Uncomment when implementing
-
+launchdarkly-server-sdk-ai = ">=0.11.0"
+langchain-core = ">=0.2.0"
+langchain = ">=0.2.0"
 
 [tool.poetry.group.dev.dependencies]
 pytest = ">=2.8"
 pytest-cov = ">=2.4.0"
 pytest-asyncio = ">=0.21.0"
 mypy = "==1.18.2"
+pycodestyle = ">=2.11.0"
+isort = ">=5.12.0"
 
 [tool.mypy]
 python_version = "3.9"
 ignore_missing_imports = true
 install_types = true
 non_interactive = true
 
+[tool.isort]
+profile = "black"
+known_third_party = ["langchain", "langchain_core", "ldai"]
+sections = ["FUTURE", "STDLIB", "THIRDPARTY", "FIRSTPARTY", "LOCALFOLDER"]
+
 
 [tool.pytest.ini_options]
 addopts = ["-ra"]
 testpaths = ["tests"]
+asyncio_mode = "auto"
 
 
 [build-system]
```
Lines changed: 2 additions & 0 deletions

```diff
@@ -0,0 +1,2 @@
+[pycodestyle]
+max-line-length = 120
```
Lines changed: 4 additions & 5 deletions

```diff
@@ -1,14 +1,13 @@
 """LaunchDarkly AI SDK - LangChain Provider.
 
-This package provides LangChain integration for the LaunchDarkly Server-Side AI SDK.
+This package provides LangChain integration for the LaunchDarkly Server-Side AI SDK,
 """
 
-__version__ = "0.1.0"
+from ldai_langchain.langchain_provider import LangChainProvider
 
-# Placeholder for future LangChain provider implementation
-# from ldai_langchain.langchain_provider import LangChainProvider
+__version__ = "0.1.0"
 
 __all__ = [
     '__version__',
-    # 'LangChainProvider',  # Uncomment when implemented
+    'LangChainProvider',
 ]
```
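With `LangChainProvider` now exported, the metrics extraction the README describes amounts to reading token counts out of a response's usage metadata. The following is a rough, self-contained sketch of that shape only: a plain dict stands in for the LangChain `AIMessage`, and the dataclass and field names (`total_tokens`, `input_tokens`, `output_tokens`) are assumptions for illustration, not the package's real `get_ai_metrics_from_response`:

```python
# Hypothetical sketch of metrics extraction; types and field names are
# assumptions, not the actual ldai_langchain implementation.
from dataclasses import dataclass
from typing import Optional


@dataclass
class TokenUsage:
    total: int
    input: int
    output: int


@dataclass
class AIMetrics:
    success: bool
    usage: Optional[TokenUsage]


def metrics_from_usage(usage_metadata: Optional[dict]) -> AIMetrics:
    # A response without usage metadata is still a success; it simply
    # carries no token counts (mirroring the README's "N/A" case).
    if not usage_metadata:
        return AIMetrics(success=True, usage=None)
    return AIMetrics(
        success=True,
        usage=TokenUsage(
            total=usage_metadata.get("total_tokens", 0),
            input=usage_metadata.get("input_tokens", 0),
            output=usage_metadata.get("output_tokens", 0),
        ),
    )
```

Separating "call succeeded" from "usage is available" is what lets the README example print `'N/A'` instead of failing when a model backend omits token counts.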
