1 change: 1 addition & 0 deletions .github/workflows/manual-publish.yml
@@ -35,6 +35,7 @@ on:
- packages/tooling/jest
- packages/sdk/browser
- packages/sdk/server-ai
- packages/ai-providers/server-ai-langchain
- packages/telemetry/browser-telemetry
- packages/sdk/combined-browser
prerelease:
12 changes: 11 additions & 1 deletion README.md
@@ -35,6 +35,10 @@ This includes shared libraries, used by SDKs and other tools, as well as SDKs.
| [@launchdarkly/node-server-sdk-otel](packages/telemetry/node-server-sdk-otel/README.md) | [![NPM][node-otel-npm-badge]][node-otel-npm-link] | [Node OTel][node-otel-issues] | [![Actions Status][node-otel-ci-badge]][node-otel-ci] |
| [@launchdarkly/browser-telemetry](packages/telemetry/browser-telemetry/README.md) | [![NPM][browser-telemetry-npm-badge]][browser-telemetry-npm-link] | [Browser Telemetry][browser-telemetry-issues] | [![Actions Status][browser-telemetry-ci-badge]][browser-telemetry-ci] |

| AI Providers | npm | issues | tests |
| ------------------------------------------------------------------------------------------ | ------------------------------------------------------------- | ------------------------------------------- | ------------------------------------------------------------------- |
| [@launchdarkly/server-sdk-ai-langchain](packages/ai-providers/server-ai-langchain/README.md) | [![NPM][server-ai-langchain-npm-badge]][server-ai-langchain-npm-link] | [server-ai-langchain][package-ai-providers-server-ai-langchain-issues] | [![Actions Status][server-ai-langchain-ci-badge]][server-ai-langchain-ci] |

## Organization

`packages` Top level directory containing package implementations.
@@ -219,4 +223,10 @@ We encourage pull requests and other contributions from the community. Check out
[sdk-combined-browser-ghp-link]: https://launchdarkly.github.io/js-core/packages/sdk/combined-browser/docs/
[sdk-combined-browser-dm-badge]: https://img.shields.io/npm/dm/@launchdarkly/browser.svg?style=flat-square
[sdk-combined-browser-dt-badge]: https://img.shields.io/npm/dt/@launchdarkly/browser.svg?style=flat-square
[package-sdk-browser-issues]: https://github.com/launchdarkly/js-core/issues?q=is%3Aissue+is%3Aopen+label%3A%22package%3A+sdk%2Fcombined-browser%22+
[//]: # 'ai-providers/server-ai-langchain'
[server-ai-langchain-ci-badge]: https://github.com/launchdarkly/js-core/actions/workflows/server-ai-langchain.yml/badge.svg
[server-ai-langchain-ci]: https://github.com/launchdarkly/js-core/actions/workflows/server-ai-langchain.yml
[server-ai-langchain-npm-badge]: https://img.shields.io/npm/v/@launchdarkly/server-sdk-ai-langchain.svg?style=flat-square
[server-ai-langchain-npm-link]: https://www.npmjs.com/package/@launchdarkly/server-sdk-ai-langchain
[package-ai-providers-server-ai-langchain-issues]: https://github.com/launchdarkly/js-core/issues?q=is%3Aissue+is%3Aopen+label%3A%22package%3A+ai-providers%2Fserver-ai-langchain%22+
110 changes: 110 additions & 0 deletions packages/ai-providers/server-ai-langchain/README.md
@@ -0,0 +1,110 @@
# LaunchDarkly AI SDK LangChain Provider for Server-Side JavaScript

[![NPM][server-ai-langchain-npm-badge]][server-ai-langchain-npm-link]
[![Actions Status][server-ai-langchain-ci-badge]][server-ai-langchain-ci]
[![Documentation][server-ai-langchain-ghp-badge]][server-ai-langchain-ghp-link]
[![NPM][server-ai-langchain-dm-badge]][server-ai-langchain-npm-link]
[![NPM][server-ai-langchain-dt-badge]][server-ai-langchain-npm-link]

# ⛔️⛔️⛔️⛔️

> [!CAUTION]
> This library is an alpha version and should not be considered ready for production use while this message is visible.

# ☝️☝️☝️☝️☝️☝️

## LaunchDarkly overview

[LaunchDarkly](https://www.launchdarkly.com) is a feature management platform that serves over 100 billion feature flags daily to help teams build better software, faster. [Get started](https://docs.launchdarkly.com/home/getting-started) using LaunchDarkly today!

[![Twitter Follow](https://img.shields.io/twitter/follow/launchdarkly.svg?style=social&label=Follow&maxAge=2592000)](https://twitter.com/intent/follow?screen_name=launchdarkly)

## Quick Setup

This package provides LangChain integration for the LaunchDarkly AI SDK. The simplest way to use it is with the LaunchDarkly AI SDK's `initChat` method:

1. Install the required packages:

```shell
npm install @launchdarkly/server-sdk-ai @launchdarkly/server-sdk-ai-langchain --save
```

2. Create a chat session and use it:

```typescript
import { init } from '@launchdarkly/node-server-sdk';
import { initAi } from '@launchdarkly/server-sdk-ai';

// Initialize LaunchDarkly client
const ldClient = init(sdkKey);
const aiClient = initAi(ldClient);

// Create a chat session
const defaultConfig = {
  enabled: true,
  model: { name: 'gpt-4' },
  provider: { name: 'openai' }
};
const chat = await aiClient.initChat('my-chat-config', context, defaultConfig);

if (chat) {
  const response = await chat.invoke("What is the capital of France?");
  console.log(response.message.content);
}
```
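The snippet above assumes `sdkKey` and `context` are defined elsewhere in your application. As a minimal sketch (the environment variable name and the context key/name are placeholder values, not part of the SDK):

```typescript
// Placeholder values for illustration only
const sdkKey = process.env.LAUNCHDARKLY_SDK_KEY ?? '';

// The context identifies who is requesting the AI configuration
const context = { kind: 'user', key: 'example-user-key', name: 'Sandy' };
```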

For more information about using the LaunchDarkly AI SDK, see the [LaunchDarkly AI SDK documentation](https://github.com/launchdarkly/js-core/tree/main/packages/sdk/server-ai/README.md).

## Advanced Usage

For more control, you can use the LangChain provider package directly with LaunchDarkly configurations:

```typescript
import { LangChainProvider } from '@launchdarkly/server-sdk-ai-langchain';
import { HumanMessage } from '@langchain/core/messages';

// Retrieve the AI configuration, then create a LangChain model from it
const aiConfig = await aiClient.config(aiConfigKey, context, defaultConfig);
const llm = await LangChainProvider.createLangChainModel(aiConfig);

// Convert LaunchDarkly messages to LangChain format and add user message
const configMessages = aiConfig.messages || [];
const userMessage = new HumanMessage("What is the capital of France?");
const allMessages = [...LangChainProvider.convertMessagesToLangChain(configMessages), userMessage];

// Track the model call with LaunchDarkly tracking
const response = await aiConfig.tracker.trackMetricsOf(
  (result) => LangChainProvider.createAIMetrics(result),
  () => llm.invoke(allMessages)
);

console.log('AI Response:', response.content);
```
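When managing the conversation yourself like this, a follow-up turn appends the model's reply before the next question. A rough sketch under the same assumptions as above (`AIMessage` comes from `@langchain/core/messages`; the follow-up question is illustrative):

```typescript
import { AIMessage, HumanMessage } from '@langchain/core/messages';

// Carry the first exchange forward and ask a follow-up question
const followUpMessages = [
  ...allMessages,
  new AIMessage(String(response.content)),
  new HumanMessage('And what is its population?'),
];

const followUp = await aiConfig.tracker.trackMetricsOf(
  (result) => LangChainProvider.createAIMetrics(result),
  () => llm.invoke(followUpMessages)
);
console.log('Follow-up response:', followUp.content);
```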


## Contributing

We encourage pull requests and other contributions from the community. Check out our [contributing guidelines](CONTRIBUTING.md) for instructions on how to contribute to this SDK.

## About LaunchDarkly

- LaunchDarkly is a continuous delivery platform that provides feature flags as a service and allows developers to iterate quickly and safely. We allow you to easily flag your features and manage them from the LaunchDarkly dashboard. With LaunchDarkly, you can:
- Roll out a new feature to a subset of your users (like a group of users who opt-in to a beta tester group), gathering feedback and bug reports from real-world use cases.
- Gradually roll out a feature to an increasing percentage of users, and track the effect that the feature has on key metrics (for instance, how likely is a user to complete a purchase if they have feature A versus feature B?).
- Turn off a feature that you realize is causing performance problems in production, without needing to re-deploy, or even restart the application with a changed configuration file.
- Grant access to certain features based on user attributes, like payment plan (eg: users on the 'gold' plan get access to more features than users in the 'silver' plan).
- Disable parts of your application to facilitate maintenance, without taking everything offline.
- LaunchDarkly provides feature flag SDKs for a wide variety of languages and technologies. Check out [our documentation](https://docs.launchdarkly.com/sdk) for a complete list.
- Explore LaunchDarkly
- [launchdarkly.com](https://www.launchdarkly.com/ 'LaunchDarkly Main Website') for more information
- [docs.launchdarkly.com](https://docs.launchdarkly.com/ 'LaunchDarkly Documentation') for our documentation and SDK reference guides
- [apidocs.launchdarkly.com](https://apidocs.launchdarkly.com/ 'LaunchDarkly API Documentation') for our API documentation
- [blog.launchdarkly.com](https://blog.launchdarkly.com/ 'LaunchDarkly Blog Documentation') for the latest product updates

[server-ai-langchain-ci-badge]: https://github.com/launchdarkly/js-core/actions/workflows/server-ai-langchain.yml/badge.svg
[server-ai-langchain-ci]: https://github.com/launchdarkly/js-core/actions/workflows/server-ai-langchain.yml
[server-ai-langchain-npm-badge]: https://img.shields.io/npm/v/@launchdarkly/server-sdk-ai-langchain.svg?style=flat-square
[server-ai-langchain-npm-link]: https://www.npmjs.com/package/@launchdarkly/server-sdk-ai-langchain
[server-ai-langchain-ghp-badge]: https://img.shields.io/static/v1?label=GitHub+Pages&message=API+reference&color=00add8
[server-ai-langchain-ghp-link]: https://launchdarkly.github.io/js-core/packages/ai-providers/server-ai-langchain/docs/
[server-ai-langchain-dm-badge]: https://img.shields.io/npm/dm/@launchdarkly/server-sdk-ai-langchain.svg?style=flat-square
[server-ai-langchain-dt-badge]: https://img.shields.io/npm/dt/@launchdarkly/server-sdk-ai-langchain.svg?style=flat-square
132 changes: 126 additions & 6 deletions packages/sdk/server-ai/README.md
@@ -36,18 +36,138 @@ npm install @launchdarkly/server-sdk-ai --save
const aiClient = initAi(ldClient);
```

## Setting Default AI Configurations

When retrieving AI configurations, you need to provide default values that will be used if the configuration is not available from LaunchDarkly:

### Fully Configured Default

```typescript
const defaultConfig = {
  enabled: true,
  model: {
    name: 'gpt-4',
    parameters: { temperature: 0.7, maxTokens: 1000 }
  },
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' }
  ]
};
```

### Disabled Default

```typescript
const defaultConfig = {
  enabled: false
};
```

## Retrieving AI Configurations

The `config` method retrieves AI configurations from LaunchDarkly with support for dynamic variables and fallback values:

```typescript
const aiConfig = await aiClient.config(
  aiConfigKey,
  context,
  defaultConfig,
  { myVariable: 'My User Defined Variable' } // Variables for template interpolation
);

// Ensure configuration is enabled
if (aiConfig.enabled) {
  const { messages, model, tracker } = aiConfig;
  // Use with your AI provider
}
```
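The last argument supplies variables for message templates authored in your AI Config. As a hedged illustration (the `{{ myVariable }}` placeholder syntax and the message text below are hypothetical examples of what might be configured in the LaunchDarkly UI):

```typescript
// Hypothetical: if the AI Config's system message were authored as
//   "You are a support bot. Mention {{ myVariable }} when relevant."
// the interpolated text shows up in the retrieved messages:
for (const message of aiConfig.messages ?? []) {
  console.log(`${message.role}: ${message.content}`);
}
```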

## TrackedChat for Conversational AI

`TrackedChat` provides a high-level interface for conversational AI with automatic conversation management and metrics tracking:

- Automatically configures models based on AI configuration
- Maintains conversation history across multiple interactions
- Automatically tracks token usage, latency, and success rates
- Works with any supported AI provider (see [AI Providers](https://github.com/launchdarkly/js-core#ai-providers) for available packages)

### Using TrackedChat

```typescript
// Use the same defaultConfig from the retrieval section above
const chat = await aiClient.initChat(
  'customer-support-chat',
  context,
  defaultConfig,
  { customerName: 'John' }
);

if (chat) {
  // Simple conversation flow - metrics are automatically tracked by invoke()
  const response1 = await chat.invoke("I need help with my order");
  console.log(response1.message.content);

  const response2 = await chat.invoke("What's the status?");
  console.log(response2.message.content);

  // Access conversation history
  const messages = chat.getMessages();
  console.log(`Conversation has ${messages.length} messages`);
}
```

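Because `TrackedChat` keeps the conversation history itself, each turn only needs to pass the new user input. Here is a minimal interactive sketch; it assumes the `aiClient`, `context`, and `defaultConfig` from the snippets above, and the `runSupportSession` helper and `quit` convention are illustrative only:

```typescript
import readline from 'node:readline/promises';

async function runSupportSession() {
  const chat = await aiClient.initChat('customer-support-chat', context, defaultConfig);
  if (!chat) {
    console.log('AI Config disabled; routing to the regular support flow.');
    return;
  }

  const rl = readline.createInterface({ input: process.stdin, output: process.stdout });
  let userInput = await rl.question('You: ');
  while (userInput.trim() !== 'quit') {
    // invoke() appends the user message, calls the model, and records metrics
    const response = await chat.invoke(userInput);
    console.log(`Assistant: ${response.message.content}`);
    userInput = await rl.question('You: ');
  }
  rl.close();
}
```
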
## Advanced Usage with Providers

For more control, you can use the configuration directly with AI providers. We recommend using [LaunchDarkly AI Provider packages](https://github.com/launchdarkly/js-core#ai-providers) when available:

### Using AI Provider Packages

```typescript
import { LangChainProvider } from '@launchdarkly/server-sdk-ai-langchain';

const aiConfig = await aiClient.config(aiConfigKey, context, defaultValue);

// Create LangChain model from configuration
const llm = await LangChainProvider.createLangChainModel(aiConfig);

// Use with tracking (here `messages` are LangChain-formatted messages, e.g. built with
// LangChainProvider.convertMessagesToLangChain(aiConfig.messages || []))
const response = await aiConfig.tracker.trackMetricsOf(
  (result) => LangChainProvider.createAIMetrics(result),
  () => llm.invoke(messages)
);

console.log('AI Response:', response.content);
```

### Using Custom Providers

```typescript
import { LDAIMetrics } from '@launchdarkly/server-sdk-ai';

const aiConfig = await aiClient.config(aiConfigKey, context, defaultValue);

// Define custom metrics mapping for your provider
const mapCustomProviderMetrics = (response: any): LDAIMetrics => ({
  success: true,
  usage: {
    total: response.usage?.total_tokens || 0,
    input: response.usage?.prompt_tokens || 0,
    output: response.usage?.completion_tokens || 0,
  }
});

// Use with custom provider and tracking
const result = await aiConfig.tracker.trackMetricsOf(
  mapCustomProviderMetrics,
  () => customProvider.generate({
    messages: aiConfig.messages || [],
    model: aiConfig.model?.name || 'custom-model',
    temperature: aiConfig.model?.parameters?.temperature ?? 0.5,
  })
);

console.log('AI Response:', result.content);
```
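Provider calls can fail at runtime, so it helps to decide on a fallback up front. A defensive sketch, assuming `trackMetricsOf` lets errors from the wrapped call propagate to the caller (verify this against the SDK documentation), might look like:

```typescript
let reply: string;
try {
  const result = await aiConfig.tracker.trackMetricsOf(
    mapCustomProviderMetrics,
    () => customProvider.generate({
      messages: aiConfig.messages || [],
      model: aiConfig.model?.name || 'custom-model',
    }),
  );
  reply = result.content;
} catch (err) {
  // Hypothetical non-AI fallback when the provider is unavailable
  console.error('AI provider call failed:', err);
  reply = 'Sorry, the assistant is unavailable right now.';
}
```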

## Contributing

1 change: 0 additions & 1 deletion release-please-config.json
@@ -2,7 +2,6 @@
"packages": {
"packages/ai-providers/server-ai-langchain": {
"bump-minor-pre-major": true,
"release-as": "0.1.0",
"prerelease": true
},
"packages/shared/common": {},