1 change: 1 addition & 0 deletions .github/workflows/manual-publish.yml
@@ -35,6 +35,7 @@ on:
- packages/tooling/jest
- packages/sdk/browser
- packages/sdk/server-ai
- packages/ai-providers/server-ai-vercel
- packages/ai-providers/server-ai-langchain
- packages/telemetry/browser-telemetry
- packages/sdk/combined-browser
27 changes: 27 additions & 0 deletions .github/workflows/server-ai-vercel.yml
@@ -0,0 +1,27 @@
name: ai-providers/server-ai-vercel

on:
  push:
    branches: [main, 'feat/**']
    paths-ignore:
      - '**.md' # Do not need to run CI for markdown changes.
  pull_request:
    branches: [main, 'feat/**']
    paths-ignore:
      - '**.md'

jobs:
  build-test-vercel-provider:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
      - uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 # v6.0.0
        with:
          node-version: 22.x
          registry-url: 'https://registry.npmjs.org'
      - id: shared
        name: Shared CI Steps
        uses: ./actions/ci
        with:
          workspace_name: '@launchdarkly/server-sdk-ai-vercel'
          workspace_path: packages/ai-providers/server-ai-vercel
1 change: 1 addition & 0 deletions .release-please-manifest.json
@@ -1,5 +1,6 @@
{
  "packages/ai-providers/server-ai-langchain": "0.1.0",
  "packages/ai-providers/server-ai-vercel": "0.0.0",
  "packages/sdk/akamai-base": "3.0.10",
  "packages/sdk/akamai-edgekv": "1.4.12",
  "packages/sdk/browser": "0.8.1",
9 changes: 8 additions & 1 deletion README.md
@@ -38,6 +38,7 @@ This includes shared libraries, used by SDKs and other tools, as well as SDKs.
| AI Providers | npm | issues | tests |
| ------------------------------------------------------------------------------------------ | ------------------------------------------------------------- | ------------------------------------------- | ------------------------------------------------------------------- |
| [@launchdarkly/server-sdk-ai-langchain](packages/ai-providers/server-ai-langchain/README.md) | [![NPM][server-ai-langchain-npm-badge]][server-ai-langchain-npm-link] | [server-ai-langchain][package-ai-providers-server-ai-langchain-issues] | [![Actions Status][server-ai-langchain-ci-badge]][server-ai-langchain-ci] |
| [@launchdarkly/server-sdk-ai-vercel](packages/ai-providers/server-ai-vercel/README.md) | [![NPM][server-ai-vercel-npm-badge]][server-ai-vercel-npm-link] | [server-ai-vercel][package-ai-providers-server-ai-vercel-issues] | [![Actions Status][server-ai-vercel-ci-badge]][server-ai-vercel-ci] |

## Organization

@@ -229,4 +230,10 @@ We encourage pull requests and other contributions from the community. Check out
[server-ai-langchain-ci]: https://github.com/launchdarkly/js-core/actions/workflows/server-ai-langchain.yml
[server-ai-langchain-npm-badge]: https://img.shields.io/npm/v/@launchdarkly/server-sdk-ai-langchain.svg?style=flat-square
[server-ai-langchain-npm-link]: https://www.npmjs.com/package/@launchdarkly/server-sdk-ai-langchain
[package-ai-providers-server-ai-langchain-issues]: https://github.com/launchdarkly/js-core/issues?q=is%3Aissue+is%3Aopen+label%3A%22package%3A+ai-providers%2Fserver-ai-langchain%22+
[//]: # 'ai-providers/server-ai-vercel'
[server-ai-vercel-ci-badge]: https://github.com/launchdarkly/js-core/actions/workflows/server-ai-vercel.yml/badge.svg
[server-ai-vercel-ci]: https://github.com/launchdarkly/js-core/actions/workflows/server-ai-vercel.yml
[server-ai-vercel-npm-badge]: https://img.shields.io/npm/v/@launchdarkly/server-sdk-ai-vercel.svg?style=flat-square
[server-ai-vercel-npm-link]: https://www.npmjs.com/package/@launchdarkly/server-sdk-ai-vercel
[package-ai-providers-server-ai-vercel-issues]: https://github.com/launchdarkly/js-core/issues?q=is%3Aissue+is%3Aopen+label%3A%22package%3A+ai-providers%2Fserver-ai-vercel%22+
1 change: 1 addition & 0 deletions package.json
Expand Up @@ -2,6 +2,7 @@
  "name": "@launchdarkly/js-core",
  "workspaces": [
    "packages/ai-providers/server-ai-langchain",
    "packages/ai-providers/server-ai-vercel",
    "packages/shared/common",
    "packages/shared/sdk-client",
    "packages/shared/sdk-server",
111 changes: 111 additions & 0 deletions packages/ai-providers/server-ai-vercel/README.md
@@ -0,0 +1,111 @@
# LaunchDarkly AI SDK Vercel Provider for Server-Side JavaScript

[![NPM][server-ai-vercel-npm-badge]][server-ai-vercel-npm-link]
[![Actions Status][server-ai-vercel-ci-badge]][server-ai-vercel-ci]
[![Documentation][server-ai-vercel-ghp-badge]][server-ai-vercel-ghp-link]
[![NPM][server-ai-vercel-dm-badge]][server-ai-vercel-npm-link]
[![NPM][server-ai-vercel-dt-badge]][server-ai-vercel-npm-link]

# ⛔️⛔️⛔️⛔️

> [!CAUTION]
> This library is an alpha version and should not be considered ready for production use while this message is visible.

# ☝️☝️☝️☝️☝️☝️

## LaunchDarkly overview

[LaunchDarkly](https://www.launchdarkly.com) is a feature management platform that serves over 100 billion feature flags daily to help teams build better software, faster. [Get started](https://docs.launchdarkly.com/home/getting-started) using LaunchDarkly today!

[![Twitter Follow](https://img.shields.io/twitter/follow/launchdarkly.svg?style=social&label=Follow&maxAge=2592000)](https://twitter.com/intent/follow?screen_name=launchdarkly)

## Quick Setup

This package provides Vercel AI SDK integration for the LaunchDarkly AI SDK. The simplest way to use it is with the LaunchDarkly AI SDK's `initChat` method:

1. Install the required packages:

```shell
npm install @launchdarkly/server-sdk-ai @launchdarkly/server-sdk-ai-vercel --save
# or
yarn add @launchdarkly/server-sdk-ai @launchdarkly/server-sdk-ai-vercel
```

2. Create a chat session and use it:

```typescript
import { init } from '@launchdarkly/node-server-sdk';
import { initAi } from '@launchdarkly/server-sdk-ai';

// Initialize LaunchDarkly client
const ldClient = init(sdkKey);
const aiClient = initAi(ldClient);

// Create a chat session
const defaultConfig = {
  enabled: true,
  model: { name: 'gpt-4' },
  provider: { name: 'openai' },
};
const chat = await aiClient.initChat('my-chat-config', context, defaultConfig);

if (chat) {
  const response = await chat.invoke('What is the capital of France?');
  console.log(response.message.content);
}
```

For more information about using the LaunchDarkly AI SDK, see the [LaunchDarkly AI SDK documentation](https://github.com/launchdarkly/js-core/tree/main/packages/sdk/server-ai/README.md).

## Advanced Usage

For more control, you can use the Vercel AI provider package directly with LaunchDarkly configurations:

```typescript
import { VercelProvider } from '@launchdarkly/server-sdk-ai-vercel';
import { generateText } from 'ai';

// Create a Vercel AI model from LaunchDarkly configuration
const model = await VercelProvider.createVercelModel(aiConfig);

// Convert LaunchDarkly messages and add user message
const configMessages = aiConfig.messages || [];
const userMessage = { role: 'user', content: 'What is the capital of France?' };
const allMessages = [...configMessages, userMessage];

// Track the model call with LaunchDarkly tracking
const response = await aiConfig.tracker.trackMetricsOf(
  (result) => VercelProvider.createAIMetrics(result),
  () => generateText({ model, messages: allMessages }),
);

console.log('AI Response:', response.text);
```
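
As the example above shows, `createAIMetrics` converts a Vercel AI SDK result's token counts (`promptTokens`, `completionTokens`, `totalTokens`) into LaunchDarkly's `input`/`output`/`total` usage fields, defaulting missing counts to zero and omitting usage entirely when the result carries none. A standalone sketch of that mapping is below; the function and type names (`toAIMetrics`, `VercelUsage`, `LDAIMetrics`) are illustrative only, not the package's actual API:

```typescript
// Illustrative sketch of the usage mapping exercised by this package's tests;
// the real implementation is VercelProvider.createAIMetrics.
interface VercelUsage {
  promptTokens?: number;
  completionTokens?: number;
  totalTokens?: number;
}

interface LDAIMetrics {
  success: boolean;
  usage?: { total: number; input: number; output: number };
}

function toAIMetrics(response: { usage?: VercelUsage }): LDAIMetrics {
  if (!response.usage) {
    // No usage data on the response: report success with usage undefined.
    return { success: true, usage: undefined };
  }
  // Missing individual token counts default to 0.
  return {
    success: true,
    usage: {
      total: response.usage.totalTokens ?? 0,
      input: response.usage.promptTokens ?? 0,
      output: response.usage.completionTokens ?? 0,
    },
  };
}
```

For example, a result with `usage: { promptTokens: 50, completionTokens: 50, totalTokens: 100 }` maps to `{ success: true, usage: { total: 100, input: 50, output: 50 } }`.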

## Contributing

We encourage pull requests and other contributions from the community. Check out our [contributing guidelines](CONTRIBUTING.md) for instructions on how to contribute to this SDK.

## About LaunchDarkly

- LaunchDarkly is a continuous delivery platform that provides feature flags as a service and allows developers to iterate quickly and safely. We allow you to easily flag your features and manage them from the LaunchDarkly dashboard. With LaunchDarkly, you can:
  - Roll out a new feature to a subset of your users (like a group of users who opt in to a beta tester group), gathering feedback and bug reports from real-world use cases.
  - Gradually roll out a feature to an increasing percentage of users, and track the effect that the feature has on key metrics (for instance, how likely is a user to complete a purchase if they have feature A versus feature B?).
  - Turn off a feature that you realize is causing performance problems in production, without needing to re-deploy, or even restart the application with a changed configuration file.
  - Grant access to certain features based on user attributes, like payment plan (e.g., users on the 'gold' plan get access to more features than users on the 'silver' plan).
  - Disable parts of your application to facilitate maintenance, without taking everything offline.
- LaunchDarkly provides feature flag SDKs for a wide variety of languages and technologies. Check out [our documentation](https://docs.launchdarkly.com/sdk) for a complete list.
- Explore LaunchDarkly
  - [launchdarkly.com](https://www.launchdarkly.com/ 'LaunchDarkly Main Website') for more information
  - [docs.launchdarkly.com](https://docs.launchdarkly.com/ 'LaunchDarkly Documentation') for our documentation and SDK reference guides
  - [apidocs.launchdarkly.com](https://apidocs.launchdarkly.com/ 'LaunchDarkly API Documentation') for our API documentation
  - [blog.launchdarkly.com](https://blog.launchdarkly.com/ 'LaunchDarkly Blog Documentation') for the latest product updates

[server-ai-vercel-ci-badge]: https://github.com/launchdarkly/js-core/actions/workflows/server-ai-vercel.yml/badge.svg
[server-ai-vercel-ci]: https://github.com/launchdarkly/js-core/actions/workflows/server-ai-vercel.yml
[server-ai-vercel-npm-badge]: https://img.shields.io/npm/v/@launchdarkly/server-sdk-ai-vercel.svg?style=flat-square
[server-ai-vercel-npm-link]: https://www.npmjs.com/package/@launchdarkly/server-sdk-ai-vercel
[server-ai-vercel-ghp-badge]: https://img.shields.io/static/v1?label=GitHub+Pages&message=API+reference&color=00add8
[server-ai-vercel-ghp-link]: https://launchdarkly.github.io/js-core/packages/ai-providers/server-ai-vercel/docs/
[server-ai-vercel-dm-badge]: https://img.shields.io/npm/dm/@launchdarkly/server-sdk-ai-vercel.svg?style=flat-square
[server-ai-vercel-dt-badge]: https://img.shields.io/npm/dt/@launchdarkly/server-sdk-ai-vercel.svg?style=flat-square
@@ -0,0 +1,203 @@
import { generateText } from 'ai';

import { VercelProvider } from '../src/VercelProvider';

// Mock Vercel AI SDK
jest.mock('ai', () => ({
  generateText: jest.fn(),
}));

describe('VercelProvider', () => {
  let mockModel: any;
  let provider: VercelProvider;

  beforeEach(() => {
    mockModel = { name: 'test-model' };
    provider = new VercelProvider(mockModel, {});
  });

  describe('createAIMetrics', () => {
    it('creates metrics with success=true and token usage', () => {
      const mockResponse = {
        usage: {
          promptTokens: 50,
          completionTokens: 50,
          totalTokens: 100,
        },
      };

      const result = VercelProvider.createAIMetrics(mockResponse);

      expect(result).toEqual({
        success: true,
        usage: {
          total: 100,
          input: 50,
          output: 50,
        },
      });
    });

    it('creates metrics with success=true and no usage when usage is missing', () => {
      const mockResponse = {};

      const result = VercelProvider.createAIMetrics(mockResponse);

      expect(result).toEqual({
        success: true,
        usage: undefined,
      });
    });

    it('handles partial usage data', () => {
      const mockResponse = {
        usage: {
          promptTokens: 30,
          // completionTokens and totalTokens missing
        },
      };

      const result = VercelProvider.createAIMetrics(mockResponse);

      expect(result).toEqual({
        success: true,
        usage: {
          total: 0,
          input: 30,
          output: 0,
        },
      });
    });
  });

  describe('invokeModel', () => {
    it('invokes Vercel AI generateText and returns response', async () => {
      const mockResponse = {
        text: 'Hello! How can I help you today?',
        usage: {
          promptTokens: 10,
          completionTokens: 15,
          totalTokens: 25,
        },
      };

      (generateText as jest.Mock).mockResolvedValue(mockResponse);

      const messages = [{ role: 'user' as const, content: 'Hello!' }];

      const result = await provider.invokeModel(messages);

      expect(generateText).toHaveBeenCalledWith({
        model: mockModel,
        messages: [{ role: 'user', content: 'Hello!' }],
      });

      expect(result).toEqual({
        message: {
          role: 'assistant',
          content: 'Hello! How can I help you today?',
        },
        metrics: {
          success: true,
          usage: {
            total: 25,
            input: 10,
            output: 15,
          },
        },
      });
    });

    it('handles response without usage data', async () => {
      const mockResponse = {
        text: 'Hello! How can I help you today?',
      };

      (generateText as jest.Mock).mockResolvedValue(mockResponse);

      const messages = [{ role: 'user' as const, content: 'Hello!' }];

      const result = await provider.invokeModel(messages);

      expect(result).toEqual({
        message: {
          role: 'assistant',
          content: 'Hello! How can I help you today?',
        },
        metrics: {
          success: true,
          usage: undefined,
        },
      });
    });
  });

  describe('getModel', () => {
    it('returns the underlying Vercel AI model', () => {
      const model = provider.getModel();
      expect(model).toBe(mockModel);
    });
  });

  describe('createVercelModel', () => {
    it('creates OpenAI model for openai provider', async () => {
      const mockAiConfig = {
        model: { name: 'gpt-4', parameters: {} },
        provider: { name: 'openai' },
        enabled: true,
        tracker: {} as any,
        toVercelAISDK: jest.fn(),
      };

      // Mock the dynamic import
      jest.doMock('@ai-sdk/openai', () => ({
        openai: jest.fn().mockReturnValue(mockModel),
      }));

      const result = await VercelProvider.createVercelModel(mockAiConfig);
      expect(result).toBe(mockModel);
    });

    it('throws error for unsupported provider', async () => {
      const mockAiConfig = {
        model: { name: 'test-model', parameters: {} },
        provider: { name: 'unsupported' },
        enabled: true,
        tracker: {} as any,
        toVercelAISDK: jest.fn(),
      };

      await expect(VercelProvider.createVercelModel(mockAiConfig)).rejects.toThrow(
        'Unsupported Vercel AI provider: unsupported',
      );
    });
  });

  describe('create', () => {
    it('creates VercelProvider with correct model and parameters', async () => {
      const mockAiConfig = {
        model: {
          name: 'gpt-4',
          parameters: {
            temperature: 0.7,
            maxTokens: 1000,
          },
        },
        provider: { name: 'openai' },
        enabled: true,
        tracker: {} as any,
        toVercelAISDK: jest.fn(),
      };

      // Mock the dynamic import
      jest.doMock('@ai-sdk/openai', () => ({
        openai: jest.fn().mockReturnValue(mockModel),
      }));

      const result = await VercelProvider.create(mockAiConfig);

      expect(result).toBeInstanceOf(VercelProvider);
      expect(result.getModel()).toBeDefined();
    });
  });
});
7 changes: 7 additions & 0 deletions packages/ai-providers/server-ai-vercel/jest.config.js
@@ -0,0 +1,7 @@
module.exports = {
  transform: { '^.+\\.ts?$': 'ts-jest' },
  testMatch: ['**/__tests__/**/*test.ts?(x)'],
  testEnvironment: 'node',
  moduleFileExtensions: ['ts', 'tsx', 'js', 'jsx', 'json', 'node'],
  collectCoverageFrom: ['src/**/*.ts'],
};