---
title: "AI21"
description: "Integrate AI21 models with Portkey's AI Gateway"
---

Portkey provides a robust and secure gateway to integrate various Large Language Models (LLMs) into your applications, including [AI21's models](https://ai21.com).

With Portkey, you can take advantage of features like fast AI gateway access, observability, prompt management, and more, while securely managing your API keys through the [Model Catalog](/product/model-catalog).

## Quick Start

Get AI21 working in 3 steps:

<CodeGroup>
```python Python icon="python"
from portkey_ai import Portkey

# 1. Install: pip install portkey-ai
# 2. Add the @ai21 provider in Model Catalog
# 3. Use it:

portkey = Portkey(api_key="PORTKEY_API_KEY")

response = portkey.chat.completions.create(
    model="@ai21/jamba-1-5-large",
    messages=[{"role": "user", "content": "Say this is a test"}]
)

print(response.choices[0].message.content)
```

```js Javascript icon="square-js"
import Portkey from 'portkey-ai'

// 1. Install: npm install portkey-ai
// 2. Add the @ai21 provider in Model Catalog
// 3. Use it:

const portkey = new Portkey({
  apiKey: "PORTKEY_API_KEY"
})

const response = await portkey.chat.completions.create({
  model: "@ai21/jamba-1-5-large",
  messages: [{ role: "user", content: "Say this is a test" }]
})

console.log(response.choices[0].message.content)
```

```python OpenAI Py icon="openai"
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL

# 1. Install: pip install openai portkey-ai
# 2. Add the @ai21 provider in Model Catalog
# 3. Use it:

client = OpenAI(
    api_key="PORTKEY_API_KEY",  # Portkey API key
    base_url=PORTKEY_GATEWAY_URL
)

response = client.chat.completions.create(
    model="@ai21/jamba-1-5-large",
    messages=[{"role": "user", "content": "Say this is a test"}]
)

print(response.choices[0].message.content)
```

```js OpenAI JS icon="openai"
import OpenAI from "openai"
import { PORTKEY_GATEWAY_URL } from "portkey-ai"

// 1. Install: npm install openai portkey-ai
// 2. Add the @ai21 provider in Model Catalog
// 3. Use it:

const client = new OpenAI({
  apiKey: "PORTKEY_API_KEY", // Portkey API key
  baseURL: PORTKEY_GATEWAY_URL
})

const response = await client.chat.completions.create({
  model: "@ai21/jamba-1-5-large",
  messages: [{ role: "user", content: "Say this is a test" }]
})

console.log(response.choices[0].message.content)
```

```sh cURL icon="square-terminal"
# 1. Add the @ai21 provider in Model Catalog
# 2. Use it:

curl https://api.portkey.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "x-portkey-api-key: $PORTKEY_API_KEY" \
  -d '{
    "model": "@ai21/jamba-1-5-large",
    "messages": [
      { "role": "user", "content": "Say this is a test" }
    ]
  }'
```
</CodeGroup>

<Note>
**Tip:** You can also set `provider="@ai21"` when initializing the Portkey client and then use just `model="jamba-1-5-large"` in each request.
</Note>
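
For example, here is a minimal sketch of that pattern, assuming `@ai21` matches the provider slug in your Model Catalog:

```python
from portkey_ai import Portkey

# Set the provider once on the client; "@ai21" is assumed to match
# the provider slug created in your Model Catalog.
portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    provider="@ai21"
)

# The "@ai21/" prefix can now be dropped from the model name.
response = portkey.chat.completions.create(
    model="jamba-1-5-large",
    messages=[{"role": "user", "content": "Say this is a test"}]
)

print(response.choices[0].message.content)
```

Setting the provider once keeps individual request bodies shorter when your application only talks to AI21.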

## Add Provider in Model Catalog

1. Go to [**Model Catalog → Add Provider**](https://app.portkey.ai/model-catalog/providers)
2. Select **AI21**
3. Choose existing credentials or create new ones by entering your [AI21 API key](https://studio.ai21.com/account/api-key)
4. Name your provider (e.g., `ai21-prod`); this name becomes the slug you reference in requests, as shown in the sketch below
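
As a quick sketch of how that name is used, assuming you went with `ai21-prod` from step 4 (swap in whatever slug you actually chose):

```python
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

# "ai21-prod" is the example name from step 4; the slug you chose
# becomes the "@slug/" prefix on the model name.
response = portkey.chat.completions.create(
    model="@ai21-prod/jamba-1-5-large",
    messages=[{"role": "user", "content": "Say this is a test"}]
)

print(response.choices[0].message.content)
```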

<Card title="Complete Setup Guide →" href="/product/model-catalog">
  See all setup options, code examples, and detailed instructions
</Card>

## Managing AI21 Prompts

Manage all prompt templates for AI21 in the [Prompt Library](/product/prompt-library). All current AI21 models are supported, and you can easily test different prompts.

Once a prompt is ready, use the `portkey.prompts.completions.create` interface to call it from your application.
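
Here is a minimal sketch of that call; the prompt ID and the `user_input` variable are hypothetical placeholders for whatever your saved prompt template defines:

```python
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

# "YOUR_AI21_PROMPT_ID" and "user_input" are placeholders for your own
# prompt template's ID and variables from the Prompt Library.
completion = portkey.prompts.completions.create(
    prompt_id="YOUR_AI21_PROMPT_ID",
    variables={"user_input": "Say this is a test"}
)

print(completion)
```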

## Next Steps

<CardGroup cols={2}>
  <Card title="Add Metadata" icon="tags" href="/product/observability/metadata">
    Add metadata to your AI21 requests
  </Card>
  <Card title="Gateway Configs" icon="gear" href="/product/ai-gateway/configs">
    Add gateway configs to your AI21 requests
  </Card>
  <Card title="Tracing" icon="chart-line" href="/product/observability/traces">
    Trace your AI21 requests
  </Card>
  <Card title="Fallbacks" icon="arrow-rotate-left" href="/product/ai-gateway/fallbacks">
    Set up a fallback from OpenAI to AI21
  </Card>
</CardGroup>
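
If you want to wire up the fallback pattern from the last card, the rough sketch below shows one way to do it. The shape (a `fallback` strategy with ordered `targets`) follows Portkey's gateway configs; the `@openai` slug, the model names, and the use of `override_params` are illustrative assumptions for this example:

```python
from portkey_ai import Portkey

# Illustrative sketch: assumes both an "@openai" and an "@ai21" provider
# already exist in your Model Catalog. Targets are tried in order.
fallback_config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"override_params": {"model": "@openai/gpt-4o"}},         # primary
        {"override_params": {"model": "@ai21/jamba-1-5-large"}},  # fallback
    ],
}

portkey = Portkey(api_key="PORTKEY_API_KEY", config=fallback_config)

# The model here is overridden per target by the config above.
response = portkey.chat.completions.create(
    model="@openai/gpt-4o",
    messages=[{"role": "user", "content": "Say this is a test"}]
)

print(response.choices[0].message.content)
```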

For complete SDK documentation:

<Card title="SDK Reference" icon="code" href="/api-reference/sdk/list">
  Complete Portkey SDK documentation
</Card>