
Commit 857c493

Merge pull request #321 from tomukmatthews/add-nscale-provider-docs
Add docs for nscale provider
2 parents 68a9228 + fdea2de commit 857c493

File tree

4 files changed: 134 additions & 0 deletions


docs.json

Lines changed: 1 addition & 0 deletions
@@ -299,6 +299,7 @@
 "integrations/llms/moonshot",
 "integrations/llms/ncompass",
 "integrations/llms/nomic",
+"integrations/llms/nscale",
 "integrations/llms/novita-ai",
 "integrations/llms/nebius",
 "integrations/llms/openrouter",

images/supported-llm/nscale.jpeg

10.6 KB

integrations/llms.mdx

Lines changed: 4 additions & 0 deletions
@@ -127,6 +127,10 @@ description: "Portkey connects with all major LLM providers and orchestration fr
 <Frame><img src="/images/supported-llm/nomic.avif" alt="Nomic AI" /></Frame>
 </Card>
 
+<Card title="Nscale" href="/integrations/llms/nscale">
+<Frame><img src="/images/supported-llm/nscale.jpeg" alt="Nscale" /></Frame>
+</Card>
+
 <Card title="Ollama" href="/integrations/llms/ollama">
 <Frame><img src="/images/supported-llm/ollama.avif" alt="Ollama" /></Frame>
 </Card>

integrations/llms/nscale.mdx

Lines changed: 129 additions & 0 deletions
@@ -0,0 +1,129 @@
---
title: "Nscale (EU Sovereign)"
---

Portkey provides a robust and secure gateway for integrating Large Language Models (LLMs) into your applications, including the models hosted on [Nscale](https://docs.nscale.com/docs/inference/serverless-models/current).

<Note>
Provider slug: `nscale`
</Note>

## Portkey SDK Integration with Nscale

Portkey provides a consistent API to interact with models from various providers. To integrate Nscale with Portkey:

### 1. Install the Portkey SDK

<Tabs>
<Tab title="NodeJS">
```sh
npm install --save portkey-ai
```
</Tab>
<Tab title="Python">
```sh
pip install portkey-ai
```
</Tab>
</Tabs>

### 2. Initialize Portkey with the Virtual Key

To use Nscale with a virtual key, [get your API key from here](https://console.nscale.com), then add it to Portkey to create the virtual key.

<Tabs>
<Tab title="NodeJS SDK">
```js
import Portkey from 'portkey-ai'

const portkey = new Portkey({
  apiKey: "PORTKEY_API_KEY", // defaults to process.env["PORTKEY_API_KEY"]
  virtualKey: "NSCALE_VIRTUAL_KEY" // Your Nscale Virtual Key
})
```
</Tab>
<Tab title="Python SDK">
```python
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",  # Replace with your Portkey API key
    virtual_key="NSCALE_VIRTUAL_KEY"  # Your Nscale Virtual Key
)
```
</Tab>
</Tabs>
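
If you prefer not to create a virtual key, the provider slug from the note above can be used to route requests directly. A minimal sketch, assuming the SDK accepts the `provider` and `Authorization` pass-through parameters it offers for other Portkey integrations:

```python
from portkey_ai import Portkey

# Sketch: route by provider slug instead of a virtual key.
# Assumes `provider` and `Authorization` are forwarded as gateway headers.
portkey = Portkey(
    api_key="PORTKEY_API_KEY",             # Your Portkey API key
    provider="nscale",                     # Provider slug from the note above
    Authorization="Bearer NSCALE_API_KEY"  # Your Nscale API key
)
```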

### 3. Invoke Chat Completions

<Tabs>
<Tab title="NodeJS SDK">
```js
const chatCompletion = await portkey.chat.completions.create({
  messages: [{ role: 'user', content: 'Say this is a test' }],
  model: 'meta-llama/Llama-4-Scout-17B-16E-Instruct',
});

console.log(chatCompletion.choices);
```
</Tab>
<Tab title="Python SDK">
```python
completion = portkey.chat.completions.create(
    messages=[{"role": "user", "content": "Say this is a test"}],
    model="meta-llama/Llama-4-Scout-17B-16E-Instruct"
)

print(completion)
```
</Tab>
</Tabs>
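
Streaming works through the same call. A minimal sketch in Python, assuming the SDK mirrors the OpenAI-style `stream=True` interface it exposes for other providers:

```python
# Stream the response token by token (incremental deltas, OpenAI-style chunks).
stream = portkey.chat.completions.create(
    messages=[{"role": "user", "content": "Write a one-line haiku about clouds"}],
    model="meta-llama/Llama-4-Scout-17B-16E-Instruct",
    stream=True
)

for chunk in stream:
    # Skip chunks that carry no content delta (e.g. role-only or final chunks).
    if chunk.choices and getattr(chunk.choices[0].delta, "content", None):
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
```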

### 4. Invoke Image Generation

<Tabs>
<Tab title="NodeJS SDK">
```js
const response = await portkey.images.generations.create({
  prompt: "A beautiful sunset over mountains",
  model: "stabilityai/stable-diffusion-xl-base-1.0",
  n: 1,
  size: "1024x1024"
});

console.log(response.data[0].url);
```
</Tab>
<Tab title="Python SDK">
```python
response = portkey.images.generate(
    prompt="A beautiful sunset over mountains",
    model="stabilityai/stable-diffusion-xl-base-1.0",
    n=1,
    size="1024x1024"
)

print(response.data[0].url)
```
</Tab>
</Tabs>
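
To persist the result, the returned URL can be downloaded with the standard library. A minimal sketch, assuming the response exposes a downloadable `url` as in the Python example above (some image endpoints return base64 payloads instead) and using a hypothetical filename:

```python
import urllib.request

# Download the generated image to disk; `response` comes from the example above.
image_url = response.data[0].url
urllib.request.urlretrieve(image_url, "sunset.png")
print("Saved generated image to sunset.png")
```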

---

## Supported Models

<CardGroup cols={1}>
<Card title="View Available Models" icon="list" href="https://docs.nscale.com/docs/inference/serverless-models/current">
Explore the complete list of available models in Nscale's documentation, including chat models, image generation models, and their pricing details.
</Card>
</CardGroup>

---

## Next Steps

The complete list of features supported in the SDK is available at the link below.

<Card title="SDK" href="/api-reference/portkey-sdk-client">
</Card>

You'll find more information in the relevant sections:

1. [Add metadata to your requests](/product/observability/metadata)
2. [Add gateway configs to your Nscale requests](/product/ai-gateway/configs)
3. [Tracing Nscale requests](/product/observability/traces)
4. [Set up a fallback from OpenAI to Nscale](/product/ai-gateway/fallbacks), sketched below
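
As a starting point for item 4, here is a minimal fallback config sketch. It assumes the standard Portkey config shape (a `strategy` block plus ordered `targets`) and uses hypothetical virtual key names; adjust the models and keys to your setup:

```python
from portkey_ai import Portkey

# Sketch: try OpenAI first, fall back to Nscale if the request fails.
# Virtual key names and models below are placeholders, not real values.
config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"virtual_key": "OPENAI_VIRTUAL_KEY",
         "override_params": {"model": "gpt-4o"}},
        {"virtual_key": "NSCALE_VIRTUAL_KEY",
         "override_params": {"model": "meta-llama/Llama-4-Scout-17B-16E-Instruct"}},
    ],
}

portkey = Portkey(api_key="PORTKEY_API_KEY", config=config)

completion = portkey.chat.completions.create(
    messages=[{"role": "user", "content": "Say this is a test"}]
)
print(completion)
```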
