This library provides convenient access to the OpenAI REST API in server-side TypeScript or JavaScript applications. It is generated from our [OpenAPI specification](https://github.com/openai/openai-openapi) with [Stainless](https://stainlessapi.com/). To learn how to use the OpenAI API, check out our [API Reference](https://platform.openai.com/docs/api-reference) and [Documentation](https://platform.openai.com/docs).
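Here is roughly what using the client looks like (a minimal sketch, assuming an `OPENAI_API_KEY` environment variable is set; see the API Reference and Documentation linked above for details):

```ts
import OpenAI from 'openai';

// The client reads OPENAI_API_KEY from the environment by default.
const client = new OpenAI();

const completion = await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Say this is a test' }],
});

console.log(completion.choices[0].message.content);
```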
## Installation
This module is distributed on both the [npm](https://www.npmjs.com/package/openai) and [JSR](https://jsr.io/@openai/openai) registries.
**Install from npm**
```sh
npm install openai
yarn add openai
pnpm add openai
bun install openai
```
These commands will make the module importable as the default export from `openai` in JavaScript runtimes that use npm (e.g. Node.js, Cloudflare Workers, Bun):

```ts
import OpenAI from 'openai';
```
**Install from JSR**
```sh
deno add jsr:@openai/openai
npx jsr add @openai/openai
yarn dlx jsr add @openai/openai
pnpm dlx jsr add @openai/openai
bunx jsr add @openai/openai
```
These commands will make the module importable from the `@openai` scope:
```ts
import OpenAI from "@openai/openai";
```
You can also [import directly from JSR](https://jsr.io/docs/using-packages#importing-with-jsr-specifiers) without an install step if you're using the Deno JavaScript runtime.
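For example, a minimal Deno sketch using a `jsr:` specifier:

```ts
// Deno resolves the package directly from the JSR registry via the `jsr:`
// specifier, so no prior `deno add` step is required.
import OpenAI from 'jsr:@openai/openai';

const client = new OpenAI();
```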
### Request IDs

All object responses in the SDK provide a `_request_id` property which is added from the `x-request-id` response header so that you can quickly log failing requests and report them back to OpenAI.
```ts
const completion = await client.chat.completions.create({ messages: [{ role: 'user', content: 'Say this is a test' }], model: 'gpt-4o' });
console.log(completion._request_id); // req_123
```

### Retries

Certain errors are automatically retried 2 times by default, with a short exponential backoff. You can use the `maxRetries` option to configure or disable this, for example on a per-request basis:

```ts
await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [
    {
      role: 'user',
      content: 'How can I get the name of the current day in JavaScript?',
    }
  ]
}, {
  maxRetries: 5,
});
```
### Timeouts

Requests time out after 10 minutes by default. You can configure this with a `timeout` option:

```ts
// Configure the default for all requests:
const client = new OpenAI({
  timeout: 20 * 1000, // 20 seconds (default is 10 minutes)
});

// Override per-request:
await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [
    {
      role: 'user',
      content: 'How can I list all files in a directory using Python?'
    }
  ]
}, {
  timeout: 5 * 1000,
});
```
### Accessing raw Response data (e.g., headers)

The "raw" `Response` returned by `fetch()` can be accessed through the `.asResponse()` method on the `APIPromise` that all methods return. You can also use the `.withResponse()` method to get the raw `Response` along with the parsed data:

```ts
const client = new OpenAI();

const response = await client.chat.completions
  .create({ messages: [{ role: 'user', content: 'Say this is a test' }], model: 'gpt-4o' })
  .asResponse();
console.log(response.headers.get('X-My-Header'));
console.log(response.statusText); // access the underlying Response object

const { data: chatCompletion, response: raw } = await client.chat.completions
  .create({ messages: [{ role: 'user', content: 'Say this is a test' }], model: 'gpt-4o' })
  .withResponse();
console.log(raw.headers.get('X-My-Header'));
console.log(chatCompletion);
```

## Requirements

TypeScript >= 4.5 is supported.

The following runtimes are supported:

- Node.js 18 LTS or later ([non-EOL](https://endoflife.date/nodejs)) versions.
- Deno v1.28.0 or higher, using `import OpenAI from "npm:openai"`.