
Commit a330c2e

Prepare 2.0.0-beta.13 release (Part 1) (#229)
- Refactored `ModerationResult` by merging `ModerationCategories` and `ModerationCategoryScores` into individual `ModerationCategory` properties, each with `Flagged` and `Score` properties.
- Renamed type `OpenAIFileInfo` to `OpenAIFile` and `OpenAIFileInfoCollection` to `OpenAIFileCollection`.
- Renamed type `OpenAIModelInfo` to `OpenAIModel` and `OpenAIModelInfoCollection` to `OpenAIModelCollection`.
- Renamed type `Embedding` to `OpenAIEmbedding` and `EmbeddingCollection` to `OpenAIEmbeddingCollection`.
- Renamed property `ImageUrl` to `ImageUri` and method `FromImageUrl` to `FromImageUri` in the `MessageContent` type.
- Renamed property `ParallelToolCallsEnabled` to `AllowParallelToolCalls` in the `RunCreationOptions`, `ThreadRun`, and `ChatCompletionOptions` types.
- Renamed properties `PromptTokens` to `InputTokenCount`, `CompletionTokens` to `OutputTokenCount`, and `TotalTokens` to `TotalTokenCount` in the `RunTokenUsage` and `RunStepTokenUsage` types.
- Renamed properties `InputTokens` to `InputTokenCount` and `TotalTokens` to `TotalTokenCount` in the `EmbeddingTokenUsage` type.
- Renamed properties `MaxPromptTokens` to `MaxInputTokenCount` and `MaxCompletionTokens` to `MaxOutputTokenCount` in the `ThreadRun`, `RunCreationOptions`, and `RunIncompleteReason` types.
- Removed the `virtual` keyword from the `Pipeline` property across all clients.
- Renamed the `Granularities` property of `AudioTranscriptionOptions` to `TimestampGranularities`.
- Changed `AudioTranscriptionFormat` from an enum to an "extensible enum".
- Changed `AudioTranslationFormat` from an enum to an "extensible enum".
- Changed `GenerateImageFormat` from an enum to an "extensible enum".
- Changed `GeneratedImageQuality` from an enum to an "extensible enum".
- Changed `GeneratedImageStyle` from an enum to an "extensible enum".
- Removed method overloads in `AssistantClient` and `VectorStoreClient` that take complex parameters in favor of methods that take simple string IDs.
- Updated the `TokenIds` property type in the `TranscribedSegment` type from `IReadOnlyList<int>` to `ReadOnlyMemory<int>`.
- Updated the `inputs` parameter type in the `GenerateEmbeddings` and `GenerateEmbeddingsAsync` methods of `EmbeddingClient` from `IEnumerable<IEnumerable<int>>` to `IEnumerable<ReadOnlyMemory<int>>`.
- Changed `ChatMessageContentPartKind` from an extensible enum to an enum.
- Changed `ChatToolCallKind` from an extensible enum to an enum.
- Changed `ChatToolKind` from an extensible enum to an enum.
- Changed `OpenAIFilePurpose` from an extensible enum to an enum.
- Changed `OpenAIFileStatus` from an extensible enum to an enum.
- Renamed `OpenAIFilePurpose` to `FilePurpose`.
- Renamed `OpenAIFileStatus` to `FileStatus`.
- Removed constructors that take string API key and options.
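Most of the renames above are mechanical find-and-replace migrations. As a sketch of the embeddings change (mirroring the README example in this commit; it requires the OpenAI NuGet package and an `OPENAI_API_KEY` environment variable, so it is illustrative rather than runnable here), the `Embedding` to `OpenAIEmbedding` rename and the move from the `Vector` property to `ToFloats()` look like this:

```csharp
using System;
using OpenAI.Embeddings;

EmbeddingClient client = new("text-embedding-3-small", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

string description = "Best hotel in town if you like luxury hotels.";

// Before (2.0.0-beta.12): the result type was named `Embedding`
// and exposed its data through the `Vector` property:
//   Embedding embedding = client.GenerateEmbedding(description);
//   ReadOnlyMemory<float> vector = embedding.Vector;

// After (2.0.0-beta.13): the type is `OpenAIEmbedding`, and the
// vector is retrieved with the `ToFloats()` method.
OpenAIEmbedding embedding = client.GenerateEmbedding(description);
ReadOnlyMemory<float> vector = embedding.ToFloats();
```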
1 parent 75eded5 · commit a330c2e

File tree: 139 files changed (+4544, −3059 lines)


.github/workflows/live-test.yml

Lines changed: 1 addition & 1 deletion

@@ -27,7 +27,7 @@ jobs:
   - name: Run live tests
     run: dotnet test ./tests/OpenAI.Tests.csproj
       --configuration Release
-      --filter="TestCategory!=Smoke&TestCategory!=Images&TestCategory!=Uploads&TestCategory!=Moderations&TestCategory!=Manual"
+      --filter="TestCategory!=Smoke&TestCategory!=Images&TestCategory!=Uploads&TestCategory!=Moderations&TestCategory!=FineTuning&TestCategory!=Manual"
       --logger "trx;LogFilePrefix=live"
       --results-directory ${{github.workspace}}/artifacts/test-results
       ${{ env.version_suffix_args}}
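The `--filter` argument above uses the `dotnet test` selective-test expression syntax, in which `&` is a logical AND and `TestCategory!=X` excludes tests tagged with category `X`; this commit adds `TestCategory!=FineTuning` to the exclusion list. Run locally (assuming the repository's test project is present), the equivalent invocation would be roughly:

```shell
dotnet test ./tests/OpenAI.Tests.csproj \
  --configuration Release \
  --filter "TestCategory!=Smoke&TestCategory!=Images&TestCategory!=Uploads&TestCategory!=Moderations&TestCategory!=FineTuning&TestCategory!=Manual"
```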

.github/workflows/release.yml

Lines changed: 2 additions & 2 deletions

@@ -45,11 +45,11 @@ jobs:
       --filter="TestCategory=Smoke&TestCategory!=Manual"
       --logger "trx;LogFileName=${{ github.workspace }}/artifacts/test-results/smoke.trx"
       ${{ env.version_suffix_args }}
-
+
   - name: Run Live Tests
     run: dotnet test ./tests/OpenAI.Tests.csproj
       --configuration Release
-      --filter="TestCategory!=Smoke&TestCategory!=Images&TestCategory!=Uploads&TestCategory!=Moderations&TestCategory!=Manual"
+      --filter="TestCategory!=Smoke&TestCategory!=Images&TestCategory!=Uploads&TestCategory!=Moderations&TestCategory!=FineTuning&TestCategory!=Manual"
       --logger "trx;LogFilePrefix=live"
       --results-directory ${{ github.workspace }}/artifacts/test-results
       ${{ env.version_suffix_args }}

CHANGELOG.md

Lines changed: 27 additions & 10 deletions

@@ -6,16 +6,33 @@
 
 ### Breaking Changes
 
-- Refactored `ModerationResult` by merging `ModerationCategories` and `ModerationCategoryScores` into individual `ModerationCategory` properties, each with `Flagged` and `Score` properties. (commit_id)
-- Renamed type `OpenAIFileInfo` to `OpenAIFile` and `OpenAIFileInfoCollection` to `OpenAIFileCollection`. (commit_id)
-- Renamed type `OpenAIModelInfo` to `OpenAIModel` and `OpenAIModelInfoCollection` to `OpenAIModelCollection`. (commit_id)
-- Renamed type `Embedding` to `OpenAIEmbedding` and `EmbeddingCollection` to `OpenAIEmbeddingCollection`. (commit_id)
-- Renamed property `ImageUrl` to `ImageUri` and method `FromImageUrl` to `FromImageUri` in the `MessageContent` type. (commit_id)
-- Renamed property `ParallelToolCallsEnabled` to `AllowParallelToolCalls` in the `RunCreationOptions`, `ThreadRun`, and `ChatCompletionOptions` types. (commit_id)
-- Renamed properties `PromptTokens` to `InputTokenCount`, `CompletionTokens` to `OutputTokenCount`, and `TotalTokens` to `TotalTokenCount` in the `RunTokenUsage` and `RunStepTokenUsage` types. (commit_id)
-- Renamed properties `InputTokens` to `InputTokenCount` and `TotalTokens` to `TotalTokenCount` in the `EmbeddingTokenUsage` type. (commit_id)
-- Renamed properties `MaxPromptTokens` to `MaxInputTokenCount` and `MaxCompletionTokens` to `MaxOutputTokenCount` in the `ThreadRun`, `RunCreationOptions`, and `RunIncompleteReason` types. (commit_id)
-- Removed the `virtual` keyword from the `Pipeline` property across all clients. (commit_id)
+- Refactored `ModerationResult` by merging `ModerationCategories` and `ModerationCategoryScores` into individual `ModerationCategory` properties, each with `Flagged` and `Score` properties. (commit_hash)
+- Renamed type `OpenAIFileInfo` to `OpenAIFile` and `OpenAIFileInfoCollection` to `OpenAIFileCollection`. (commit_hash)
+- Renamed type `OpenAIModelInfo` to `OpenAIModel` and `OpenAIModelInfoCollection` to `OpenAIModelCollection`. (commit_hash)
+- Renamed type `Embedding` to `OpenAIEmbedding` and `EmbeddingCollection` to `OpenAIEmbeddingCollection`. (commit_hash)
+- Renamed property `ImageUrl` to `ImageUri` and method `FromImageUrl` to `FromImageUri` in the `MessageContent` type. (commit_hash)
+- Renamed property `ParallelToolCallsEnabled` to `AllowParallelToolCalls` in the `RunCreationOptions`, `ThreadRun`, and `ChatCompletionOptions` types. (commit_hash)
+- Renamed properties `PromptTokens` to `InputTokenCount`, `CompletionTokens` to `OutputTokenCount`, and `TotalTokens` to `TotalTokenCount` in the `RunTokenUsage` and `RunStepTokenUsage` types. (commit_hash)
+- Renamed properties `InputTokens` to `InputTokenCount` and `TotalTokens` to `TotalTokenCount` in the `EmbeddingTokenUsage` type. (commit_hash)
+- Renamed properties `MaxPromptTokens` to `MaxInputTokenCount` and `MaxCompletionTokens` to `MaxOutputTokenCount` in the `ThreadRun`, `RunCreationOptions`, and `RunIncompleteReason` types. (commit_hash)
+- Removed the `virtual` keyword from the `Pipeline` property across all clients. (commit_hash)
+- Renamed the `Granularities` property of `AudioTranscriptionOptions` to `TimestampGranularities`. (commit_hash)
+- Changed `AudioTranscriptionFormat` from an enum to an "extensible enum". (commit_hash)
+- Changed `AudioTranslationFormat` from an enum to an "extensible enum". (commit_hash)
+- Changed `GenerateImageFormat` from an enum to an "extensible enum". (commit_hash)
+- Changed `GeneratedImageQuality` from an enum to an "extensible enum". (commit_hash)
+- Changed `GeneratedImageStyle` from an enum to an "extensible enum". (commit_hash)
+- Removed method overloads in `AssistantClient` and `VectorStoreClient` that take complex parameters in favor of methods that take simple string IDs. (commit_hash)
+- Updated the `TokenIds` property type in the `TranscribedSegment` type from `IReadOnlyList<int>` to `ReadOnlyMemory<int>`. (commit_hash)
+- Updated the `inputs` parameter type in the `GenerateEmbeddings` and `GenerateEmbeddingsAsync` methods of `EmbeddingClient` from `IEnumerable<IEnumerable<int>>` to `IEnumerable<ReadOnlyMemory<int>>`. (commit_hash)
+- Changed `ChatMessageContentPartKind` from an extensible enum to an enum. (commit_hash)
+- Changed `ChatToolCallKind` from an extensible enum to an enum. (commit_hash)
+- Changed `ChatToolKind` from an extensible enum to an enum. (commit_hash)
+- Changed `OpenAIFilePurpose` from an extensible enum to an enum. (commit_hash)
+- Changed `OpenAIFileStatus` from an extensible enum to an enum. (commit_hash)
+- Renamed `OpenAIFilePurpose` to `FilePurpose`. (commit_hash)
+- Renamed `OpenAIFileStatus` to `FileStatus`. (commit_hash)
+- Removed constructors that take string API key and options. (commit_hash)
 
 ### Bugs Fixed
 
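Several changelog entries above move types between a sealed `enum` and an "extensible enum". A minimal sketch of that pattern, common in .NET client libraries, is a readonly struct that wraps a string: known values are exposed as static properties, but arbitrary strings are also accepted, so new service-side values do not break older clients. The type name and values below are illustrative, not the library's actual implementation:

```csharp
using System;

// Illustrative "extensible enum": equality compares the underlying
// wire value, and unknown strings round-trip to the service unchanged.
public readonly struct ImageStyleKind : IEquatable<ImageStyleKind>
{
    private readonly string _value;

    public ImageStyleKind(string value)
        => _value = value ?? throw new ArgumentNullException(nameof(value));

    // Known values are static properties rather than enum members.
    public static ImageStyleKind Vivid { get; } = new("vivid");
    public static ImageStyleKind Natural { get; } = new("natural");

    public bool Equals(ImageStyleKind other)
        => string.Equals(_value, other._value, StringComparison.Ordinal);

    public override bool Equals(object? obj)
        => obj is ImageStyleKind other && Equals(other);

    public override int GetHashCode()
        => _value is null ? 0 : _value.GetHashCode();

    public override string ToString() => _value ?? string.Empty;
}
```

Because the wrapper accepts any string, a value such as `new ImageStyleKind("sketch")` can be sent even before the SDK knows about it; this is also why the reverse migration (extensible enum back to a plain `enum`, as done for `ChatToolKind` and others above) is itself a breaking change.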
README.md

Lines changed: 78 additions & 30 deletions

@@ -26,6 +26,7 @@ It is generated from our [OpenAPI specification](https://github.com/openai/opena
 - [How to work with Azure OpenAI](#how-to-work-with-azure-openai)
 - [Advanced scenarios](#advanced-scenarios)
   - [Using protocol methods](#using-protocol-methods)
+  - [Mock a client for testing](#mock-a-client-for-testing)
   - [Automatically retrying errors](#automatically-retrying-errors)
   - [Observability](#observability)

@@ -129,7 +130,7 @@ foreach (StreamingChatCompletionUpdate update in updates)
 {
     foreach (ChatMessageContentPart updatePart in update.ContentUpdate)
     {
-        Console.Write(updatePart);
+        Console.Write(updatePart.Text);
     }
 }
 ```

@@ -309,7 +310,7 @@ To use structured outputs to constrain chat completion content, set an appropria
 ChatCompletionOptions options = new()
 {
     ResponseFormat = ChatResponseFormat.CreateJsonSchemaFormat(
-        name: "math_reasoning",
+        jsonSchemaFormatName: "math_reasoning",
         jsonSchema: BinaryData.FromString("""
             {
                 "type": "object",

@@ -332,15 +333,15 @@ ChatCompletionOptions options = new()
                 "additionalProperties": false
             }
             """),
-        strictSchemaEnabled: true)
+        jsonSchemaIsStrict: true)
 };
 
 ChatCompletion chatCompletion = await client.CompleteChatAsync(
     ["How can I solve 8x + 7 = -23?"],
     options);
 
 using JsonDocument structuredJson = JsonDocument.Parse(chatCompletion.ToString());
-
+
 Console.WriteLine($"Final answer: {structuredJson.RootElement.GetProperty("final_answer").GetString()}");
 Console.WriteLine("Reasoning steps:");

@@ -360,22 +361,22 @@ To generate a text embedding, use `EmbeddingClient` from the `OpenAI.Embeddings`
 ```csharp
 using OpenAI.Embeddings;
 
-EmbeddingClient client = new(model: "text-embedding-3-small", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));
+EmbeddingClient client = new("text-embedding-3-small", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));
 
 string description = "Best hotel in town if you like luxury hotels. They have an amazing infinity pool, a spa,"
     + " and a really helpful concierge. The location is perfect -- right downtown, close to all the tourist"
     + " attractions. We highly recommend this hotel.";
 
-Embedding embedding = client.GenerateEmbedding(description);
-ReadOnlyMemory<float> vector = embedding.Vector;
+OpenAIEmbedding embedding = client.GenerateEmbedding(description);
+ReadOnlyMemory<float> vector = embedding.ToFloats();
 ```
 
 Notice that the resulting embedding is a list (also called a vector) of floating point numbers represented as an instance of `ReadOnlyMemory<float>`. By default, the length of the embedding vector will be 1536 when using the `text-embedding-3-small` model or 3072 when using the `text-embedding-3-large` model. Generally, larger embeddings perform better, but using them also tends to cost more in terms of compute, memory, and storage. You can reduce the dimensions of the embedding by creating an instance of the `EmbeddingGenerationOptions` class, setting the `Dimensions` property, and passing it as an argument in your call to the `GenerateEmbedding` method:
 
 ```csharp
 EmbeddingGenerationOptions options = new() { Dimensions = 512 };
 
-Embedding embedding = client.GenerateEmbedding(description, options);
+OpenAIEmbedding embedding = client.GenerateEmbedding(description, options);
 ```
 
 ## How to generate images

@@ -387,7 +388,7 @@ To generate an image, use `ImageClient` from the `OpenAI.Images` namespace:
 ```csharp
 using OpenAI.Images;
 
-ImageClient client = new(model: "dall-e-3", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));
+ImageClient client = new("dall-e-3", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));
 ```
 
 Generating an image always requires a `prompt` that describes what should be generated. To further tailor the image generation to your specific needs, you can create an instance of the `ImageGenerationOptions` class and set the `Quality`, `Size`, and `Style` properties accordingly. Note that you can also set the `ResponseFormat` property of `ImageGenerationOptions` to `GeneratedImageFormat.Bytes` in order to receive the resulting PNG as `BinaryData` (instead of the default remote `Uri`) if this is convenient for your use case.

@@ -431,14 +432,14 @@ In this example, an audio file is transcribed using the Whisper speech-to-text m
 ```csharp
 using OpenAI.Audio;
 
-AudioClient client = new(model: "whisper-1", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));
+AudioClient client = new("whisper-1", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));
 
 string audioFilePath = Path.Combine("Assets", "audio_houseplant_care.mp3");
 
 AudioTranscriptionOptions options = new()
 {
     ResponseFormat = AudioTranscriptionFormat.Verbose,
-    Granularities = AudioTimestampGranularities.Word | AudioTimestampGranularities.Segment,
+    TimestampGranularities = AudioTimestampGranularities.Word | AudioTimestampGranularities.Segment,
 };
 
 AudioTranscription transcription = client.TranscribeAudio(audioFilePath, options);

@@ -450,14 +451,14 @@ Console.WriteLine();
 Console.WriteLine($"Words:");
 foreach (TranscribedWord word in transcription.Words)
 {
-    Console.WriteLine($"  {word.Word,15} : {word.Start.TotalMilliseconds,5:0} - {word.End.TotalMilliseconds,5:0}");
+    Console.WriteLine($"  {word.Word,15} : {word.StartTime.TotalMilliseconds,5:0} - {word.EndTime.TotalMilliseconds,5:0}");
 }
 
 Console.WriteLine();
 Console.WriteLine($"Segments:");
 foreach (TranscribedSegment segment in transcription.Segments)
 {
-    Console.WriteLine($"  {segment.Text,90} : {segment.Start.TotalMilliseconds,5:0} - {segment.End.TotalMilliseconds,5:0}");
+    Console.WriteLine($"  {segment.Text,90} : {segment.StartTime.TotalMilliseconds,5:0} - {segment.EndTime.TotalMilliseconds,5:0}");
 }
 ```

@@ -516,7 +517,7 @@ using Stream document = BinaryData.FromString("""
 Upload this document to OpenAI using the `FileClient`'s `UploadFile` method, ensuring that you use `FileUploadPurpose.Assistants` to allow your assistant to access it later:
 
 ```csharp
-OpenAIFileInfo salesFile = fileClient.UploadFile(
+OpenAIFile salesFile = fileClient.UploadFile(
     document,
     "monthly_sales.json",
     FileUploadPurpose.Assistants);

@@ -584,8 +585,8 @@ Finally, you can use the `AssistantClient`'s `GetMessages` method to retrieve th
 For illustrative purposes, you could print the messages to the console and also save any images produced by the assistant to local storage:
 
 ```csharp
-PageCollection<ThreadMessage> messagePages = assistantClient.GetMessages(threadRun.ThreadId, new MessageCollectionOptions() { Order = ListOrder.OldestFirst });
-IEnumerable<ThreadMessage> messages = messagePages.GetAllValues();
+CollectionResult<ThreadMessage> messages
+    = assistantClient.GetMessages(threadRun.ThreadId, new MessageCollectionOptions() { Order = MessageCollectionOrder.Ascending });
 
 foreach (ThreadMessage message in messages)
 {

@@ -616,7 +617,7 @@ foreach (ThreadMessage message in messages)
     }
     if (!string.IsNullOrEmpty(contentItem.ImageFileId))
     {
-        OpenAIFileInfo imageInfo = fileClient.GetFile(contentItem.ImageFileId);
+        OpenAIFile imageInfo = fileClient.GetFile(contentItem.ImageFileId);
         BinaryData imageBytes = fileClient.DownloadFile(contentItem.ImageFileId);
         using FileStream stream = File.OpenWrite($"{imageInfo.Filename}.png");
         imageBytes.ToStream().CopyTo(stream);

@@ -666,8 +667,8 @@ AssistantClient assistantClient = openAIClient.GetAssistantClient();
 For this example, we will use both image data from a local file as well as an image located at a URL. For the local data, we upload the file with the `Vision` upload purpose, which would also allow it to be downloaded and retrieved later.
 
 ```csharp
-OpenAIFileInfo pictureOfAppleFile = fileClient.UploadFile(
-    "picture-of-apple.jpg",
+OpenAIFile pictureOfAppleFile = fileClient.UploadFile(
+    Path.Combine("Assets", "picture-of-apple.png"),
     FileUploadPurpose.Vision);
 Uri linkToPictureOfOrange = new("https://platform.openai.com/fictitious-files/picture-of-orange.png");
 ```

@@ -676,7 +677,7 @@ Next, create a new assistant with a vision-capable model like `gpt-4o` and a thr
 ```csharp
 Assistant assistant = assistantClient.CreateAssistant(
-    model: "gpt-4o",
+    "gpt-4o",
     new AssistantCreationOptions()
     {
         Instructions = "When asked a question, attempt to answer very concisely. "

@@ -686,23 +687,24 @@ Assistant assistant = assistantClient.CreateAssistant(
 AssistantThread thread = assistantClient.CreateThread(new ThreadCreationOptions()
 {
     InitialMessages =
-    {
-        new ThreadInitializationMessage(
-            [
-                "Hello, assistant! Please compare these two images for me:",
-                MessageContent.FromImageFileId(pictureOfAppleFile.Id),
-                MessageContent.FromImageUrl(linkToPictureOfOrange),
-            ]),
-    }
+    {
+        new ThreadInitializationMessage(
+            MessageRole.User,
+            [
+                "Hello, assistant! Please compare these two images for me:",
+                MessageContent.FromImageFileId(pictureOfAppleFile.Id),
+                MessageContent.FromImageUri(linkToPictureOfOrange),
+            ]),
+    }
 });
 ```
 
 With the assistant and thread prepared, use the `CreateRunStreaming` method to get an enumerable `CollectionResult<StreamingUpdate>`. You can then iterate over this collection with `foreach`. For async calling patterns, use `CreateRunStreamingAsync` and iterate over the `AsyncCollectionResult<StreamingUpdate>` with `await foreach`, instead. Note that streaming variants also exist for `CreateThreadAndRunStreaming` and `SubmitToolOutputsToRunStreaming`.
 
 ```csharp
 CollectionResult<StreamingUpdate> streamingUpdates = assistantClient.CreateRunStreaming(
-    thread,
-    assistant,
+    thread.Id,
+    assistant.Id,
     new RunCreationOptions()
     {
         AdditionalInstructions = "When possible, try to sneak in puns if you're asked to compare things.",

@@ -795,6 +797,52 @@ string message = outputAsJson.RootElement
 
 Notice how you can then call the resulting `ClientResult`'s `GetRawResponse` method and retrieve the response body as `BinaryData` via the `PipelineResponse`'s `Content` property.
 
+### Mock a client for testing
+
+The OpenAI .NET library has been designed to support mocking, providing key features such as:
+- Client methods made virtual to allow overriding.
+- Model factories to assist in instantiating API output models that lack public constructors.
+
+To illustrate how mocking works, suppose you want to validate the behavior of the following method using the [Moq](https://github.com/devlooped/moq) library. Given the path to an audio file, it determines whether it contains a specified secret word:
+
+```csharp
+public bool ContainsSecretWord(AudioClient client, string audioFilePath, string secretWord)
+{
+    AudioTranscription transcription = client.TranscribeAudio(audioFilePath);
+    return transcription.Text.Contains(secretWord);
+}
+```
+
+Create mocks of `AudioClient` and `ClientResult<AudioTranscription>`, set up methods and properties that will be invoked, then test the behavior of the `ContainsSecretWord` method. Since the `AudioTranscription` class does not provide public constructors, it must be instantiated by the `OpenAIAudioModelFactory` static class:
+
+```csharp
+// Instantiate mocks and the AudioTranscription object.
+Mock<AudioClient> mockClient = new();
+Mock<ClientResult<AudioTranscription>> mockResult = new(null, Mock.Of<PipelineResponse>());
+AudioTranscription transcription = OpenAIAudioModelFactory.AudioTranscription(text: "I swear I saw an apple flying yesterday!");
+
+// Set up mocks' properties and methods.
+mockResult
+    .SetupGet(result => result.Value)
+    .Returns(transcription);
+
+mockClient.Setup(client => client.TranscribeAudio(
+        It.IsAny<string>(),
+        It.IsAny<AudioTranscriptionOptions>()))
+    .Returns(mockResult.Object);
+
+// Perform validation.
+AudioClient client = mockClient.Object;
+bool containsSecretWord = ContainsSecretWord(client, "<audioFilePath>", "apple");
+
+Assert.That(containsSecretWord, Is.True);
+```
+
+All namespaces have their corresponding model factory to support mocking with the exception of the `OpenAI.Assistants` and `OpenAI.VectorStores` namespaces, for which model factories are coming soon.
+
 ### Automatically retrying errors
 
 By default, the client classes will automatically retry the following errors up to three additional times using exponential backoff:

0 commit comments