Commit d2ccb54

Merge commit: 2 parents c02ccc3 + 5b5f6f6

121 files changed: +2587 / -1372 lines


.github/workflows/release.yml

Lines changed: 60 additions & 2 deletions
```diff
@@ -69,16 +69,67 @@ jobs:
           name: build-artifacts
           path: ${{ github.workspace }}/artifacts
 
+  sign:
+    needs: build
+    runs-on: windows-latest # Code signing must run on a Windows agent for Authenticode signing (dll/exe)
+    environment: release # Needed for OIDC subject for releases triggered on release being created.
+    permissions:
+      id-token: write # Required for requesting the JWT
+
+    steps:
+      - name: Download build artifacts
+        uses: actions/download-artifact@v4
+        with:
+          name: build-artifacts
+          path: ${{ github.workspace }}/build-artifacts
+
+      - name: Setup .NET
+        uses: actions/setup-dotnet@v3
+        with:
+          dotnet-version: '9.x'
+
+      - name: Install Sign CLI tool
+        run: dotnet tool install --tool-path . --prerelease sign
+
+      - name: 'Az CLI login'
+        uses: azure/login@v2
+        with:
+          client-id: 80125de0-6f58-4f16-bd05-b2fa621d36a5
+          tenant-id: 16076fdc-fcc1-4a15-b1ca-32c9a255900e
+          allow-no-subscriptions: true
+
+      - name: Sign artifacts
+        shell: pwsh
+        run: >
+          ./sign code azure-key-vault
+          **/*.nupkg
+          --base-directory "${{ github.workspace }}/build-artifacts/packages"
+          --publisher-name "OpenAI"
+          --description "OpenAI library for .NET"
+          --description-url "https://github.com/openai/openai-dotnet"
+          --azure-credential-type "azure-cli"
+          --azure-key-vault-url "https://sc-openaisdk.vault.azure.net/"
+          --azure-key-vault-certificate "OpenAISDKSCCert"
+
+      - name: Upload signed artifact
+        uses: actions/upload-artifact@v4
+        with:
+          name: build-artifacts-signed
+          path: ${{ github.workspace }}/build-artifacts
+
   deploy:
     name: Publish Package
-    needs: build
+    needs: sign
     runs-on: ubuntu-latest
     steps:
       - name: Checkout code
         uses: actions/checkout@v2
 
       - name: Download build artifacts
         uses: actions/download-artifact@v4
+        with:
+          name: build-artifacts-signed
+          path: ${{ github.workspace }}/build-artifacts
 
       - name: Upload release asset
         if: github.event_name == 'release'
@@ -92,6 +143,13 @@ jobs:
         run: |
           gh release edit "${{ github.event.release.tag_name }}" \
             --notes "See full changelog: ${{ github.server_url }}/${{ github.repository }}/blob/${{ github.event.release.tag_name }}/CHANGELOG.md"
+        env:
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+
+      - name: Setup .NET
+        uses: actions/setup-dotnet@v3
+        with:
+          dotnet-version: '9.x'
 
       - name: NuGet authenticate
         run: dotnet nuget add source
@@ -114,4 +172,4 @@ jobs:
           ${{ github.workspace }}/build-artifacts/packages/*.nupkg
          --source https://api.nuget.org/v3/index.json
          --api-key ${{ secrets.NUGET_API_KEY }}
-          --skip-duplicate
+          --skip-duplicate
```

CHANGELOG.md

Lines changed: 4 additions & 4 deletions
```diff
@@ -4,9 +4,9 @@
 
 ### Other changes
 
-- Updated to `System.ClientModel` 1.5.1, which contains a fix for a concurrency bug which could cause some applications running on the legacy .NET Framework to experience an infinite loop while deserializing service responses.
-
+- Updated to `System.ClientModel` 1.5.1, which contains a fix for a concurrency bug which could cause some applications running on the legacy .NET Framework to experience an infinite loop while deserializing service responses.
 - Removed explicit `net6.0` target framework, as this version reached end-of-life in November, 2024 and is no longer maintained nor supported by Microsoft. This does not prevent using the OpenAI library on .NET 6, as the runtime will fallback to the `netstandard2.0` target.
+- The NuGet package now contains signed binaries.
 
 ## 2.2.0 (2025-07-02)
 
@@ -283,7 +283,7 @@
 ### Features Added
 
 - Added a new `RealtimeConversationClient` in a corresponding scenario namespace. ([ff75da4](https://github.com/openai/openai-dotnet/commit/ff75da4167bc83fa85eb69ac142cab88a963ed06))
-  - This maps to the new `/realtime` beta endpoint and is thus marked with a new `[Experimental("OPENAI002")]` diagnostic tag.
+  - This maps to the new `/realtime` beta endpoint and is thus marked with a new `[Experimental("OPENAI002")]` diagnostic tag.
   - This is a very early version of the convenience surface and thus subject to significant change
   - Documentation and samples will arrive soon; in the interim, see [the scenario test files](/tests/RealtimeConversation) for basic usage
   - You can also find an external sample employing this client, together with Azure OpenAI support, at https://github.com/Azure-Samples/aoai-realtime-audio-sdk/tree/main/dotnet/samples/console
@@ -441,7 +441,7 @@
 ### Breaking Changes
 
 - Renamed `AudioClient`'s `GenerateSpeechFromText` methods to simply `GenerateSpeech`. ([d84bf54](https://github.com/openai/openai-dotnet/commit/d84bf54df14ddac4c49f6efd61467b600d34ecd7))
-- Changed the type of `OpenAIFileInfo`'s `SizeInBytes` property from `long?` to `int?`. ([d84bf54](https://github.com/openai/openai-dotnet/commit/d84bf54df14ddac4c49f6efd61467b600d34ecd7))
+- Changed the type of `OpenAIFileInfo`'s `SizeInBytes` property from `long?` to `int?`. ([d84bf54](https://github.com/openai/openai-dotnet/commit/d84bf54df14ddac4c49f6efd61467b600d34ecd7))
 
 ### Bugs Fixed
 
```
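A note on the `[Experimental("OPENAI002")]` tag mentioned in the realtime entry above: an API carrying `[Experimental]` produces compiler diagnostic OPENAI002 at every call site until the caller opts in, either with a `#pragma` around the affected code or with `<NoWarn>OPENAI002</NoWarn>` in the project file. The sketch below demonstrates the mechanism with a stand-in method rather than the actual client, so the type and method names are illustrative only:

```csharp
using System.Diagnostics.CodeAnalysis;

public class ExperimentalOptInDemo
{
    // Stand-in for an experimental API such as the realtime client surface.
    [Experimental("OPENAI002")]
    public void RealtimeStyleApi() { }

    public void Caller()
    {
        // Without this pragma (or a project-wide NoWarn), the call below fails to compile.
#pragma warning disable OPENAI002
        RealtimeStyleApi();
#pragma warning restore OPENAI002
    }
}
```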

README.md

Lines changed: 40 additions & 0 deletions
````diff
@@ -15,6 +15,7 @@ It is generated from our [OpenAPI specification](https://github.com/openai/opena
 - [Namespace organization](#namespace-organization)
 - [Using the async API](#using-the-async-api)
 - [Using the `OpenAIClient` class](#using-the-openaiclient-class)
+- [How to use dependency injection](#how-to-use-dependency-injection)
 - [How to use chat completions with streaming](#how-to-use-chat-completions-with-streaming)
 - [How to use chat completions with tools and function calling](#how-to-use-chat-completions-with-tools-and-function-calling)
 - [How to use chat completions with structured outputs](#how-to-use-chat-completions-with-structured-outputs)
@@ -140,6 +141,45 @@ AudioClient ttsClient = client.GetAudioClient("tts-1");
 AudioClient whisperClient = client.GetAudioClient("whisper-1");
 ```
 
+## How to use dependency injection
+
+The OpenAI clients are **thread-safe** and can be safely registered as **singletons** in ASP.NET Core's Dependency Injection container. This maximizes resource efficiency and HTTP connection reuse.
+
+Register the `ChatClient` as a singleton in your `Program.cs`:
+
+```csharp
+builder.Services.AddSingleton<ChatClient>(serviceProvider =>
+{
+    var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY");
+
+    return new ChatClient("gpt-4o", apiKey); // ChatClient requires a model; "gpt-4o" is shown as an example
+});
+```
+
+Then inject and use the client in your controllers or services:
+
+```csharp
+[ApiController]
+[Route("api/[controller]")]
+public class ChatController : ControllerBase
+{
+    private readonly ChatClient _chatClient;
+
+    public ChatController(ChatClient chatClient)
+    {
+        _chatClient = chatClient;
+    }
+
+    [HttpPost("complete")]
+    public async Task<IActionResult> CompleteChat([FromBody] string message)
+    {
+        ChatCompletion completion = await _chatClient.CompleteChatAsync(message);
+
+        return Ok(new { response = completion.Content[0].Text });
+    }
+}
+```
+
 ## How to use chat completions with streaming
 
 When you request a chat completion, the default behavior is for the server to generate it in its entirety before sending it back in a single response. Consequently, long chat completions can require waiting for several seconds before hearing back from the server. To mitigate this, the OpenAI REST API supports the ability to stream partial results back as they are being generated, allowing you to start processing the beginning of the completion before it is finished.
````
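The full streaming walkthrough continues in the README beyond this hunk. For readers of this diff, a minimal sketch of consuming streamed updates with `CompleteChatStreamingAsync` (the model name and prompt are placeholders):

```csharp
using OpenAI.Chat;

ChatClient client = new(model: "gpt-4o", apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

// Partial results arrive as an async stream of updates; write each content fragment as it is generated.
await foreach (StreamingChatCompletionUpdate update in client.CompleteChatStreamingAsync("Say 'this is a test.'"))
{
    if (update.ContentUpdate.Count > 0)
    {
        Console.Write(update.ContentUpdate[0].Text);
    }
}
```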
