
Commit 050bf6d

Merge pull request #5411 from samuel100/samuel100/sdk-add-languages

init additions on rust and c# sdks

2 parents d7511e2 + f56d50e

File tree

8 files changed: +623 −5 lines

articles/ai-foundry/foundry-local/how-to/how-to-integrate-with-inference-sdks.md

Lines changed: 6 additions & 0 deletions

```diff
@@ -28,6 +28,12 @@ Foundry Local integrates with various inferencing SDKs - such as OpenAI, Azure O
 ::: zone pivot="programming-language-javascript"
 [!INCLUDE [JavaScript](../includes/integrate-examples/javascript.md)]
 ::: zone-end
+::: zone pivot="programming-language-csharp"
+[!INCLUDE [C#](../includes/integrate-examples/csharp.md)]
+::: zone-end
+::: zone pivot="programming-language-rust"
+[!INCLUDE [Rust](../includes/integrate-examples/rust.md)]
+::: zone-end
 
 ## Next steps
 
```
articles/ai-foundry/foundry-local/how-to/how-to-use-langchain-with-foundry-local.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -11,7 +11,7 @@ ms.reviewer: eneros
 ms.author: eneros
 author: eneros
 ms.custom: build-2025
-zone_pivot_groups: foundry-local-sdk
+zone_pivot_groups: foundry-local-langchain
 #customer intent: As a developer, I want to get started with Foundry Local so that I can run AI models locally.
 ---
```

Lines changed: 70 additions & 0 deletions

---
ms.service: azure-ai-foundry
ms.topic: include
ms.date: 05/02/2025
ms.author: samkemp
author: samuel100
---

## Create project

Create a new C# project and navigate into it:

```bash
dotnet new console -n hello-foundry-local
cd hello-foundry-local
```

### Install NuGet packages

Install the following NuGet packages into your project folder:

```bash
dotnet add package Microsoft.AI.Foundry.Local --version 0.1.0
dotnet add package OpenAI --version 2.2.0-beta.4
```

## Use OpenAI SDK with Foundry Local

The following example demonstrates how to use the OpenAI SDK with Foundry Local. The code starts the Foundry Local service, loads a model, and streams a response using the OpenAI SDK.

Copy and paste the following code into a C# file named `Program.cs`:

```csharp
using Microsoft.AI.Foundry.Local;
using OpenAI;
using OpenAI.Chat;
using System.ClientModel;

var alias = "phi-3.5-mini";

// Start the Foundry Local service and load the model by alias
var manager = await FoundryLocalManager.StartModelAsync(aliasOrModelId: alias);

var model = await manager.GetModelInfoAsync(aliasOrModelId: alias);
ApiKeyCredential key = new ApiKeyCredential(manager.ApiKey);
OpenAIClient client = new OpenAIClient(key, new OpenAIClientOptions
{
    Endpoint = manager.Endpoint
});

var chatClient = client.GetChatClient(model?.ModelId);

var completionUpdates = chatClient.CompleteChatStreaming("Why is the sky blue?");

Console.Write("[ASSISTANT]: ");
foreach (var completionUpdate in completionUpdates)
{
    if (completionUpdate.ContentUpdate.Count > 0)
    {
        Console.Write(completionUpdate.ContentUpdate[0].Text);
    }
}
```

Run the code using the following command:

```bash
dotnet run
```
Lines changed: 72 additions & 0 deletions

---
ms.service: azure-ai-foundry
ms.topic: include
ms.date: 05/02/2025
ms.author: samkemp
author: samuel100
---

## Create project

Create a new Rust project and navigate into it:

```bash
cargo new hello-foundry-local
cd hello-foundry-local
```

### Install crates

Install the following Rust crates using Cargo:

```bash
cargo add foundry-local anyhow env_logger serde_json
cargo add reqwest --features json
cargo add tokio --features full
```
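The `cargo add` commands above record these crates as dependencies in `Cargo.toml`. A sketch of the resulting section is below; the version numbers are illustrative placeholders, not versions pinned by this article, since `cargo add` picks the latest compatible release at install time:

```toml
[dependencies]
foundry-local = "0.1"
anyhow = "1"
env_logger = "0.11"
serde_json = "1"
reqwest = { version = "0.12", features = ["json"] }
tokio = { version = "1", features = ["full"] }
```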
## Update the `main.rs` file

The following example demonstrates how to run inference by sending a request to the Foundry Local service. The code initializes the Foundry Local service, loads a model, and generates a response using the `reqwest` library.

Copy and paste the following code into the Rust file named `main.rs`:

```rust
use foundry_local::FoundryLocalManager;
use anyhow::Result;

#[tokio::main]
async fn main() -> Result<()> {
    // Create a FoundryLocalManager instance with default options
    let mut manager = FoundryLocalManager::builder()
        .alias_or_model_id("qwen2.5-0.5b") // Specify the model to use
        .bootstrap(true) // Start the service if not running
        .build()
        .await?;

    // Use the OpenAI compatible API to interact with the model
    let client = reqwest::Client::new();
    let endpoint = manager.endpoint()?;
    let response = client.post(format!("{}/chat/completions", endpoint))
        .header("Content-Type", "application/json")
        .header("Authorization", format!("Bearer {}", manager.api_key()))
        .json(&serde_json::json!({
            "model": manager.get_model_info("qwen2.5-0.5b", true).await?.id,
            "messages": [{"role": "user", "content": "What is the golden ratio?"}],
        }))
        .send()
        .await?;

    let result = response.json::<serde_json::Value>().await?;
    println!("{}", result["choices"][0]["message"]["content"]);

    Ok(())
}
```

Run the code using the following command:

```bash
cargo run
```
