
Commit ace84a3

go use data quickstart

1 parent e91e47b commit ace84a3

File tree: 1 file changed, +172 -37 lines changed

articles/ai-services/openai/includes/use-your-data-go.md

Lines changed: 172 additions & 37 deletions

ms.topic: include
ms.date: 01/17/2025
---

### Microsoft Entra ID prerequisites

For the recommended keyless authentication with Microsoft Entra ID, you need to:

- Install the [Azure CLI](/cli/azure/install-azure-cli) used for keyless authentication with Microsoft Entra ID.
- Assign the `Cognitive Services User` role to your user account. You can assign roles in the Azure portal under **Access control (IAM)** > **Add role assignment**.

## Set up

1. Create a new folder `dall-e-quickstart` and go to the quickstart folder with the following command:

    ```shell
    mkdir dall-e-quickstart && cd dall-e-quickstart
    ```

1. For the **recommended** keyless authentication with Microsoft Entra ID, sign in to Azure with the following command:

    ```console
    az login
    ```
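
Optionally, before writing any Azure OpenAI code, you can confirm that the sign-in is visible to `DefaultAzureCredential`. The following minimal sketch isn't part of the quickstart: it only checks that a token can be acquired for the Azure AI services scope (`https://cognitiveservices.azure.com/.default`) and doesn't validate role assignments.

```go
package main

import (
    "context"
    "fmt"
    "log"

    "github.com/Azure/azure-sdk-for-go/sdk/azcore/policy"
    "github.com/Azure/azure-sdk-for-go/sdk/azidentity"
)

func main() {
    // DefaultAzureCredential picks up the Azure CLI sign-in from `az login`.
    credential, err := azidentity.NewDefaultAzureCredential(nil)
    if err != nil {
        log.Fatalf("ERROR: %s", err)
    }

    // Request a token for the Azure AI services scope to confirm that
    // authentication works. This doesn't check role assignments.
    token, err := credential.GetToken(context.TODO(), policy.TokenRequestOptions{
        Scopes: []string{"https://cognitiveservices.azure.com/.default"},
    })
    if err != nil {
        log.Fatalf("ERROR: %s", err)
    }

    fmt.Printf("Token acquired, expires on %s\n", token.ExpiresOn)
}
```
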
[!INCLUDE [Set up required variables](./use-your-data-common-variables.md)]

## Run the quickstart

The sample code in this quickstart uses Microsoft Entra ID for the recommended keyless authentication. If you prefer to use an API key, you can replace the `NewDefaultAzureCredential` implementation with `NewKeyCredential`.

#### [Microsoft Entra ID](#tab/keyless)

```go
azureOpenAIEndpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")
credential, err := azidentity.NewDefaultAzureCredential(nil)
client, err := azopenai.NewClient(azureOpenAIEndpoint, credential, nil)
```

#### [API key](#tab/api-key)

```go
azureOpenAIEndpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")
azureOpenAIKey := os.Getenv("AZURE_OPENAI_API_KEY")
credential := azcore.NewKeyCredential(azureOpenAIKey)
client, err := azopenai.NewClientWithKeyCredential(azureOpenAIEndpoint, credential, nil)
```

---
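
Both snippets produce the same `*azopenai.Client`. If you want a single program that works with either form of authentication, a sketch like the following (not part of the quickstart) picks the credential at runtime based on whether `AZURE_OPENAI_API_KEY` is set. The `newClient` helper name is illustrative; the SDK calls are the same ones shown in the tabs above.

```go
package main

import (
    "log"
    "os"

    "github.com/Azure/azure-sdk-for-go/sdk/ai/azopenai"
    "github.com/Azure/azure-sdk-for-go/sdk/azcore"
    "github.com/Azure/azure-sdk-for-go/sdk/azidentity"
)

// newClient uses the API key when AZURE_OPENAI_API_KEY is set and falls back
// to keyless Microsoft Entra ID authentication otherwise.
func newClient(endpoint string) (*azopenai.Client, error) {
    if key := os.Getenv("AZURE_OPENAI_API_KEY"); key != "" {
        return azopenai.NewClientWithKeyCredential(endpoint, azcore.NewKeyCredential(key), nil)
    }

    credential, err := azidentity.NewDefaultAzureCredential(nil)
    if err != nil {
        return nil, err
    }
    return azopenai.NewClient(endpoint, credential, nil)
}

func main() {
    client, err := newClient(os.Getenv("AZURE_OPENAI_ENDPOINT"))
    if err != nil {
        log.Fatalf("ERROR: %s", err)
    }
    _ = client // Use the client as shown in the samples below.
}
```
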
#### [Microsoft Entra ID](#tab/keyless)

To run the sample:

1. Create a new file named *quickstart.go* and copy the following code into it:

    ```golang
    package main

    import (
        "context"
        "fmt"
        "log"
        "os"

        "github.com/Azure/azure-sdk-for-go/sdk/ai/azopenai"
        "github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
        "github.com/Azure/azure-sdk-for-go/sdk/azidentity"
    )

    func main() {
        azureOpenAIEndpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")

        // Keyless authentication with Microsoft Entra ID.
        credential, err := azidentity.NewDefaultAzureCredential(nil)

        if err != nil {
            // Implement application specific error handling logic.
            log.Printf("ERROR: %s", err)
            return
        }

        client, err := azopenai.NewClient(azureOpenAIEndpoint, credential, nil)

        if err != nil {
            // Implement application specific error handling logic.
            log.Printf("ERROR: %s", err)
            return
        }

        modelDeploymentID := os.Getenv("AZURE_OPENAI_DEPLOYMENT_NAME")

        // Azure AI Search configuration
        searchIndex := os.Getenv("AZURE_AI_SEARCH_INDEX")
        searchEndpoint := os.Getenv("AZURE_AI_SEARCH_ENDPOINT")
        searchAPIKey := os.Getenv("AZURE_AI_SEARCH_API_KEY")

        if modelDeploymentID == "" || azureOpenAIEndpoint == "" || searchIndex == "" || searchEndpoint == "" || searchAPIKey == "" {
            fmt.Fprintf(os.Stderr, "Skipping example, environment variables missing\n")
            return
        }

        resp, err := client.GetChatCompletions(context.TODO(), azopenai.ChatCompletionsOptions{
            Messages: []azopenai.ChatRequestMessageClassification{
                &azopenai.ChatRequestUserMessage{Content: azopenai.NewChatRequestUserMessageContent("What are my available health plans?")},
            },
            MaxTokens: to.Ptr[int32](512),
            AzureExtensionsOptions: []azopenai.AzureChatExtensionConfigurationClassification{
                &azopenai.AzureSearchChatExtensionConfiguration{
                    // This allows Azure OpenAI to use an Azure AI Search index.
                    // Answers are based on the model's pretrained knowledge
                    // and the latest information available in the designated data source.
                    Parameters: &azopenai.AzureSearchChatExtensionParameters{
                        Endpoint:  &searchEndpoint,
                        IndexName: &searchIndex,
                        Authentication: &azopenai.OnYourDataAPIKeyAuthenticationOptions{
                            Key: &searchAPIKey,
                        },
                    },
                },
            },
            DeploymentName: &modelDeploymentID,
        }, nil)

        if err != nil {
            // Implement application specific error handling logic.
            log.Printf("ERROR: %s", err)
            return
        }

        fmt.Fprintf(os.Stderr, "Extensions Context Role: %s\nExtensions Context (length): %d\n",
            *resp.Choices[0].Message.Role,
            len(*resp.Choices[0].Message.Content))

        fmt.Fprintf(os.Stderr, "ChatRole: %s\nChat content: %s\n",
            *resp.Choices[0].Message.Role,
            *resp.Choices[0].Message.Content,
        )
    }
    ```

1. Run the following command to create a new Go module:

    ```shell
    go mod init quickstart.go
    ```

1. Run `go mod tidy` to install the required dependencies:

    ```cmd
    go mod tidy
    ```

1. Run the following command to run the sample:

    ```shell
    go run quickstart.go
    ```

#### [API key](#tab/api-key)

To run the sample:

1. Create a new file named *quickstart.go* and copy the following code into it:

    ```golang
    package main

    import (
        "context"
        "fmt"
        "log"
        "os"

        "github.com/Azure/azure-sdk-for-go/sdk/ai/azopenai"
        "github.com/Azure/azure-sdk-for-go/sdk/azcore"
        "github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
    )

    func main() {
        azureOpenAIEndpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")
        azureOpenAIKey := os.Getenv("AZURE_OPENAI_API_KEY")
        modelDeploymentID := os.Getenv("AZURE_OPENAI_DEPLOYMENT_NAME")

        // Azure AI Search configuration
        searchIndex := os.Getenv("AZURE_AI_SEARCH_INDEX")
        searchEndpoint := os.Getenv("AZURE_AI_SEARCH_ENDPOINT")
        searchAPIKey := os.Getenv("AZURE_AI_SEARCH_API_KEY")

        if azureOpenAIKey == "" || modelDeploymentID == "" || azureOpenAIEndpoint == "" || searchIndex == "" || searchEndpoint == "" || searchAPIKey == "" {
            fmt.Fprintf(os.Stderr, "Skipping example, environment variables missing\n")
            return
        }

        credential := azcore.NewKeyCredential(azureOpenAIKey)

        client, err := azopenai.NewClientWithKeyCredential(azureOpenAIEndpoint, credential, nil)

        if err != nil {
            // Implement application specific error handling logic.
            log.Printf("ERROR: %s", err)
            return
        }

        resp, err := client.GetChatCompletions(context.TODO(), azopenai.ChatCompletionsOptions{
            Messages: []azopenai.ChatRequestMessageClassification{
                &azopenai.ChatRequestUserMessage{Content: azopenai.NewChatRequestUserMessageContent("What are my available health plans?")},
            },
            MaxTokens: to.Ptr[int32](512),
            AzureExtensionsOptions: []azopenai.AzureChatExtensionConfigurationClassification{
                &azopenai.AzureSearchChatExtensionConfiguration{
                    // This allows Azure OpenAI to use an Azure AI Search index.
                    // Answers are based on the model's pretrained knowledge
                    // and the latest information available in the designated data source.
                    Parameters: &azopenai.AzureSearchChatExtensionParameters{
                        Endpoint:  &searchEndpoint,
                        IndexName: &searchIndex,
                        Authentication: &azopenai.OnYourDataAPIKeyAuthenticationOptions{
                            Key: &searchAPIKey,
                        },
                    },
                },
            },
            DeploymentName: &modelDeploymentID,
        }, nil)

        if err != nil {
            // Implement application specific error handling logic.
            log.Printf("ERROR: %s", err)
            return
        }

        fmt.Fprintf(os.Stderr, "Extensions Context Role: %s\nExtensions Context (length): %d\n",
            *resp.Choices[0].Message.Role,
            len(*resp.Choices[0].Message.Content))

        fmt.Fprintf(os.Stderr, "ChatRole: %s\nChat content: %s\n",
            *resp.Choices[0].Message.Role,
            *resp.Choices[0].Message.Content,
        )
    }
    ```

1. Run the following command to create a new Go module:

    ```shell
    go mod init quickstart.go
    ```

1. Run `go mod tidy` to install the required dependencies:

    ```cmd
    go mod tidy
    ```

1. Run the following command to run the sample:

    ```shell
    go run quickstart.go
    ```

---

The application prints the response, including both the answer to your query and citations from your uploaded files.
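
If you want to reuse the request with different questions, you can wrap the sample's `GetChatCompletions` call in a small helper. The sketch below is a hypothetical refactoring of the keyless sample above, not part of the quickstart: `askYourData` is an illustrative name, and the question is read from the command line when one is provided.

```go
package main

import (
    "context"
    "fmt"
    "log"
    "os"

    "github.com/Azure/azure-sdk-for-go/sdk/ai/azopenai"
    "github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
    "github.com/Azure/azure-sdk-for-go/sdk/azidentity"
)

// askYourData sends one question through the same Azure AI Search extension
// configuration used by the quickstart and returns the model's answer.
func askYourData(ctx context.Context, client *azopenai.Client, deployment, searchEndpoint, searchIndex, searchAPIKey, question string) (string, error) {
    resp, err := client.GetChatCompletions(ctx, azopenai.ChatCompletionsOptions{
        Messages: []azopenai.ChatRequestMessageClassification{
            &azopenai.ChatRequestUserMessage{Content: azopenai.NewChatRequestUserMessageContent(question)},
        },
        MaxTokens: to.Ptr[int32](512),
        AzureExtensionsOptions: []azopenai.AzureChatExtensionConfigurationClassification{
            &azopenai.AzureSearchChatExtensionConfiguration{
                Parameters: &azopenai.AzureSearchChatExtensionParameters{
                    Endpoint:  &searchEndpoint,
                    IndexName: &searchIndex,
                    Authentication: &azopenai.OnYourDataAPIKeyAuthenticationOptions{
                        Key: &searchAPIKey,
                    },
                },
            },
        },
        DeploymentName: &deployment,
    }, nil)
    if err != nil {
        return "", err
    }
    return *resp.Choices[0].Message.Content, nil
}

func main() {
    credential, err := azidentity.NewDefaultAzureCredential(nil)
    if err != nil {
        log.Fatalf("ERROR: %s", err)
    }

    client, err := azopenai.NewClient(os.Getenv("AZURE_OPENAI_ENDPOINT"), credential, nil)
    if err != nil {
        log.Fatalf("ERROR: %s", err)
    }

    question := "What are my available health plans?"
    if len(os.Args) > 1 {
        question = os.Args[1]
    }

    answer, err := askYourData(context.TODO(), client,
        os.Getenv("AZURE_OPENAI_DEPLOYMENT_NAME"),
        os.Getenv("AZURE_AI_SEARCH_ENDPOINT"),
        os.Getenv("AZURE_AI_SEARCH_INDEX"),
        os.Getenv("AZURE_AI_SEARCH_API_KEY"),
        question)
    if err != nil {
        log.Fatalf("ERROR: %s", err)
    }

    fmt.Println(answer)
}
```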
