
Commit da6550a

Merge branch 'main' of https://github.com/MicrosoftDocs/azure-docs-pr into heidist-fix

2 parents: 0db034e + af52321

File tree: 43 files changed (+580 / -370 lines)


articles/ai-services/computer-vision/Tutorials/liveness.md

Lines changed: 10 additions & 10 deletions
````diff
@@ -37,7 +37,7 @@ The liveness solution integration involves two different components: a mobile app
 ### Integrate liveness into mobile application
 
 Once you have access to the SDK, follow instruction in the [azure-ai-vision-sdk](https://github.com/Azure-Samples/azure-ai-vision-sdk) GitHub repository to integrate the UI and the code into your native mobile application. The liveness SDK supports both Java/Kotlin for Android and Swift for iOS mobile applications:
-- For Swift iOS, follow the instructions in the [iOS sample](https://aka.ms/liveness-sample-ios)
+- For Swift iOS, follow the instructions in the [iOS sample](https://aka.ms/azure-ai-vision-face-liveness-client-sdk-ios-readme)
 - For Kotlin/Java Android, follow the instructions in the [Android sample](https://aka.ms/liveness-sample-java)
 
 Once you've added the code into your application, the SDK will handle starting the camera, guiding the end-user to adjust their position, composing the liveness payload, and calling the Azure AI Face cloud service to process the liveness payload.
@@ -54,7 +54,7 @@ The high-level steps involved in liveness orchestration are illustrated below:
 
 ```json
 Request:
-curl --location 'https://face-gating-livenessdetection.ppe.cognitiveservices.azure.com/face/v1.1-preview.1/detectliveness/singlemodal/sessions' \
+curl --location '<insert-api-endpoint>/face/v1.1-preview.1/detectliveness/singlemodal/sessions' \
 --header 'Ocp-Apim-Subscription-Key:<insert-api-key>
 --header 'Content-Type: application/json' \
 --data '{
@@ -93,7 +93,7 @@ The high-level steps involved in liveness orchestration are illustrated below:
 
 ```json
 Request:
-curl --location 'https://face-gating-livenessdetection.ppe.cognitiveservices.azure.com/face/v1.1-preview.1/detectliveness/singlemodal/sessions/a3dc62a3-49d5-45a1-886c-36e7df97499a' \
+curl --location '<insert-api-endpoint>/face/v1.1-preview.1/detectliveness/singlemodal/sessions/a3dc62a3-49d5-45a1-886c-36e7df97499a' \
 --header 'Ocp-Apim-Subscription-Key: <insert-api-key>
 
 Response:
@@ -178,13 +178,13 @@ The high-level steps involved in liveness with verification orchestration are il
 
 ```json
 Request:
-curl --location 'https://face-gating-livenessdetection.ppe.cognitiveservices.azure.com/face/v1.1-preview.1/detectlivenesswithverify/singlemodal/sessions' \
+curl --location '<insert-api-endpoint>/face/v1.1-preview.1/detectlivenesswithverify/singlemodal/sessions' \
 --header 'Ocp-Apim-Subscription-Key: <api_key>' \
 --form 'Parameters="{
 \"livenessOperationMode\": \"passive\",
 \"deviceCorrelationId\": \"723d6d03-ef33-40a8-9682-23a1feb7bccd\"
 }"' \
---form 'VerifyImage=@"/C:/Users/nabilat/Pictures/test.png"'
+--form 'VerifyImage=@"test.png"'
 
 Response:
 {
@@ -222,11 +222,11 @@ The high-level steps involved in liveness with verification orchestration are il
 
 ```json
 Request:
-curl --location 'https://face-gating-livenessdetection.ppe.cognitiveservices.azure.com/face/v1.1-preview.1/detectlivenesswithverify/singlemodal' \
+curl --location '<insert-api-endpoint>/face/v1.1-preview.1/detectlivenesswithverify/singlemodal' \
 --header 'Content-Type: multipart/form-data' \
 --header 'apim-recognition-model-preview-1904: true' \
 --header 'Authorization: Bearer.<session-authorization-token> \
---form 'Content=@"/D:/work/scratch/data/clips/webpapp6/video.webp"' \
+--form 'Content=@"video.webp"' \
 --form 'Metadata="<insert-metadata>"
 
 Response:
@@ -286,7 +286,7 @@ If you want to clean up and remove an Azure AI services subscription, you can de
 
 ## Next steps
 
-See the liveness SDK reference to learn about other options in the liveness APIs.
+See the Azure AI Vision SDK reference to learn about other options in the liveness APIs.
 
-- [Java (Android)](https://aka.ms/liveness-sdk-java)
-- [Swift (iOS)](https://aka.ms/liveness-sdk-ios)
+- [Kotlin (Android)](https://aka.ms/liveness-sample-java)
+- [Swift (iOS)](https://aka.ms/azure-ai-vision-face-liveness-client-sdk-ios-readme)
````
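
The requests above are shown as curl with a placeholder endpoint. As a companion, here is a minimal Go sketch of the same session-creation call an app server might make. It assumes the v1.1-preview.1 route, headers, and body parameters that appear in the diff; the `FACE_ENDPOINT`/`FACE_APIKEY` environment variable names and the raw response printing are illustrative, not part of the tutorial.

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"log"
	"net/http"
	"os"
)

func main() {
	// Illustrative variable names; substitute the endpoint and key of your own Face resource.
	endpoint := os.Getenv("FACE_ENDPOINT")
	apiKey := os.Getenv("FACE_APIKEY")

	// Body fields mirror the Parameters shown in the detectlivenesswithverify request above.
	body := []byte(`{"livenessOperationMode": "passive", "deviceCorrelationId": "723d6d03-ef33-40a8-9682-23a1feb7bccd"}`)

	req, err := http.NewRequest(http.MethodPost,
		endpoint+"/face/v1.1-preview.1/detectliveness/singlemodal/sessions",
		bytes.NewReader(body))
	if err != nil {
		log.Fatalf("ERROR: %s", err)
	}
	req.Header.Set("Ocp-Apim-Subscription-Key", apiKey)
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatalf("ERROR: %s", err)
	}
	defer resp.Body.Close()

	// Print the raw session-creation response as-is.
	raw, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatalf("ERROR: %s", err)
	}
	fmt.Printf("Status: %s\n%s\n", resp.Status, raw)
}
```

The same pattern applies to the `detectlivenesswithverify` session route shown further down in the diff.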

articles/ai-services/openai/includes/use-your-data-go.md

Lines changed: 79 additions & 84 deletions
````diff
@@ -36,91 +36,86 @@ ms.date: 08/29/2023
 package main
 
 import (
-    "context"
-    "fmt"
-    "log"
-    "os"
-
-    "github.com/Azure/azure-sdk-for-go/sdk/ai/azopenai"
-    "github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
+    "context"
+    "fmt"
+    "log"
+    "os"
+
+    "github.com/Azure/azure-sdk-for-go/sdk/ai/azopenai"
+    "github.com/Azure/azure-sdk-for-go/sdk/azcore"
+    "github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
 )
-
+
 func main() {
-    azureOpenAIKey := os.Getenv("AOAIKey")
-    modelDeploymentID := os.Getenv("AOAIDeploymentId")
-
-    // Ex: "https://<your-azure-openai-host>.openai.azure.com"
-    azureOpenAIEndpoint := os.Getenv("AOAIEndpoint")
-
-    // Azure AI Search configuration
-    searchIndex := os.Getenv("SearchIndex")
-    searchEndpoint := os.Getenv("SearchEndpoint")
-    searchAPIKey := os.Getenv("SearchKey")
-
-    if azureOpenAIKey == "" || modelDeploymentID == "" || azureOpenAIEndpoint == "" || searchIndex == "" || searchEndpoint == "" || searchAPIKey == "" {
-        fmt.Fprintf(os.Stderr, "Skipping example, environment variables missing\n")
-        return
-    }
-
-    keyCredential, err := azopenai.NewKeyCredential(azureOpenAIKey)
-
-    if err != nil {
-        // TODO: Update the following line with your application specific error handling logic
-        log.Fatalf("ERROR: %s", err)
-    }
-
-    // In Azure OpenAI you must deploy a model before you can use it in your client. For more information
-    // see here: https://learn.microsoft.com/azure/cognitive-services/openai/how-to/create-resource
-    client, err := azopenai.NewClientWithKeyCredential(azureOpenAIEndpoint, keyCredential, nil)
-
-    if err != nil {
-        // TODO: Update the following line with your application specific error handling logic
-        log.Fatalf("ERROR: %s", err)
-    }
-
-    resp, err := client.GetChatCompletions(context.TODO(), azopenai.ChatCompletionsOptions{
-        Messages: []azopenai.ChatMessage{
-            {Content: to.Ptr("What are the differences between Azure Machine Learning and Azure AI services?"), Role: to.Ptr(azopenai.ChatRoleUser)},
-        },
-        MaxTokens: to.Ptr[int32](512),
-        AzureExtensionsOptions: &azopenai.AzureChatExtensionOptions{
-            Extensions: []azopenai.AzureChatExtensionConfiguration{
-                {
-                    // This allows Azure OpenAI to use an Azure AI Search index.
-                    //
-                    // > Because the model has access to, and can reference specific sources to support its responses, answers are not only based on its pretrained knowledge
-                    // > but also on the latest information available in the designated data source. This grounding data also helps the model avoid generating responses
-                    // > based on outdated or incorrect information.
-                    //
-                    // Quote from here: https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/use-your-data
-                    Type: to.Ptr(azopenai.AzureChatExtensionTypeAzureCognitiveSearch),
-                    Parameters: azopenai.AzureCognitiveSearchChatExtensionConfiguration{
-                        Endpoint:  &searchEndpoint,
-                        IndexName: &searchIndex,
-                        Key:       &searchAPIKey,
-                    },
-                },
-            },
-        },
-        Deployment: modelDeploymentID,
-    }, nil)
-
-    if err != nil {
-        // TODO: Update the following line with your application specific error handling logic
-        log.Fatalf("ERROR: %s", err)
-    }
-
-    // Contains contextual information from your Azure chat completion extensions, configured above in `AzureExtensionsOptions`
-    msgContext := resp.Choices[0].Message.Context
-
-    fmt.Fprintf(os.Stderr, "Extensions Context Role: %s\nExtensions Context (length): %d\n",
-        *msgContext.Messages[0].Role,
-        len(*msgContext.Messages[0].Content))
-
-    fmt.Fprintf(os.Stderr, "ChatRole: %s\nChat content: %s\n",
-        *resp.Choices[0].Message.Role,
-        *resp.Choices[0].Message.Content,
-    )
+    azureOpenAIKey := os.Getenv("AOAIKey")
+    modelDeploymentID := os.Getenv("AOAIDeploymentId")
+
+    // Ex: "https://<your-azure-openai-host>.openai.azure.com"
+    azureOpenAIEndpoint := os.Getenv("AOAIEndpoint")
+
+    // Azure AI Search configuration
+    searchIndex := os.Getenv("SearchIndex")
+    searchEndpoint := os.Getenv("SearchEndpoint")
+    searchAPIKey := os.Getenv("SearchKey")
+
+    if azureOpenAIKey == "" || modelDeploymentID == "" || azureOpenAIEndpoint == "" || searchIndex == "" || searchEndpoint == "" || searchAPIKey == "" {
+        fmt.Fprintf(os.Stderr, "Skipping example, environment variables missing\n")
+        return
+    }
+
+    keyCredential := azcore.NewKeyCredential(azureOpenAIKey)
+
+    // In Azure OpenAI you must deploy a model before you can use it in your client. For more information
+    // see here: https://learn.microsoft.com/azure/cognitive-services/openai/how-to/create-resource
+    client, err := azopenai.NewClientWithKeyCredential(azureOpenAIEndpoint, keyCredential, nil)
+
+    if err != nil {
+        // TODO: Update the following line with your application specific error handling logic
+        log.Fatalf("ERROR: %s", err)
+    }
+
+    resp, err := client.GetChatCompletions(context.TODO(), azopenai.ChatCompletionsOptions{
+        Messages: []azopenai.ChatRequestMessageClassification{
+            &azopenai.ChatRequestUserMessage{Content: azopenai.NewChatRequestUserMessageContent("What are the differences between Azure Machine Learning and Azure AI services?")},
+        },
+        MaxTokens: to.Ptr[int32](512),
+        AzureExtensionsOptions: []azopenai.AzureChatExtensionConfigurationClassification{
+            &azopenai.AzureCognitiveSearchChatExtensionConfiguration{
+                // This allows Azure OpenAI to use an Azure AI Search index.
+                //
+                // > Because the model has access to, and can reference specific sources to support its responses, answers are not only based on its pretrained knowledge
+                // > but also on the latest information available in the designated data source. This grounding data also helps the model avoid generating responses
+                // > based on outdated or incorrect information.
+                //
+                // Quote from here: https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/use-your-data
+                Parameters: &azopenai.AzureCognitiveSearchChatExtensionParameters{
+                    Endpoint:  &searchEndpoint,
+                    IndexName: &searchIndex,
+                    Authentication: &azopenai.OnYourDataAPIKeyAuthenticationOptions{
+                        Key: &searchAPIKey,
+                    },
+                },
+            },
+        },
+        DeploymentName: &modelDeploymentID,
+    }, nil)
+
+    if err != nil {
+        // TODO: Update the following line with your application specific error handling logic
+        log.Fatalf("ERROR: %s", err)
+    }
+
+    // Contains contextual information from your Azure chat completion extensions, configured above in `AzureExtensionsOptions`
+    msgContext := resp.Choices[0].Message.Context
+
+    fmt.Fprintf(os.Stderr, "Extensions Context Role: %s\nExtensions Context (length): %d\n",
+        *msgContext.Messages[0].Role,
+        len(*msgContext.Messages[0].Content))
+
+    fmt.Fprintf(os.Stderr, "ChatRole: %s\nChat content: %s\n",
+        *resp.Choices[0].Message.Role,
+        *resp.Choices[0].Message.Content,
+    )
 }
 ```
 
@@ -136,4 +131,4 @@ ms.date: 08/29/2023
 The application prints the response including both answers to your query and citations from your uploaded files.
 
 > [!div class="nextstepaction"]
-> [I ran into an issue when running the code samples.](https://microsoft.qualtrics.com/jfe/form/SV_0Cl5zkG3CnDjq6O?PLanguage=dotnet&Pillar=AOAI&Product=ownData&Page=quickstart&Section=Create-dotnet-application)
+> [I ran into an issue when running the code samples.](https://microsoft.qualtrics.com/jfe/form/SV_0Cl5zkG3CnDjq6O?PLanguage=dotnet&Pillar=AOAI&Product=ownData&Page=quickstart&Section=Create-dotnet-application)
````
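
Beyond re-indentation, the hunk above changes two things in the Go sample: how the key credential is constructed and how the Azure AI Search data source is declared. The following is a minimal, compilable sketch isolating just those two deltas, using only constructor, type, and field names taken from the added lines; the standalone `extensions` variable and the discarded values are illustrative scaffolding, not part of the sample.

```go
package main

import (
	"fmt"
	"os"

	"github.com/Azure/azure-sdk-for-go/sdk/ai/azopenai"
	"github.com/Azure/azure-sdk-for-go/sdk/azcore"
)

func main() {
	// Environment variable names match the sample above.
	azureOpenAIKey := os.Getenv("AOAIKey")
	azureOpenAIEndpoint := os.Getenv("AOAIEndpoint")
	searchEndpoint := os.Getenv("SearchEndpoint")
	searchIndex := os.Getenv("SearchIndex")
	searchAPIKey := os.Getenv("SearchKey")

	// Old: keyCredential, err := azopenai.NewKeyCredential(azureOpenAIKey)
	// New: the credential comes from azcore and no longer returns an error.
	keyCredential := azcore.NewKeyCredential(azureOpenAIKey)

	client, err := azopenai.NewClientWithKeyCredential(azureOpenAIEndpoint, keyCredential, nil)
	if err != nil {
		fmt.Fprintf(os.Stderr, "ERROR: %s\n", err)
		return
	}
	_ = client

	// Old: AzureExtensionsOptions took a *azopenai.AzureChatExtensionOptions with an
	// Extensions slice and an explicit Type field. New: a slice of typed configurations,
	// with the search key wrapped in OnYourDataAPIKeyAuthenticationOptions.
	extensions := []azopenai.AzureChatExtensionConfigurationClassification{
		&azopenai.AzureCognitiveSearchChatExtensionConfiguration{
			Parameters: &azopenai.AzureCognitiveSearchChatExtensionParameters{
				Endpoint:  &searchEndpoint,
				IndexName: &searchIndex,
				Authentication: &azopenai.OnYourDataAPIKeyAuthenticationOptions{
					Key: &searchAPIKey,
				},
			},
		},
	}
	_ = extensions
}
```

The diff also shows the message type moving from `azopenai.ChatMessage` to the typed `azopenai.ChatRequestUserMessage`, and `Deployment` becoming the pointer field `DeploymentName`.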
Lines changed: 31 additions & 0 deletions
````diff
@@ -0,0 +1,31 @@
+---
+title: Azure OpenAI Service supported programming languages
+titleSuffix: Azure AI services
+description: Programming language support for Azure OpenAI.
+author: mrbullwinkle
+manager: nitinme
+ms.service: azure-ai-openai
+ms.custom:
+ms.topic: conceptual
+ms.date: 12/18/2023
+ms.author: mbullwin
+---
+
+# Azure OpenAI supported programming languages
+
+Azure OpenAI supports the following programming languages.
+
+## Programming languages
+
+| Language | Source code | Package | Examples |
+|------------|---------|-----|-------|
+| C# | [Source code](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/openai/Azure.AI.OpenAI/src) | [Package (NuGet)](https://www.nuget.org/packages/Azure.AI.OpenAI/) | [C# examples](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/openai/Azure.AI.OpenAI/tests/Samples) |
+| Go | [Source code](https://github.com/Azure/azure-sdk-for-go/tree/main/sdk/ai/azopenai) | [Package (Go)](https://pkg.go.dev/github.com/Azure/azure-sdk-for-go/sdk/ai/azopenai) | [Go examples](https://pkg.go.dev/github.com/Azure/azure-sdk-for-go/sdk/ai/azopenai#pkg-examples) |
+| Java | [Source code](https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/openai/azure-ai-openai) | [Artifact (Maven)](https://central.sonatype.com/artifact/com.azure/azure-ai-openai/) | [Java examples](https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/openai/azure-ai-openai/src/samples) |
+| JavaScript | [Source code](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/openai/openai) | [Package (npm)](https://www.npmjs.com/package/@azure/openai) | [JavaScript examples](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/openai/Azure.AI.OpenAI/tests/Samples) |
+| Python | [Source code](https://github.com/openai/openai-python) | [Package (PyPi)](https://pypi.org/project/openai/) | [Python examples](./how-to/switching-endpoints.md) |
+
+## Next steps
+
+- Explore each programming language in our step-by-step [quickstarts](./chatgpt-quickstart.md)
+- To see what models are currently supported, check out the [Azure OpenAI models page](./concepts/models.md)
````

articles/ai-services/openai/toc.yml

Lines changed: 2 additions & 0 deletions
````diff
@@ -16,6 +16,8 @@ items:
   href: https://azure.microsoft.com/pricing/details/cognitive-services/openai-service/
 - name: What's new
   href: whats-new.md
+- name: Programming languages/SDKs
+  href: ./supported-languages.md
 - name: Azure OpenAI FAQ
   href: faq.yml
 - name: Quickstarts
````

articles/azure-functions/functions-reference-python.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -973,7 +973,7 @@ pip install -r requirements.txt
 
 When running your functions in an [App Service plan](./dedicated-plan.md), dependencies that you define in requirements.txt are given precedence over built-in Python modules, such as `logging`. This precedence can cause conflicts when built-in modules have the same names as directories in your code. When running in a [Consumption plan](./consumption-plan.md) or an [Elastic Premium plan](./functions-premium-plan.md), conflicts are less likely because your dependencies aren't prioritized by default.
 
-To prevent issues running in an App Service plan, don't name your directories the same as any Python native modules and don't including Python native libraries in your project's requirements.txt file.
+To prevent issues running in an App Service plan, don't name your directories the same as any Python native modules and don't include Python native libraries in your project's requirements.txt file.
 
 ## Publishing to Azure
 
````