Commit 0418c15

Merge pull request #229628 from eric-urban/eur/intent-recognition

clu updates

2 parents c9f45f7 + fc920bd

4 files changed: +103 −23 lines
articles/cognitive-services/Speech-Service/includes/quickstarts/intent-recognition-clu/cpp.md

Lines changed: 50 additions & 10 deletions

@@ -94,10 +94,6 @@ Follow these steps to create a new console application and install the Speech SD
     {
         std::cout << "RECOGNIZED: Text=" << result->Text << std::endl;
         std::cout << "    Intent Id: " << result->IntentId << std::endl;
-
-        // There is a known issue with the LanguageUnderstandingServiceResponse_JsonResult
-        // property when used with CLU in the Speech SDK version 1.25.
-        // The following should return JSON in a future release.
         std::cout << "    Intent Service JSON: " << result->Properties.GetProperty(PropertyId::LanguageUnderstandingServiceResponse_JsonResult) << std::endl;
     }
     else if (result->Reason == ResultReason::RecognizedSpeech)
@@ -152,14 +148,58 @@ Follow these steps to create a new console application and install the Speech SD
 Speak into your microphone when prompted. What you speak should be output as text:
 
 ```console
-Say something ...
-RECOGNIZED: Text=Go ahead and delete the e-mail.
-    Intent Id: Delete.
-    Language Understanding JSON:
+Speak into your microphone.
+RECOGNIZED: Text=Turn on the lights.
+    Intent Id: HomeAutomation.TurnOn.
+    Language Understanding JSON: {"kind":"ConversationResult","result":{"query":"turn on the lights","prediction":{"topIntent":"HomeAutomation.TurnOn","projectKind":"Conversation","intents":[{"category":"HomeAutomation.TurnOn","confidenceScore":0.97712576},{"category":"HomeAutomation.TurnOff","confidenceScore":0.8431633},{"category":"None","confidenceScore":0.782861}],"entities":[{"category":"HomeAutomation.DeviceType","text":"lights","offset":12,"length":6,"confidenceScore":1,"extraInformation":[{"extraInformationKind":"ListKey","key":"light"}]}]}}}.
 ```
 
-> [NOTE]
-> There is a known issue with the LanguageUnderstandingServiceResponse_JsonResult property when used with CLU in the Speech SDK version 1.25. You can get detailed JSON output in a future release. Via JSON, the intents are returned in the probability order of most likely to least likely. For example, the `topIntent` might be `Delete` with a confidence score of 0.95413816 (95.41%). The second most likely intent might be `Cancel` with a confidence score of 0.8985081 (89.85%).
+> [!NOTE]
+> Support for the JSON response for CLU via the LanguageUnderstandingServiceResponse_JsonResult property was added in the Speech SDK version 1.26.
+
+The intents are returned in the probability order of most likely to least likely. Here's a formatted version of the JSON output, where the `topIntent` is `HomeAutomation.TurnOn` with a confidence score of 0.97712576 (97.71%). The second most likely intent is `HomeAutomation.TurnOff` with a confidence score of 0.8431633 (84.32%).
+
+```json
+{
+    "kind": "ConversationResult",
+    "result": {
+        "query": "turn on the lights",
+        "prediction": {
+            "topIntent": "HomeAutomation.TurnOn",
+            "projectKind": "Conversation",
+            "intents": [
+                {
+                    "category": "HomeAutomation.TurnOn",
+                    "confidenceScore": 0.97712576
+                },
+                {
+                    "category": "HomeAutomation.TurnOff",
+                    "confidenceScore": 0.8431633
+                },
+                {
+                    "category": "None",
+                    "confidenceScore": 0.782861
+                }
+            ],
+            "entities": [
+                {
+                    "category": "HomeAutomation.DeviceType",
+                    "text": "lights",
+                    "offset": 12,
+                    "length": 6,
+                    "confidenceScore": 1,
+                    "extraInformation": [
+                        {
+                            "extraInformationKind": "ListKey",
+                            "key": "light"
+                        }
+                    ]
+                }
+            ]
+        }
+    }
+}
+```
 
 ## Remarks
 Now that you've completed the quickstart, here are some additional considerations:

articles/cognitive-services/Speech-Service/includes/quickstarts/intent-recognition-clu/csharp.md

Lines changed: 50 additions & 10 deletions

@@ -89,10 +89,6 @@ Follow these steps to create a new console application and install the Speech SD
     {
         Console.WriteLine($"RECOGNIZED: Text={recognitionResult.Text}");
         Console.WriteLine($"    Intent Id: {recognitionResult.IntentId}.");
-
-        // There is a known issue with the LanguageUnderstandingServiceResponse_JsonResult
-        // property when used with CLU in the Speech SDK version 1.25.
-        // The following should return JSON in a future release.
         Console.WriteLine($"    Language Understanding JSON: {recognitionResult.Properties.GetProperty(PropertyId.LanguageUnderstandingServiceResponse_JsonResult)}.");
     }
     else if (recognitionResult.Reason == ResultReason.RecognizedSpeech)
@@ -136,14 +132,58 @@ dotnet run
 Speak into your microphone when prompted. What you speak should be output as text:
 
 ```console
-Say something ...
-RECOGNIZED: Text=Go ahead and delete the e-mail.
-    Intent Id: Delete.
-    Language Understanding JSON:
+Speak into your microphone.
+RECOGNIZED: Text=Turn on the lights.
+    Intent Id: HomeAutomation.TurnOn.
+    Language Understanding JSON: {"kind":"ConversationResult","result":{"query":"turn on the lights","prediction":{"topIntent":"HomeAutomation.TurnOn","projectKind":"Conversation","intents":[{"category":"HomeAutomation.TurnOn","confidenceScore":0.97712576},{"category":"HomeAutomation.TurnOff","confidenceScore":0.8431633},{"category":"None","confidenceScore":0.782861}],"entities":[{"category":"HomeAutomation.DeviceType","text":"lights","offset":12,"length":6,"confidenceScore":1,"extraInformation":[{"extraInformationKind":"ListKey","key":"light"}]}]}}}.
 ```
 
-> [NOTE]
-> There is a known issue with the LanguageUnderstandingServiceResponse_JsonResult property when used with CLU in the Speech SDK version 1.25. You can get detailed JSON output in a future release. Via JSON, the intents are returned in the probability order of most likely to least likely. For example, the `topIntent` might be `Delete` with a confidence score of 0.95413816 (95.41%). The second most likely intent might be `Cancel` with a confidence score of 0.8985081 (89.85%).
+> [!NOTE]
+> Support for the JSON response for CLU via the LanguageUnderstandingServiceResponse_JsonResult property was added in the Speech SDK version 1.26.
+
+The intents are returned in the probability order of most likely to least likely. Here's a formatted version of the JSON output, where the `topIntent` is `HomeAutomation.TurnOn` with a confidence score of 0.97712576 (97.71%). The second most likely intent is `HomeAutomation.TurnOff` with a confidence score of 0.8431633 (84.32%).
+
+```json
+{
+    "kind": "ConversationResult",
+    "result": {
+        "query": "turn on the lights",
+        "prediction": {
+            "topIntent": "HomeAutomation.TurnOn",
+            "projectKind": "Conversation",
+            "intents": [
+                {
+                    "category": "HomeAutomation.TurnOn",
+                    "confidenceScore": 0.97712576
+                },
+                {
+                    "category": "HomeAutomation.TurnOff",
+                    "confidenceScore": 0.8431633
+                },
+                {
+                    "category": "None",
+                    "confidenceScore": 0.782861
+                }
+            ],
+            "entities": [
+                {
+                    "category": "HomeAutomation.DeviceType",
+                    "text": "lights",
+                    "offset": 12,
+                    "length": 6,
+                    "confidenceScore": 1,
+                    "extraInformation": [
+                        {
+                            "extraInformationKind": "ListKey",
+                            "key": "light"
+                        }
+                    ]
+                }
+            ]
+        }
+    }
+}
+```
 
 ## Remarks
 Now that you've completed the quickstart, here are some additional considerations:

articles/cognitive-services/Speech-Service/includes/quickstarts/intent-recognition-clu/deploy-clu-model.md

Lines changed: 1 addition & 1 deletion

@@ -13,7 +13,7 @@ Go to the [Language Studio](https://aka.ms/languageStudio) and sign in with your
 
 ### Create a conversational language understanding project
 
-For this quickstart, you can download [this sample project](https://go.microsoft.com/fwlink/?linkid=2196152) and import it. This project can predict the intended commands from user input, such as: reading emails, deleting emails, and attaching a document to an email.
+For this quickstart, you can download [this sample home automation project](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/language-service/CLU/HomeAutomationDemo.json) and import it. This project can predict the intended commands from user input, such as turning lights on and off.
 
 [!INCLUDE [Import project](../../../../language-service/conversational-language-understanding/includes/language-studio/import-project.md)]
 

articles/cognitive-services/Speech-Service/intent-recognition.md

Lines changed: 2 additions & 2 deletions

@@ -13,11 +13,11 @@ keywords: intent recognition
 
 # What is intent recognition?
 
-In this overview, you will learn about the benefits and capabilities of intent recognition. The Cognitive Services Speech SDK provides two ways to recognize intents, both described below. An intent is something the user wants to do: book a flight, check the weather, or make a call. Using intent recognition, your applications, tools, and devices can determine what the user wants to initiate or do based on options you define in the Intent Recognizer or LUIS.
+In this overview, you will learn about the benefits and capabilities of intent recognition. The Cognitive Services Speech SDK provides two ways to recognize intents, both described below. An intent is something the user wants to do: book a flight, check the weather, or make a call. Using intent recognition, your applications, tools, and devices can determine what the user wants to initiate or do based on options you define in the Intent Recognizer or Conversational Language Understanding (CLU) model.
 
 ## Pattern matching
 
-The Speech SDK provides an embedded pattern matcher that you can use to recognize intents in a very strict way. This is useful for when you need a quick offline solution. This works especially well when the user is going to be trained in some way or can be expected to use specific phrases to trigger intents. For example: "Go to floor seven", or "Turn on the lamp" etc. It is recommended to start here and if it no longer meets your needs, switch to using LUIS or a combination of the two.
+The Speech SDK provides an embedded pattern matcher that you can use to recognize intents in a very strict way. This is useful for when you need a quick offline solution. This works especially well when the user is going to be trained in some way or can be expected to use specific phrases to trigger intents. For example: "Go to floor seven", or "Turn on the lamp" etc. It is recommended to start here and if it no longer meets your needs, switch to using [CLU](#conversational-language-understanding) or a combination of the two.
 
 Use pattern matching if:
 * You're only interested in matching strictly what the user said. These patterns match more aggressively than [conversational language understanding (CLU)](/azure/cognitive-services/language-service/conversational-language-understanding/overview).
