else if (result->Reason == ResultReason::RecognizedSpeech)
@@ -152,14 +148,58 @@ Follow these steps to create a new console application and install the Speech SDK
 Speak into your microphone when prompted. What you speak should be output as text:
 
 ```console
-Say something ...
-RECOGNIZED: Text=Go ahead and delete the e-mail.
-Intent Id: Delete.
-Language Understanding JSON:
+Speak into your microphone.
+RECOGNIZED: Text=Turn on the lights.
+Intent Id: HomeAutomation.TurnOn.
+Language Understanding JSON: {"kind":"ConversationResult","result":{"query":"turn on the lights","prediction":{"topIntent":"HomeAutomation.TurnOn","projectKind":"Conversation","intents":[{"category":"HomeAutomation.TurnOn","confidenceScore":0.97712576},{"category":"HomeAutomation.TurnOff","confidenceScore":0.8431633},{"category":"None","confidenceScore":0.782861}],"entities":[{"category":"HomeAutomation.DeviceType","text":"lights","offset":12,"length":6,"confidenceScore":1,"extraInformation":[{"extraInformationKind":"ListKey","key":"light"}]}]}}}.
 ```
 
-> [NOTE]
-> There is a known issue with the LanguageUnderstandingServiceResponse_JsonResult property when used with CLU in the Speech SDK version 1.25. You can get detailed JSON output in a future release. Via JSON, the intents are returned in the probability order of most likely to least likely. For example, the `topIntent` might be `Delete` with a confidence score of 0.95413816 (95.41%). The second most likely intent might be `Cancel` with a confidence score of 0.8985081 (89.85%).
+> [!NOTE]
+> Support for the JSON response for CLU via the LanguageUnderstandingServiceResponse_JsonResult property was added in the Speech SDK version 1.26.
+
+The intents are returned in the probability order of most likely to least likely. Here's a formatted version of the JSON output, where the `topIntent` is `HomeAutomation.TurnOn` with a confidence score of 0.97712576 (97.71%). The second most likely intent is `HomeAutomation.TurnOff` with a confidence score of 0.8431633 (84.31%).
+
+```json
+{
+  "kind": "ConversationResult",
+  "result": {
+    "query": "turn on the lights",
+    "prediction": {
+      "topIntent": "HomeAutomation.TurnOn",
+      "projectKind": "Conversation",
+      "intents": [
+        {
+          "category": "HomeAutomation.TurnOn",
+          "confidenceScore": 0.97712576
+        },
+        {
+          "category": "HomeAutomation.TurnOff",
+          "confidenceScore": 0.8431633
+        },
+        {
+          "category": "None",
+          "confidenceScore": 0.782861
+        }
+      ],
+      "entities": [
+        {
+          "category": "HomeAutomation.DeviceType",
+          "text": "lights",
+          "offset": 12,
+          "length": 6,
+          "confidenceScore": 1,
+          "extraInformation": [
+            {
+              "extraInformationKind": "ListKey",
+              "key": "light"
+            }
+          ]
+        }
+      ]
+    }
+  }
+}
+```
 
 ## Remarks
 
 Now that you've completed the quickstart, here are some additional considerations:
-// There is a known issue with the LanguageUnderstandingServiceResponse_JsonResult
-// property when used with CLU in the Speech SDK version 1.25.
-// The following should return JSON in a future release.
 Console.WriteLine($" Language Understanding JSON: {recognitionResult.Properties.GetProperty(PropertyId.LanguageUnderstandingServiceResponse_JsonResult)}.");
 }
 else if (recognitionResult.Reason == ResultReason.RecognizedSpeech)
@@ -136,14 +132,58 @@ dotnet run
 Speak into your microphone when prompted. What you speak should be output as text:
 
 ```console
-Say something ...
-RECOGNIZED: Text=Go ahead and delete the e-mail.
-Intent Id: Delete.
-Language Understanding JSON:
+Speak into your microphone.
+RECOGNIZED: Text=Turn on the lights.
+Intent Id: HomeAutomation.TurnOn.
+Language Understanding JSON: {"kind":"ConversationResult","result":{"query":"turn on the lights","prediction":{"topIntent":"HomeAutomation.TurnOn","projectKind":"Conversation","intents":[{"category":"HomeAutomation.TurnOn","confidenceScore":0.97712576},{"category":"HomeAutomation.TurnOff","confidenceScore":0.8431633},{"category":"None","confidenceScore":0.782861}],"entities":[{"category":"HomeAutomation.DeviceType","text":"lights","offset":12,"length":6,"confidenceScore":1,"extraInformation":[{"extraInformationKind":"ListKey","key":"light"}]}]}}}.
 ```
 
-> [NOTE]
-> There is a known issue with the LanguageUnderstandingServiceResponse_JsonResult property when used with CLU in the Speech SDK version 1.25. You can get detailed JSON output in a future release. Via JSON, the intents are returned in the probability order of most likely to least likely. For example, the `topIntent` might be `Delete` with a confidence score of 0.95413816 (95.41%). The second most likely intent might be `Cancel` with a confidence score of 0.8985081 (89.85%).
+> [!NOTE]
+> Support for the JSON response for CLU via the LanguageUnderstandingServiceResponse_JsonResult property was added in the Speech SDK version 1.26.
+
+The intents are returned in the probability order of most likely to least likely. Here's a formatted version of the JSON output, where the `topIntent` is `HomeAutomation.TurnOn` with a confidence score of 0.97712576 (97.71%). The second most likely intent is `HomeAutomation.TurnOff` with a confidence score of 0.8431633 (84.31%).
+
+```json
+{
+  "kind": "ConversationResult",
+  "result": {
+    "query": "turn on the lights",
+    "prediction": {
+      "topIntent": "HomeAutomation.TurnOn",
+      "projectKind": "Conversation",
+      "intents": [
+        {
+          "category": "HomeAutomation.TurnOn",
+          "confidenceScore": 0.97712576
+        },
+        {
+          "category": "HomeAutomation.TurnOff",
+          "confidenceScore": 0.8431633
+        },
+        {
+          "category": "None",
+          "confidenceScore": 0.782861
+        }
+      ],
+      "entities": [
+        {
+          "category": "HomeAutomation.DeviceType",
+          "text": "lights",
+          "offset": 12,
+          "length": 6,
+          "confidenceScore": 1,
+          "extraInformation": [
+            {
+              "extraInformationKind": "ListKey",
+              "key": "light"
+            }
+          ]
+        }
+      ]
+    }
+  }
+}
+```
 
 ## Remarks
 
 Now that you've completed the quickstart, here are some additional considerations:
articles/cognitive-services/Speech-Service/includes/quickstarts/intent-recognition-clu/deploy-clu-model.md (1 addition, 1 deletion)
@@ -13,7 +13,7 @@ Go to the [Language Studio](https://aka.ms/languageStudio) and sign in with your
 
 ### Create a conversational language understanding project
 
-For this quickstart, you can download [this sample project](https://go.microsoft.com/fwlink/?linkid=2196152) and import it. This project can predict the intended commands from user input, such as: reading emails, deleting emails, and attaching a document to an email.
+For this quickstart, you can download [this sample home automation project](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/language-service/CLU/HomeAutomationDemo.json) and import it. This project can predict the intended commands from user input, such as turning lights on and off.
articles/cognitive-services/Speech-Service/intent-recognition.md (2 additions, 2 deletions)
@@ -13,11 +13,11 @@ keywords: intent recognition
 
 # What is intent recognition?
 
-In this overview, you will learn about the benefits and capabilities of intent recognition. The Cognitive Services Speech SDK provides two ways to recognize intents, both described below. An intent is something the user wants to do: book a flight, check the weather, or make a call. Using intent recognition, your applications, tools, and devices can determine what the user wants to initiate or do based on options you define in the Intent Recognizer or LUIS.
+In this overview, you will learn about the benefits and capabilities of intent recognition. The Cognitive Services Speech SDK provides two ways to recognize intents, both described below. An intent is something the user wants to do: book a flight, check the weather, or make a call. Using intent recognition, your applications, tools, and devices can determine what the user wants to initiate or do based on options you define in the Intent Recognizer or Conversational Language Understanding (CLU) model.
 
 ## Pattern matching
 
-The Speech SDK provides an embedded pattern matcher that you can use to recognize intents in a very strict way. This is useful for when you need a quick offline solution. This works especially well when the user is going to be trained in some way or can be expected to use specific phrases to trigger intents. For example: "Go to floor seven", or "Turn on the lamp" etc. It is recommended to start here and if it no longer meets your needs, switch to using LUIS or a combination of the two.
+The Speech SDK provides an embedded pattern matcher that you can use to recognize intents in a very strict way. This is useful for when you need a quick offline solution. This works especially well when the user is going to be trained in some way or can be expected to use specific phrases to trigger intents. For example: "Go to floor seven", or "Turn on the lamp" etc. It is recommended to start here and if it no longer meets your needs, switch to using [CLU](#conversational-language-understanding) or a combination of the two.
 
 Use pattern matching if:
 * You're only interested in matching strictly what the user said. These patterns match more aggressively than [conversational language understanding (CLU)](/azure/cognitive-services/language-service/conversational-language-understanding/overview).