else if (result->Reason == ResultReason::RecognizedSpeech)
@@ -152,14 +148,56 @@ Follow these steps to create a new console application and install the Speech SD
Speak into your microphone when prompted. What you speak should be output as text:

```console
-Say something ...
-RECOGNIZED: Text=Go ahead and delete the e-mail.
-    Intent Id: Delete.
-    Language Understanding JSON:
+Speak into your microphone.
+RECOGNIZED: Text=Turn on the lights.
+    Intent Id: HomeAutomation.TurnOn.
+    Language Understanding JSON: {"kind":"ConversationResult","result":{"query":"turn on the lights","prediction":{"topIntent":"HomeAutomation.TurnOn","projectKind":"Conversation","intents":[{"category":"HomeAutomation.TurnOn","confidenceScore":0.97712576},{"category":"HomeAutomation.TurnOff","confidenceScore":0.8431633},{"category":"None","confidenceScore":0.782861}],"entities":[{"category":"HomeAutomation.DeviceType","text":"lights","offset":12,"length":6,"confidenceScore":1,"extraInformation":[{"extraInformationKind":"ListKey","key":"light"}]}]}}}.
```

-> [!NOTE]
-> There is a known issue with the LanguageUnderstandingServiceResponse_JsonResult property when used with CLU in the Speech SDK version 1.25. You can get detailed JSON output in a future release. Via JSON, the intents are returned in the probability order of most likely to least likely. For example, the `topIntent` might be `Delete` with a confidence score of 0.95413816 (95.41%). The second most likely intent might be `Cancel` with a confidence score of 0.8985081 (89.85%).
+
+The intents are returned in the probability order of most likely to least likely. Here's a formatted version of the JSON output where the `topIntent` is `HomeAutomation.TurnOn` with a confidence score of 0.97712576 (97.71%). The second most likely intent is `HomeAutomation.TurnOff` with a confidence score of 0.8431633 (84.31%).
+
+```json
+{
+    "kind": "ConversationResult",
+    "result": {
+        "query": "turn on the lights",
+        "prediction": {
+            "topIntent": "HomeAutomation.TurnOn",
+            "projectKind": "Conversation",
+            "intents": [
+                {
+                    "category": "HomeAutomation.TurnOn",
+                    "confidenceScore": 0.97712576
+                },
+                {
+                    "category": "HomeAutomation.TurnOff",
+                    "confidenceScore": 0.8431633
+                },
+                {
+                    "category": "None",
+                    "confidenceScore": 0.782861
+                }
+            ],
+            "entities": [
+                {
+                    "category": "HomeAutomation.DeviceType",
+                    "text": "lights",
+                    "offset": 12,
+                    "length": 6,
+                    "confidenceScore": 1,
+                    "extraInformation": [
+                        {
+                            "extraInformationKind": "ListKey",
+                            "key": "light"
+                        }
+                    ]
+                }
+            ]
+        }
+    }
+}
+```

## Remarks
Now that you've completed the quickstart, here are some additional considerations:
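
As a quick sanity check on the example output above, the raw JSON string can be parsed to confirm that the intents arrive sorted from most to least likely. The following C# sketch is illustrative only and is not part of this change: it uses System.Text.Json (which ships with .NET) against an abbreviated copy of the payload, the class name `CluJsonDemo` is invented for the example, and the raw string literal requires C# 11 (.NET 7) or later.

```csharp
using System;
using System.Text.Json;

class CluJsonDemo
{
    static void Main()
    {
        // Abbreviated copy of the payload printed by the quickstart
        // (entities omitted for brevity).
        string json = """
        {
          "kind": "ConversationResult",
          "result": {
            "query": "turn on the lights",
            "prediction": {
              "topIntent": "HomeAutomation.TurnOn",
              "projectKind": "Conversation",
              "intents": [
                { "category": "HomeAutomation.TurnOn", "confidenceScore": 0.97712576 },
                { "category": "HomeAutomation.TurnOff", "confidenceScore": 0.8431633 },
                { "category": "None", "confidenceScore": 0.782861 }
              ],
              "entities": []
            }
          }
        }
        """;

        using JsonDocument doc = JsonDocument.Parse(json);
        JsonElement prediction = doc.RootElement
            .GetProperty("result")
            .GetProperty("prediction");

        Console.WriteLine($"Top intent: {prediction.GetProperty("topIntent").GetString()}");

        // The intents array is sorted from most to least likely,
        // so the first element always matches topIntent.
        foreach (JsonElement intent in prediction.GetProperty("intents").EnumerateArray())
        {
            Console.WriteLine($"  {intent.GetProperty("category").GetString()}: "
                + $"{intent.GetProperty("confidenceScore").GetDouble():P2}");
        }
    }
}
```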
-    // There is a known issue with the LanguageUnderstandingServiceResponse_JsonResult
-    // property when used with CLU in the Speech SDK version 1.25.
-    // The following should return JSON in a future release.
    Console.WriteLine($"    Language Understanding JSON: {recognitionResult.Properties.GetProperty(PropertyId.LanguageUnderstandingServiceResponse_JsonResult)}.");
}
else if (recognitionResult.Reason == ResultReason.RecognizedSpeech)
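
The three comment lines deleted above documented a Speech SDK 1.25 limitation where this property returned no JSON for CLU. For projects pinned to older SDK versions, a defensive wrapper could still be useful; this is a hedged sketch under that assumption, where `CluResultExtensions` and `GetCluJsonOrFallback` are illustrative names rather than SDK APIs:

```csharp
using Microsoft.CognitiveServices.Speech;

// Illustrative helper, not an SDK API: returns a readable fallback when
// the JSON property is empty, as it was for CLU in Speech SDK 1.25.
static class CluResultExtensions
{
    public static string GetCluJsonOrFallback(this RecognitionResult result)
    {
        string json = result.Properties.GetProperty(
            PropertyId.LanguageUnderstandingServiceResponse_JsonResult);
        return string.IsNullOrEmpty(json)
            ? "(no JSON returned; CLU JSON requires a Speech SDK version newer than 1.25)"
            : json;
    }
}
```

With this in place, the `Console.WriteLine` above could log `recognitionResult.GetCluJsonOrFallback()` instead of the raw property value.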
@@ -136,14 +132,55 @@ dotnet run
Speak into your microphone when prompted. What you speak should be output as text:

```console
-Say something ...
-RECOGNIZED: Text=Go ahead and delete the e-mail.
-    Intent Id: Delete.
-    Language Understanding JSON:
+Speak into your microphone.
+RECOGNIZED: Text=Turn on the lights.
+    Intent Id: HomeAutomation.TurnOn.
+    Language Understanding JSON: {"kind":"ConversationResult","result":{"query":"turn on the lights","prediction":{"topIntent":"HomeAutomation.TurnOn","projectKind":"Conversation","intents":[{"category":"HomeAutomation.TurnOn","confidenceScore":0.97712576},{"category":"HomeAutomation.TurnOff","confidenceScore":0.8431633},{"category":"None","confidenceScore":0.782861}],"entities":[{"category":"HomeAutomation.DeviceType","text":"lights","offset":12,"length":6,"confidenceScore":1,"extraInformation":[{"extraInformationKind":"ListKey","key":"light"}]}]}}}.
```

-> [!NOTE]
-> There is a known issue with the LanguageUnderstandingServiceResponse_JsonResult property when used with CLU in the Speech SDK version 1.25. You can get detailed JSON output in a future release. Via JSON, the intents are returned in the probability order of most likely to least likely. For example, the `topIntent` might be `Delete` with a confidence score of 0.95413816 (95.41%). The second most likely intent might be `Cancel` with a confidence score of 0.8985081 (89.85%).
+The intents are returned in the probability order of most likely to least likely. Here's a formatted version of the JSON output where the `topIntent` is `HomeAutomation.TurnOn` with a confidence score of 0.97712576 (97.71%). The second most likely intent is `HomeAutomation.TurnOff` with a confidence score of 0.8431633 (84.31%).
+
+```json
+{
+    "kind": "ConversationResult",
+    "result": {
+        "query": "turn on the lights",
+        "prediction": {
+            "topIntent": "HomeAutomation.TurnOn",
+            "projectKind": "Conversation",
+            "intents": [
+                {
+                    "category": "HomeAutomation.TurnOn",
+                    "confidenceScore": 0.97712576
+                },
+                {
+                    "category": "HomeAutomation.TurnOff",
+                    "confidenceScore": 0.8431633
+                },
+                {
+                    "category": "None",
+                    "confidenceScore": 0.782861
+                }
+            ],
+            "entities": [
+                {
+                    "category": "HomeAutomation.DeviceType",
+                    "text": "lights",
+                    "offset": 12,
+                    "length": 6,
+                    "confidenceScore": 1,
+                    "extraInformation": [
+                        {
+                            "extraInformationKind": "ListKey",
+                            "key": "light"
+                        }
+                    ]
+                }
+            ]
+        }
+    }
+}
+```

## Remarks
Now that you've completed the quickstart, here are some additional considerations:
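
One detail of the `entities` array above that the output doesn't spell out: `offset` and `length` are character positions into the recognized query, so the entity text can be recovered with a substring. A self-contained illustration (the `EntitySpanDemo` name is invented for this sketch), using the values from the example JSON:

```csharp
using System;

class EntitySpanDemo
{
    static void Main()
    {
        // Values taken from the example JSON above.
        string query = "turn on the lights";
        int offset = 12;
        int length = 6;

        // Substring(12, 6) recovers the entity text.
        Console.WriteLine(query.Substring(offset, length)); // prints: lights
    }
}
```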
articles/cognitive-services/Speech-Service/includes/quickstarts/intent-recognition-clu/deploy-clu-model.md: 1 addition & 1 deletion
@@ -13,7 +13,7 @@ Go to the [Language Studio](https://aka.ms/languageStudio) and sign in with your

### Create a conversational language understanding project

-For this quickstart, you can download [this sample project](https://go.microsoft.com/fwlink/?linkid=2196152) and import it. This project can predict the intended commands from user input, such as: reading emails, deleting emails, and attaching a document to an email.
+For this quickstart, you can download [this sample home automation project](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/language-service/CLU/HomeAutomationDemo.json) and import it. This project can predict the intended commands from user input, such as turning lights on and off.