description: Get answers to the most popular questions about voice assistants using Custom Commands (Preview) or the Direct Line Speech channel.
services: cognitive-services
ms.date: 11/05/2019
ms.author: travisw
---
# Voice assistants frequently asked questions
If you can't find answers to your questions in this document, check out [other support options](support.md).
**A:** The best way to get started is to create a Custom Commands (Preview) application or a basic Bot Framework bot:
- [Create a Custom Commands (Preview) application](quickstart-custom-speech-commands-create-new.md)
- [Create a basic Bot Framework bot](https://docs.microsoft.com/azure/bot-service/bot-builder-tutorial-basic-deploy?view=azure-bot-service-4.0)
- [Connect a bot to the Direct Line Speech channel](https://docs.microsoft.com/azure/bot-service/bot-service-channel-connect-directlinespeech)
## Debugging
The latest version of Direct Line Speech simplifies the process of contacting your bot from a device. On the channel registration page, the drop-down at the top associates your Direct Line Speech channel registration with a speech resource. Once associated, the v1.8 Speech SDK includes a `BotFrameworkConfig::FromSubscription` factory method that will configure a `DialogServiceConnector` to contact the bot you've associated with your subscription.
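As a minimal sketch of this v1.8 pattern (assuming the C++ Speech SDK is installed and linked; the key and region strings are placeholders you replace with your own values):

```cpp
// Sketch only: assumes Speech SDK v1.8+ and a speech resource already
// associated with your Direct Line Speech channel registration.
#include <speechapi_cxx.h>

using namespace Microsoft::CognitiveServices::Speech;
using namespace Microsoft::CognitiveServices::Speech::Dialog;

int main()
{
    // No channel secret needed, unlike v1.7's FromBotSecret -- the
    // service-side association with your channel registration is used.
    auto config = BotFrameworkConfig::FromSubscription(
        "YourSubscriptionKey", "YourServiceRegion");
    auto connector = DialogServiceConnector::FromConfig(config);
    connector->ConnectAsync().get();
    return 0;
}
```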
If you're still migrating your client application from v1.7 to v1.8, `DialogServiceConfig::FromBotSecret` may continue to work with a non-empty, non-null value for its channel secret parameter, e.g. the previous secret you used. It will simply be ignored when using a speech subscription associated with a newer channel registration. Please note that the value _must_ be non-null and non-empty, as these are checked for on the device before the service-side association is relevant.
For a more detailed guide, please see the [tutorial section](tutorial-voice-enable-your-bot-speech-sdk.md#register-the-direct-line-speech-channel) that walks through channel registration.
**Q: I get a 401 error when connecting and nothing works. I know my speech subscription key is valid. What's going on?**
**A:** When managing your subscription on the Azure portal, please ensure you're using the **Speech** resource (Microsoft.CognitiveServicesSpeechServices, "Speech") and _not_ the **Cognitive Services** resource (Microsoft.CognitiveServicesAllInOne, "All Cognitive Services"). Also, please check [Speech service region support for voice assistants](regions.md#voice-assistants).

**Q: I get recognition text back from my `DialogServiceConnector`, but I see a '1011' error and nothing from my bot. Why?**
**A:** This error indicates a communication problem between your assistant and the voice assistant service.
- For Custom Commands (Preview), ensure that your Custom Commands (Preview) application is published.
- For Direct Line Speech, ensure that you've [connected your bot to the Direct Line Speech channel](https://docs.microsoft.com/azure/bot-service/bot-service-channel-connect-directlinespeech), [added Streaming protocol support](https://aka.ms/botframework/addstreamingprotocolsupport) to your bot (with the related Web Socket support), and then check that your bot is responding to incoming requests from the channel.
**Q: This code still doesn't work and/or I'm getting a different error when using a `DialogServiceConnector`. What should I do?**
**A:** File-based logging provides substantially more detail and can help accelerate support requests. To enable this functionality, see [how to use file logging](how-to-use-logging.md).
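As a brief sketch of what enabling file-based logging looks like (assuming the C++ Speech SDK; the log file path is illustrative), you set a property on your config object before creating a connector or recognizer:

```cpp
// Sketch only: enables SDK file logging on a configuration object.
// The log file path below is illustrative.
#include <speechapi_cxx.h>

using namespace Microsoft::CognitiveServices::Speech;

void EnableFileLogging(const std::shared_ptr<SpeechConfig>& config)
{
    // All subsequent SDK activity on objects built from this config
    // is written to the named file.
    config->SetProperty(PropertyId::Speech_LogFilename, "speech-sdk.log");
}
```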
articles/cognitive-services/Speech-Service/how-to-automatic-language-detection.md (3 additions, 3 deletions)
---
title: How to use automatic language detection for speech to text
titleSuffix: Azure Cognitive Services
description: The Speech SDK supports automatic language detection for speech to text. When using this feature, the audio provided is compared against a provided list of languages, and the most likely match is determined. The returned value can then be used to select the language model used for speech to text.
services: cognitive-services
Automatic language detection currently has a service-side limit of two languages per detection. Keep this limitation in mind when constructing your `AutoDetectSourceLanguageConfig` object. In the samples below, you'll create an `AutoDetectSourceLanguageConfig`, then use it to construct a `SpeechRecognizer`.
> [!TIP]
> You can also specify a custom model to use when performing speech to text. For more information, see [Use a custom model for automatic language detection](#use-a-custom-model-for-automatic-language-detection).
The following snippets illustrate how to use automatic language detection in your apps:
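For example, a C++ sketch (assuming the Speech SDK is installed, with placeholder key, region, and a local `sample.wav` input file) might look like this:

```cpp
// Sketch only: the audio is compared against the two candidate languages
// below, and the most likely match is returned with the recognition result.
#include <speechapi_cxx.h>

using namespace Microsoft::CognitiveServices::Speech;
using namespace Microsoft::CognitiveServices::Speech::Audio;

int main()
{
    auto speechConfig = SpeechConfig::FromSubscription(
        "YourSubscriptionKey", "YourServiceRegion");

    // Service-side limit: at most two candidate languages per detection.
    auto autoDetectConfig =
        AutoDetectSourceLanguageConfig::FromLanguages({ "en-US", "de-DE" });

    auto audioConfig = AudioConfig::FromWavFileInput("sample.wav");
    auto recognizer = SpeechRecognizer::FromConfig(
        speechConfig, autoDetectConfig, audioConfig);

    auto result = recognizer->RecognizeOnceAsync().get();
    auto detected = AutoDetectSourceLanguageResult::FromResult(result);
    // detected->Language holds the detected language, e.g. "en-US".
    return 0;
}
```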