---
author: trrwilson
ms.service: cognitive-services
ms.topic: include
ms.date: 03/20/2020
ms.author: travisw
---
## Prerequisites

Before you get started, make sure to:

> [!div class="checklist"]
> * [Create an Azure Speech resource](~/articles/cognitive-services/speech-service/get-started.md)
> * [Set up your development environment and create an empty project](~/articles/cognitive-services/speech-service/quickstarts/setup-platform.md?tabs=uwp)
> * Create a bot connected to the [Direct Line Speech channel](https://docs.microsoft.com/azure/bot-service/bot-service-channel-connect-directlinespeech)
> * Make sure that you have access to a microphone for audio capture

> [!NOTE]
> Refer to [the list of supported regions for voice assistants](~/articles/cognitive-services/speech-service/regions.md#voice-assistants) and ensure that your resources are deployed in one of those regions.

## Open your project in Visual Studio

The first step is to make sure that you have your project open in Visual Studio.

Now add the XAML code that defines the user interface of the application, and add the C# code-behind implementation.

Let's add some code that works as a skeleton for our project.

1. In **Solution Explorer**, open `MainPage.xaml`.

1. In the designer's XAML view, replace the entire contents with the following snippet that defines a rudimentary user interface:

```xml
<Page
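    x:Class="helloworld.MainPage"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    Background="{ThemeResource ApplicationPageBackgroundThemeBrush}">

    <!-- The full markup is elided in this excerpt. This minimal sketch assumes a
         project named "helloworld", and the control names below are illustrative
         stand-ins for the layout in the complete sample. -->
    <StackPanel Orientation="Vertical" HorizontalAlignment="Center" VerticalAlignment="Center">
        <!-- Button wired to the microphone-permission handler in the code-behind -->
        <Button x:Name="EnableMicrophoneButton" Content="Enable microphone" Click="EnableMicrophone_ButtonClicked"/>
        <!-- Button wired to the listening handler in the code-behind -->
        <Button x:Name="ListenButton" Content="Talk to your bot" Click="ListenButton_ButtonClicked"/>
        <!-- Text area that displays status messages and transcribed speech -->
        <TextBlock x:Name="StatusBlock" TextWrapping="Wrap"/>
    </StackPanel>
</Page>
```
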
The Design view is updated to show the application's user interface.

1. In **Solution Explorer**, open the code-behind source file `MainPage.xaml.cs`. (It's grouped under `MainPage.xaml`.) Replace the contents of the file with the following code, which includes:

- `using` statements for the `Speech` and `Speech.Dialog` namespaces
- A simple implementation to ensure microphone access, wired to a button handler
- A helper to play back text-to-speech (without streaming support)
- An empty button handler to start listening that will be populated later
```csharp
using Microsoft.CognitiveServices.Speech;
using Microsoft.CognitiveServices.Speech.Audio;
using Microsoft.CognitiveServices.Speech.Dialog;

// ... (the remaining using statements and the MainPage class implementation are
// omitted in this excerpt; the complete file is available in the quickstart
// sample) ...
}
}
```
1. Add the following code snippet to the method body of `InitializeDialogServiceConnector`. This code creates the `DialogServiceConnector` with your subscription information.
```csharp
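// This is a reconstructed sketch of the elided snippet. It assumes a `connector`
// field on the class, and the two placeholder strings are the values you replace
// in the next step.
const string speechSubscriptionKey = "YourSpeechSubscriptionKey";
const string region = "YourServiceRegion";

// Create a BotFrameworkConfig from the subscription key and region
var botConfig = BotFrameworkConfig.FromSubscription(speechSubscriptionKey, region);

// Create the DialogServiceConnector that communicates with your bot
connector = new DialogServiceConnector(botConfig);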
```
> [!NOTE]
> Refer to [the list of supported regions for voice assistants](~/articles/cognitive-services/speech-service/regions.md#voice-assistants) and ensure that your resources are deployed in one of those regions.
> [!NOTE]
> For information on configuring your bot, see the Bot Framework documentation for [the Direct Line Speech channel](https://docs.microsoft.com/azure/bot-service/bot-service-channel-connect-directlinespeech).

1. Replace the strings `YourSpeechSubscriptionKey` and `YourServiceRegion` with your own values for your speech subscription and [region](~/articles/cognitive-services/speech-service/regions.md).
1. Append the following code snippet to the end of the method body of `InitializeDialogServiceConnector`. This code sets up handlers for events relied on by `DialogServiceConnector` to communicate its bot activities, speech recognition results, and other information.
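
   The full snippet with UI updates is omitted in this excerpt; a trimmed sketch of the handler wiring, which assumes the `NotifyUser` and `SynchronouslyPlayActivityAudio` helpers described in the code-behind skeleton, looks like this:

```csharp
// ActivityReceived is the main way your bot communicates with the client;
// it delivers Bot Framework activities, optionally with synthesized audio
connector.ActivityReceived += (sender, activityReceivedEventArgs) =>
{
    NotifyUser($"Activity received, hasAudio={activityReceivedEventArgs.HasAudio} activity={activityReceivedEventArgs.Activity}");

    if (activityReceivedEventArgs.HasAudio)
    {
        SynchronouslyPlayActivityAudio(activityReceivedEventArgs.Audio);
    }
};

// Canceled is signaled when a turn is aborted or encounters an error
connector.Canceled += (sender, canceledEventArgs) =>
{
    NotifyUser($"Canceled, reason={canceledEventArgs.Reason}");
    if (canceledEventArgs.Reason == CancellationReason.Error)
    {
        NotifyUser($"Error: code={canceledEventArgs.ErrorCode}, details={canceledEventArgs.ErrorDetails}");
    }
};

// Recognizing provides intermediate recognized text while audio is streaming
connector.Recognizing += (sender, recognitionEventArgs) =>
{
    NotifyUser($"Recognizing! in-progress text={recognitionEventArgs.Result.Text}");
};

// Recognized provides the final recognized text once audio capture completes
connector.Recognized += (sender, recognitionEventArgs) =>
{
    NotifyUser($"Final speech-to-text result: '{recognitionEventArgs.Result.Text}'");
};
```
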
1. Select **Talk to your bot**, and speak an English phrase or sentence into your device's microphone. Your speech is transmitted to the Direct Line Speech channel and transcribed to text, which appears in the window.

---
title: 'Quickstart: Create a custom voice assistant - Speech service'
titleSuffix: Azure Cognitive Services
services: cognitive-services
author: trrwilson
manager: nitinme
ms.service: cognitive-services
ms.subservice: speech-service
ms.topic: include
ms.date: 03/20/2020
ms.author: travisw
---

In this quickstart, you will use the [Speech SDK](~/articles/cognitive-services/speech-service/speech-sdk.md) to create a custom voice assistant application that connects to a bot that you have already authored and configured. If you need to create a bot, see [the related tutorial](~/articles/cognitive-services/speech-service/tutorial-voice-enable-your-bot-speech-sdk.md) for a more comprehensive guide.
After satisfying a few prerequisites, connecting your custom voice assistant takes only a few steps:
> [!div class="checklist"]
> * Create a `BotFrameworkConfig` object from your subscription key and region.
> * Create a `DialogServiceConnector` object using the `BotFrameworkConfig` object from above.
> * Using the `DialogServiceConnector` object, start the listening process for a single utterance.
> * Inspect the `ActivityReceivedEventArgs` returned.

If you prefer to jump right in, view or download all [Speech SDK C# Samples](https://aka.ms/speech/github-csharp) on GitHub. Otherwise, let's get started.
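
A rough end-to-end sketch of those steps in C# follows; the console host around the calls and the placeholder subscription values are illustrative assumptions, not part of the quickstart itself.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.CognitiveServices.Speech.Dialog;

class VoiceAssistantSketch
{
    static async Task Main()
    {
        // Create a BotFrameworkConfig from your subscription key and region
        var config = BotFrameworkConfig.FromSubscription("YourSubscriptionKey", "YourServiceRegion");

        // Create a DialogServiceConnector from the config; by default it
        // captures audio from the system microphone
        using var connector = new DialogServiceConnector(config);

        // Inspect each ActivityReceivedEventArgs the bot returns: a Bot Framework
        // activity serialized as JSON, optionally carrying synthesized audio
        connector.ActivityReceived += (_, e) =>
            Console.WriteLine($"Activity received (hasAudio={e.HasAudio}): {e.Activity}");

        // Start the listening process for a single utterance
        var result = await connector.ListenOnceAsync();
        Console.WriteLine($"Recognized: {result.Text}");
    }
}
```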