articles/stream-analytics/stream-analytics-define-kafka-input.md

author: enkrumah
ms.author: ebnkruma
ms.service: stream-analytics
ms.topic: conceptual
ms.date: 11/02/2023
---

# Stream data from Kafka into Azure Stream Analytics (Preview)
The following table lists the property names and their description for creating a Kafka input:

> [!IMPORTANT]
> To configure your Kafka cluster as an input, the timestamp type of the input topic should be **LogAppendTime**. The only timestamp type Azure Stream Analytics supports is **LogAppendTime**.
>
> Azure Stream Analytics supports only numerical decimal format.

| Property name | Description |
|---------------|-------------|
| Input/Output Alias | A friendly name used in queries to reference your input or output. |
| Bootstrap server addresses | A list of host/port pairs to establish the connection to the Kafka cluster. |
| Kafka topic | A named, ordered, and partitioned stream of data that allows for publish-subscribe and event-driven processing of messages. |
| Security Protocol | How you want to connect to your Kafka cluster. Azure Stream Analytics supports mTLS, SASL_SSL, SASL_PLAINTEXT, or None. |
| Event Serialization format | The serialization format (JSON, CSV, Avro, Parquet, Protobuf) of the incoming data stream. |
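The decimal-format restriction above can be guarded against on the producer side. A minimal sketch, assuming the restriction means plain decimal notation (no scientific exponents such as `2.15e1`) in JSON payloads; the helper name is ours, not part of Azure Stream Analytics:

```python
import json
import re

# Plain decimal notation: optional sign, digits, optional fractional part.
PLAIN_DECIMAL = re.compile(r"^-?\d+(\.\d+)?$")

def has_only_plain_decimals(payload: str) -> bool:
    """True if every number token in the JSON payload uses plain decimal notation."""
    ok = True

    def watch(token: str) -> float:
        # json.loads hands the raw number token to these hooks,
        # so we can inspect its textual form before conversion.
        nonlocal ok
        if not PLAIN_DECIMAL.match(token):
            ok = False  # scientific notation such as 2.15e1 is rejected
        return float(token)

    json.loads(payload, parse_float=watch, parse_int=watch)
    return ok

print(has_only_plain_decimals('{"temp": 21.5, "id": 7}'))  # True
print(has_only_plain_decimals('{"temp": 2.15e1}'))         # False
```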
You can use four types of security protocols to connect to your Kafka clusters:

### Connect to Confluent Cloud using API key

The ASA Kafka input is a librdkafka-based client. To connect to Confluent Cloud, you need the TLS certificates that Confluent Cloud uses for server authentication.
Confluent uses TLS certificates from Let’s Encrypt, an open certificate authority (CA). You can download the ISRG Root X1 certificate in PEM format from the [Let’s Encrypt](https://letsencrypt.org/certificates/) site.

> [!IMPORTANT]
> You must use Azure CLI to upload the certificate as a secret to your key vault. You can't use the Azure portal to upload a certificate that has multiline secrets to key vault.
> The default timestamp type for a topic in a Confluent Cloud Kafka cluster is **CreateTime**. Make sure you update it to **LogAppendTime**.
> Azure Stream Analytics supports only numerical decimal format.

To authenticate using the API key Confluent offers, you must use the SASL_SSL protocol and complete the configuration as follows:
| Setting | Description |
|---------|-------------|
| Username | Key/Username from the API key |
| Password | Secret/Password from the API key |
| KeyVault | Name of the Azure key vault with the uploaded certificate from Let’s Encrypt |
| Certificate | Name of the certificate uploaded to Key Vault, downloaded from Let’s Encrypt (download the ISRG Root X1 certificate in PEM format). Note: you must upload the certificate as a secret using Azure CLI. Refer to the **Key vault integration** section below. |

> [!NOTE]
> Depending on how your Confluent Cloud Kafka cluster is configured, you may need a certificate different from the standard certificate Confluent Cloud uses for server authentication. Confirm with the admin of the Confluent Cloud Kafka cluster to verify which certificate to use.
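As a rough illustration (not the ASA configuration itself), the SASL_SSL settings in this table correspond to librdkafka-style client properties, since the ASA Kafka input is librdkafka-based. The values below are placeholders, not real endpoints or credentials:

```python
# Sketch only: assemble the table's SASL_SSL settings into a
# librdkafka-style configuration dict (placeholder values throughout).
def confluent_sasl_ssl_config(bootstrap: str, api_key: str, api_secret: str) -> dict:
    return {
        "bootstrap.servers": bootstrap,    # Bootstrap server addresses
        "security.protocol": "SASL_SSL",   # Security Protocol from the table
        "sasl.mechanism": "PLAIN",         # Confluent Cloud API keys use SASL/PLAIN
        "sasl.username": api_key,          # Username: key from the API key
        "sasl.password": api_secret,       # Password: secret from the API key
    }

config = confluent_sasl_ssl_config(
    "pkc-xxxxx.westus2.azure.confluent.cloud:9092",  # placeholder broker address
    "MY_API_KEY",
    "MY_API_SECRET",
)
print(config["security.protocol"])  # SASL_SSL
```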
## Key vault integration
Certificates are stored as secrets in the key vault and must be in PEM format.

### Configure Key vault with permissions

You can create a key vault resource by following the documentation [Quickstart: Create a key vault using the Azure portal](../key-vault/general/quick-create-portal.md).
To upload certificates, you must have "**Key Vault Administrator**" access to your key vault.
Follow these steps to grant admin access:
> [!NOTE]
> You must have "**Owner**" permissions to grant other key vault permissions.

> [!IMPORTANT]
> You must have "**Key Vault Administrator**" permissions on your key vault for this command to work properly.
> You must upload the certificate as a secret. You must use Azure CLI to upload certificates as secrets to your key vault.
> Your Azure Stream Analytics job will fail when the certificate used for authentication expires. To resolve this, you must update/replace the certificate in your key vault and restart your Azure Stream Analytics job.
Make sure you have Azure CLI configured locally with PowerShell.
You can visit this page to get guidance on setting up Azure CLI: [Get started with Azure CLI](https://learn.microsoft.com/cli/azure/get-started-with-azure-cli#how-to-sign-into-the-azure-cli)

**Log in to Azure CLI:**
```PowerShell
az login
```

**Connect to the subscription containing your key vault:**
```PowerShell
az account set --subscription <subscription name>
```

**The following command can upload the certificate as a secret to your key vault:**

`<your key vault>` is the name of the key vault you want to upload the certificate to. `<name of the secret>` is any name you want to give to your secret and how it shows up in the key vault. Note the name; you will use it to configure your Kafka input in your ASA job. `<file path to certificate>` is the path to where you downloaded your certificate.

```PowerShell
az keyvault secret set --vault-name <your key vault> --name <name of the secret> --file <file path to certificate>
```
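Because the secret must contain a PEM-format certificate, it can help to sanity-check the downloaded file before uploading it. A minimal sketch using a heuristic helper of our own (not part of Azure CLI or Key Vault):

```python
import re

def looks_like_pem_certificate(text: str) -> bool:
    """Heuristic: does the text contain at least one PEM-encoded certificate block?"""
    blocks = re.findall(
        r"-----BEGIN CERTIFICATE-----\r?\n[A-Za-z0-9+/=\r\n]+-----END CERTIFICATE-----",
        text,
    )
    return len(blocks) >= 1

# A truncated PEM body, for illustration only (not a real certificate).
sample = (
    "-----BEGIN CERTIFICATE-----\n"
    "MIIBszCCAVmgAwIBAgIUX9Yt\n"
    "-----END CERTIFICATE-----\n"
)
print(looks_like_pem_certificate(sample))        # True
print(looks_like_pem_certificate("not a cert"))  # False
```

A DER-encoded (binary) download would fail this check and should be converted to PEM before being uploaded as a secret.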