Commit 9247216

Merge pull request #257198 from enkrumah/patch-53
Update stream-analytics-define-kafka-input.md
2 parents 8dcbcf4 + c76d64e commit 9247216

1 file changed (+25 −13)

articles/stream-analytics/stream-analytics-define-kafka-input.md

Lines changed: 25 additions & 13 deletions
````diff
@@ -5,7 +5,7 @@ author: enkrumah
 ms.author: ebnkruma
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 10/27/2023
+ms.date: 11/02/2023
 ---

 # Stream data from Kafka into Azure Stream Analytics (Preview)
````
````diff
@@ -26,13 +26,13 @@ The following table lists the property names and their description for creating

 > [!IMPORTANT]
 > To configure your Kafka cluster as an input, the timestamp type of the input topic should be **LogAppendTime**. The only timestamp type Azure Stream Analytics supports is **LogAppendTime**.
->
+> Azure Stream Analytics supports only numerical decimal format.

 | Property name | Description |
 |------------------------------|-------------------------------------------------------------------------------------------------------------------------|
 | Input/Output Alias | A friendly name used in queries to reference your input or output |
 | Bootstrap server addresses | A list of host/port pairs to establish the connection to the Kafka cluster. |
-| Kafka topic | A unit of your Kafka cluster you want to write events to. |
+| Kafka topic | A named, ordered, and partitioned stream of data that allows for the publish-subscribe and event-driven processing of messages.|
 | Security Protocol | How you want to connect to your Kafka cluster. Azure Stream Analytics supports mTLS, SASL_SSL, SASL_PLAINTEXT or None. |
 | Event Serialization format | The serialization format (JSON, CSV, Avro, Parquet, Protobuf) of the incoming data stream. |

````
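Because ASA accepts only **LogAppendTime** while Kafka topics default to **CreateTime**, the input topic usually has to be reconfigured. Here is a minimal dry-run sketch using Apache Kafka's stock `kafka-configs.sh` tool; the broker address and topic name are placeholders, not values from this article, and the command is printed rather than executed since it needs a reachable cluster:

```shell
# Placeholders -- substitute your own broker address and input topic name.
BOOTSTRAP="your-broker:9092"
TOPIC="your-input-topic"

# kafka-configs.sh ships with Apache Kafka; this alteration makes brokers
# stamp records at append time, which is the only mode ASA supports.
CMD="kafka-configs.sh --bootstrap-server $BOOTSTRAP \
--entity-type topics --entity-name $TOPIC \
--alter --add-config message.timestamp.type=LogAppendTime"

# Dry run: print the command instead of running it.
echo "$CMD"
```

Run the printed command against your cluster (or apply the equivalent topic setting in your hosted Kafka provider's console).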

````diff
@@ -56,7 +56,12 @@ You can use four types of security protocols to connect to your Kafka clusters:
 ### Connect to Confluent Cloud using API key

 The ASA Kafka input is a librdkafka-based client, and to connect to Confluent Cloud, you need the TLS certificates that Confluent Cloud uses for server authentication.
-Confluent uses TLS certificates from Let's Encrypt, an open certificate authority (CA) You can download the ISRG Root X1 certificate in PEM format on the site of [LetsEncrypt](https://letsencrypt.org/certificates/).
+Confluent uses TLS certificates from Let's Encrypt, an open certificate authority (CA). You can download the ISRG Root X1 certificate in PEM format from the site of [Let's Encrypt](https://letsencrypt.org/certificates/).
+
+> [!IMPORTANT]
+> You must use the Azure CLI to upload the certificate as a secret to your key vault. You cannot use the Azure portal to upload a certificate that has multiline secrets to a key vault.
+> The default timestamp type for a topic in a Confluent Cloud Kafka cluster is **CreateTime**; make sure you update it to **LogAppendTime**.
+> Azure Stream Analytics supports only numerical decimal format.

 To authenticate using the API key Confluent offers, you must use the SASL_SSL protocol and complete the configuration as follows:

````
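Since the portal can't take the multiline PEM secret and the upload has to go through the CLI from a local file, a quick local sanity check on the downloaded file can save a failed round trip. This sketch (the file name is a placeholder) only verifies the file carries PEM certificate delimiters; it does not validate the certificate chain:

```shell
# Returns 0 if the file has PEM certificate delimiters, non-zero otherwise.
looks_like_pem() {
  grep -q -- "-----BEGIN CERTIFICATE-----" "$1" &&
  grep -q -- "-----END CERTIFICATE-----" "$1"
}

# Placeholder path to the ISRG Root X1 file downloaded from Let's Encrypt.
CERT_FILE="isrgrootx1.pem"

if [ -f "$CERT_FILE" ] && looks_like_pem "$CERT_FILE"; then
  echo "PEM delimiters found: ok to upload as a key vault secret"
else
  echo "not a PEM certificate file (or file missing)"
fi
```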

````diff
@@ -65,8 +70,11 @@ To authenticate using the API Key confluent offers, you must use the SASL_SSL pr
 | Username | Key/username from the API key |
 | Password | Secret/password from the API key |
 | KeyVault | Name of the Azure key vault with the uploaded certificate from Let's Encrypt |
-| Certificate | Certificate uploaded to KeyVault downloaded from Let's Encrypt (Download the ISRG Root X1 certificate in PEM format) |
-
+| Certificate | Name of the certificate uploaded to the key vault, downloaded from Let's Encrypt (download the ISRG Root X1 certificate in PEM format). Note: you must upload the certificate as a secret using the Azure CLI. Refer to the **Key vault integration** section below. |
+
+> [!NOTE]
+> Depending on how your Confluent Cloud Kafka cluster is configured, you may need a certificate different from the standard one Confluent Cloud uses for server authentication. Confirm with the admin of the cluster which certificate to use.
+>

 ## Key vault integration
````
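For orientation, the SASL_SSL and API-key fields in the table above map onto standard Kafka client properties. The following is purely an illustrative config for a generic Kafka client (the bootstrap address, key, and secret are placeholders), not a file ASA itself asks you to write:

```shell
# Write an illustrative Kafka client config for a Confluent Cloud cluster.
# All values are placeholders; ASA collects the same settings via its UI.
cat > client.properties <<'EOF'
bootstrap.servers=your-cluster.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.username=YOUR_API_KEY
sasl.password=YOUR_API_SECRET
EOF

cat client.properties
```

The username/password pair corresponds to the Confluent API key and secret in the table.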

````diff
@@ -80,7 +88,8 @@ Certificates are stored as secrets in the key vault and must be in PEM format.
 ### Configure Key vault with permissions

 You can create a key vault resource by following the documentation [Quickstart: Create a key vault using the Azure portal](../key-vault/general/quick-create-portal.md)
-To be able to upload certificates, you must have "**Key Vault Administrator**" access to your Key vault. Follow the following to grant admin access.
+To upload certificates, you must have "**Key Vault Administrator**" access to your key vault.
+Follow these steps to grant admin access:

 > [!NOTE]
 > You must have "**Owner**" permissions to grant other key vault permissions.
````
````diff
@@ -103,24 +112,27 @@ To be able to upload certificates, you must have "**Key Vault Administrator**"
 > [!IMPORTANT]
 > You must have "**Key Vault Administrator**" permissions on your key vault for this command to work properly.
 > You must upload the certificate as a secret. You must use the Azure CLI to upload certificates as secrets to your key vault.
-> Your Azure Stream Analytics job will fail when the certificate used for authentication expires. To resolve this, you must update/replace the certificate in your key vault and restart your Azure Stream Analytics job
+> Your Azure Stream Analytics job will fail when the certificate used for authentication expires. To resolve this, you must update/replace the certificate in your key vault and restart your Azure Stream Analytics job.

+Make sure you have the Azure CLI configured locally with PowerShell.
 You can visit this page to get guidance on setting up the Azure CLI: [Get started with Azure CLI](https://learn.microsoft.com/cli/azure/get-started-with-azure-cli#how-to-sign-into-the-azure-cli)
-The following command can upload the certificate as a secret to your key vault. You must have "**Key Vault Administrator**" permissions access to your Key vault for this command to work properly.

 **Login to Azure CLI:**
-```azurecli-interactive
+```PowerShell
 az login
 ```

 **Connect to your subscription containing your key vault:**
-```azurecli-interactive
+```PowerShell
 az account set --subscription <subscription name>
 ```

 **The following command can upload the certificate as a secret to your key vault:**
-```azurecli-interactive
-az keyvault secret set --vault-name <your key vault> --name <name of the secret> --file <file path to secret>
+
+The `<your key vault>` is the name of the key vault you want to upload the certificate to. `<name of the secret>` is any name you want to give your secret and how it shows up in the key vault. Note the name; you will use it to configure your Kafka input in your ASA job. `<file path to certificate>` is the path to where you downloaded your certificate.
+
+```PowerShell
+az keyvault secret set --vault-name <your key vault> --name <name of the secret> --file <file path to certificate>
 ```
````
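Taken together, the upload sequence from this commit amounts to three `az` invocations. The sketch below prints them as a dry run rather than executing them, since they need an authenticated Azure session; every name is a placeholder to substitute:

```shell
# Placeholders -- substitute your subscription, vault, secret name, and path.
SUBSCRIPTION="your-subscription-name"
VAULT="your-key-vault"
SECRET_NAME="confluent-ca-cert"   # hypothetical; reuse this name in the ASA input config
CERT_FILE="isrgrootx1.pem"

# Dry run: print each az command instead of running it.
echo "az login"
echo "az account set --subscription $SUBSCRIPTION"
echo "az keyvault secret set --vault-name $VAULT --name $SECRET_NAME --file $CERT_FILE"
```

Remove the `echo` wrappers to run the commands for real; the secret name you choose here is the value the ASA job's Certificate field refers to.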