Commit 92b4e87

Update kafka-output.md
1 parent 3630ad4 commit 92b4e87

File tree

1 file changed: +6 −6 lines changed


articles/stream-analytics/kafka-output.md

Lines changed: 6 additions & 6 deletions
@@ -14,7 +14,7 @@ Azure Stream Analytics allows you to connect directly to Kafka clusters as a pro
 
 ### Kafka Event Compression
 
-Supported compression types are None, Gzip, Snappy, LZ4 and Zstd.
+Supported compression types are None, Gzip, Snappy, LZ4, and Zstd.
 
 ## Authentication and Encryption
 
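The compression property accepts a small closed set of values. As a minimal sketch (the helper name and normalization behavior are invented for illustration; this is not part of the Azure Stream Analytics or Kafka client APIs), a configured compression type could be checked against the list in the changed line like this:

```python
# Hypothetical validation helper: mirrors the closed set of compression
# types named in the doc change. Not a real Stream Analytics API.
SUPPORTED_COMPRESSION = {"none", "gzip", "snappy", "lz4", "zstd"}

def validate_compression(value: str) -> str:
    """Normalize a compression type and reject unsupported values."""
    normalized = value.strip().lower()
    if normalized not in SUPPORTED_COMPRESSION:
        raise ValueError(
            f"unsupported compression type {value!r}; "
            f"expected one of {sorted(SUPPORTED_COMPRESSION)}"
        )
    return normalized

print(validate_compression("Gzip"))  # prints "gzip"
```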
@@ -44,7 +44,7 @@ Azure Stream Analytics integrates seamlessly with Azure Key Vault to access stor
 You can store the certificates as Key Vault certificates or Key Vault secrets. Private keys are in PEM format.
 
 ### Key Vault Integration
-When configuring your Azure Stream Analytics job to connect to your Kafka clusters, depending on your configuration, you may have to configure your job to be able to access your Kafka clusters, which are behind a firewall or inside a virtual network. You can visit the Azure Stream Analytics VNET documentation to learn more about configuring private endpoints to access resources that are inside a virtual network or behind a firewall.
+When configuring your Azure Stream Analytics job to connect to your Kafka clusters, depending on your configuration, you may have to configure your job to access your Kafka clusters, which are behind a firewall or inside a virtual network. You can visit the Azure Stream Analytics VNET documentation to learn more about configuring private endpoints to access resources inside a virtual network or behind a firewall.
 
 
 ### Configuration
@@ -53,20 +53,20 @@ The following table lists the property names and their description for creating
 | Property name                | Description |
 |------------------------------|-------------------------------------------------------------------------------------------------------------------------|
 | Input/Output Alias           | A friendly name used in queries to reference your input or output |
-| Bootstrap server addresses   | A list of host/port pairs to use for establishing the connection to the Kafka cluster. |
+| Bootstrap server addresses   | A list of host/port pairs to establish the connection to the Kafka cluster. |
 | Kafka topic                  | A unit of your Kafka cluster you want to write events to. |
-| Security Protocol            | How you want to connect to your Kafka cluster. Azure Stream Analytics supports: mTLS, SASL_SSL, SASL_PLAINTEXT or None. |
+| Security Protocol            | How you want to connect to your Kafka cluster. Azure Stream Analytics supports mTLS, SASL_SSL, SASL_PLAINTEXT or None. |
 | Event Serialization format   | The serialization format (JSON, CSV, Avro) of the outgoing data stream. |
 | Partition key                | Azure Stream Analytics assigns partitions using round partitioning. |
-| Kafka event compression type | The compression type used for outgoing data stream, such as Gzip, Snappy, Lz4, Zstd or None. |
+| Kafka event compression type | The compression type used for outgoing data streams, such as Gzip, Snappy, Lz4, Zstd, or None. |
 
 ### Limitations
 * When configuring your Azure Stream Analytics jobs to use VNET/SWIFT, your job must be configured with at least six (6) streaming units.
 * When using mTLS or SASL_SSL with Azure Key Vault, you must convert your Java Key Store to PEM format.
 * The minimum version of Kafka you can configure Azure Stream Analytics to connect to is version 0.10.
 
 > [!NOTE]
-> For direct on using the Azure Stream Analytics kafka adapter, please reach out to [[email protected]](mailto:[email protected]).
+> For direct help with using the Azure Stream Analytics Kafka adapter, please reach out to [[email protected]](mailto:[email protected]).
 >
 
 ## Next steps
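The property table in this hunk reads like a small schema. As a hedged sketch (the function, key names, and defaults below are invented for illustration and do not correspond to a real Azure Stream Analytics payload or client), a Kafka output description could be assembled and sanity-checked against the table's closed value sets:

```python
# Illustrative only: the value sets mirror the table in the diff, but
# this dict layout is hypothetical, not an actual Stream Analytics format.
SECURITY_PROTOCOLS = {"mTLS", "SASL_SSL", "SASL_PLAINTEXT", "None"}
COMPRESSION_TYPES = {"None", "Gzip", "Snappy", "Lz4", "Zstd"}
SERIALIZATION_FORMATS = {"JSON", "CSV", "Avro"}

def build_kafka_output(alias, bootstrap_servers, topic,
                       security_protocol="None",
                       serialization="JSON",
                       compression="None"):
    """Assemble a Kafka output description, validating each closed value set."""
    if security_protocol not in SECURITY_PROTOCOLS:
        raise ValueError(f"security protocol must be one of {sorted(SECURITY_PROTOCOLS)}")
    if serialization not in SERIALIZATION_FORMATS:
        raise ValueError(f"serialization must be one of {sorted(SERIALIZATION_FORMATS)}")
    if compression not in COMPRESSION_TYPES:
        raise ValueError(f"compression must be one of {sorted(COMPRESSION_TYPES)}")
    if not bootstrap_servers:
        raise ValueError("at least one bootstrap server host:port pair is required")
    return {
        "alias": alias,
        "bootstrapServers": list(bootstrap_servers),
        "topic": topic,
        "securityProtocol": security_protocol,
        "serializationFormat": serialization,
        "compressionType": compression,
    }

cfg = build_kafka_output(
    "kafka-out",
    ["broker1:9092", "broker2:9092"],
    "telemetry",
    security_protocol="SASL_SSL",
    compression="Gzip",
)
print(cfg["securityProtocol"])  # prints "SASL_SSL"
```

Note that the partition key is absent from the sketch's required arguments: per the table, Azure Stream Analytics assigns partitions itself using round-robin partitioning, so there is nothing for a caller to validate there.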
