
Commit a3e06d4

DEVX-1524: Adds a protobuf example configuration (#61)
1 parent bc9e00a

2 files changed: +26 −2

README.md (7 additions, 2 deletions)
@@ -116,7 +116,7 @@ confluent-hub install target/components/packages/confluentinc-kafka-connect-data
 
 See all Kafka Connect [configuration parameters](https://docs.confluent.io/current/connect/managing/configuring.html).
 
-## Connector-specific Parameters
+## kafka-connect-datagen Specific Parameters
 
 Parameter | Description | Default
 -|-|-
@@ -131,6 +131,12 @@ Parameter | Description | Default
 
 See the [config](https://github.com/confluentinc/kafka-connect-datagen/tree/master/config) folder for sample configurations.
 
+## Supported data formats
+
+Kafka Connect supports [Converters](https://docs.confluent.io/current/connect/userguide.html#connect-configuring-converters), which convert record key and value formats when reading from and writing to Kafka. As of the 5.5 release, Confluent Platform packages Avro, JSON, and Protobuf converters (earlier versions package just the Avro converter).
+
+For an example of using the Protobuf converter with kafka-connect-datagen, see this [example configuration](config/connector_users_protobuf.config). Note the required use of the `SetSchemaMetadata` [Transformation](https://docs.confluent.io/current/connect/transforms/index.html), which addresses a compatibility issue between the schema names used by kafka-connect-datagen and Protobuf. See the [Schema names are not compatible with Protobuf issue](https://github.com/confluentinc/kafka-connect-datagen/issues/62) for details.
+
 ## Use bundled schema specifications
 
 There are a few quickstart schema specifications bundled with `kafka-connect-datagen`; they are listed in this [directory](https://github.com/confluentinc/kafka-connect-datagen/tree/master/src/main/resources).
@@ -254,4 +260,3 @@ to override the CP Version and the Operator version, which may happen if Operato
 CP_VERSION=5.5.0 OPERATOR_VERSION=1 KAFKA_CONNECT_DATAGEN_VERSION=0.1.4 make push-cp-server-connect-operator-from-released
 ```
 which would result in a docker image tagged as `cp-server-connect-operator-datagen:0.1.4-5.5.0.1` and pushed to DockerHub
-
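
The converters described in the new "Supported data formats" section can be observed end to end: once the example connector below is running, the topic can be read back with the Protobuf console consumer that ships with Confluent Platform 5.5+. A minimal sketch, assuming a local broker and Schema Registry on their default ports (these addresses are assumptions, not part of this commit):

```sh
# Assumes a broker at localhost:9092 and Schema Registry at localhost:8081.
kafka-protobuf-console-consumer --bootstrap-server localhost:9092 \
  --topic users \
  --from-beginning \
  --property schema.registry.url=http://localhost:8081
```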
config/connector_users_protobuf.config (19 additions, 0 deletions)
@@ -0,0 +1,19 @@
+{
+  "name": "datagen-protobuf-users",
+  "config": {
+    "connector.class": "io.confluent.kafka.connect.datagen.DatagenConnector",
+    "kafka.topic": "users",
+    "quickstart": "users",
+    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
+    "value.converter": "io.confluent.connect.protobuf.ProtobufConverter",
+    "value.converter.schemas.enable": "false",
+    "value.converter.schema.registry.url": "http://localhost:8081",
+    "producer.interceptor.classes": "io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor",
+    "max.interval": 1000,
+    "iterations": 10000000,
+    "tasks.max": "1",
+    "transforms": "SetSchemaMetadata",
+    "transforms.SetSchemaMetadata.type": "org.apache.kafka.connect.transforms.SetSchemaMetadata$Value",
+    "transforms.SetSchemaMetadata.schema.name": "users"
+  }
+}
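
One way to try the new example configuration, sketched under the assumption of a Connect worker on the default REST port 8083 (the worker address is an assumption, not part of this commit): submit the file to the worker's REST API, then check the connector status.

```sh
# Assumes a Connect worker at localhost:8083; the config file path is the
# one added in this commit.
curl -s -X POST -H "Content-Type: application/json" \
  --data @config/connector_users_protobuf.config \
  http://localhost:8083/connectors

# Verify that the connector and its task report RUNNING.
curl -s http://localhost:8083/connectors/datagen-protobuf-users/status
```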
