Commit 23e0881

Updated the docker demo

Now uses the datagen connector to generate the data. Also adds the confluent-control-center so the kafka data can be examined there. KAFKA-36

1 parent bf70511

File tree

4 files changed: +183 -77 lines changed


docker/Dockerfile

Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@
+FROM confluentinc/cp-kafka-connect:5.2.2
+
+ENV CONNECT_PLUGIN_PATH="/usr/share/confluent-hub-components"
+
+RUN confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:latest

docker/README.md

Lines changed: 17 additions & 14 deletions

@@ -1,9 +1,9 @@
 # MongoDB & Kafka Docker end to end example
 
-A simple example that takes JSON documents from the `source` topic and stores them into the `test.sink` collection in MongoDB using
+A simple example that takes JSON documents from the `pageviews` topic and stores them into the `test.pageviews` collection in MongoDB using
 the MongoDB Kafka Sink Connector.
 
-The MongoDB Kafka Source Connector also publishes all change stream events from `test.sink` into the `mongo.test.sink` topic.
+The MongoDB Kafka Source Connector also publishes all change stream events from `test.pageviews` into the `mongo.test.pageviews` topic.
 
 ## Requirements
 - Docker 18.09+
@@ -16,26 +16,27 @@ To run the example: `./run.sh` which will:
 
 - Run `docker-compose up`
 - Wait for MongoDB, Kafka, Kafka Connect to be ready
+- Register the Confluent Datagen Connector
 - Register the MongoDB Kafka Sink Connector
 - Register the MongoDB Kafka Source Connector
-- Publish some events to Kafka
+- Publish some events to Kafka via the Datagen connector
 - Write the events to MongoDB
-- Write the change stream messages to Kafka
+- Write the change stream messages back into Kafka
 
 
-Once running, examine the topics in the Kafka UI: http://localhost:8000/
-- The `source` topic should contain the 10 simple documents added. Each similar to:<br>
+Once running, examine the topics in the Kafka control center: http://localhost:9021/
+- The `pageviews` topic should contain the 10 simple documents added. Each similar to:<br>
 ```json
-{"i": "0"}
+{"viewtime": {"$numberLong": "81"}, "pageid": "Page_1", "userid": "User_8"}
 ```
-- The `mongo.test.sink` should contain the 10 change events. Each similar to:<br>
+- The `mongo.test.pageviews` should contain the 10 change events. Each similar to:<br>
 ```json
-{"_id": {"_data": "<resumeToken>"},
- "operationType": "insert",
- "ns": {"db": "test", "coll": "sink"},
- "documentKey": {"_id": {"$oid": "5cc99f4893283d634cb3f59e"}},
- "clusterTime": {"$timestamp": {"t": 1556717385, "i": 1}},
- "fullDocument": {"_id": {"$oid": "5cc99f4893283d634cb3f59e"}, "i": "0"}}
+{"_id": {"_data": "<resumeToken>"},
+ "operationType": "insert",
+ "clusterTime": {"$timestamp": {"t": 1563461814, "i": 4}},
+ "fullDocument": {"_id": {"$oid": "5d3088b6bafa7829964150f3"}, "viewtime": {"$numberLong": "81"}, "pageid": "Page_1", "userid": "User_8"},
+ "ns": {"db": "test", "coll": "pageviews"},
+ "documentKey": {"_id": {"$oid": "5d3088b6bafa7829964150f3"}}}
 ```
 
 Examine the collections in MongoDB:
@@ -50,6 +51,8 @@ The following systems will be created:
 - Kafka
 - Confluent Schema Registry
 - Confluent Kafka Connect
+- Confluent Control Center
+- Confluent KSQL Server
 - Kafka Rest Proxy
 - Kafka Topics UI
 - MongoDB - a 3 node replicaset
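The change-stream events shown in the README diff above are MongoDB Extended JSON, so typed fields arrive as wrapper objects such as `{"$numberLong": "81"}` and `{"$oid": "..."}`. A consumer that only needs plain values can unwrap them without any driver; the sketch below does this with just the standard library (the `unwrap` helper is illustrative, not part of the demo or of any MongoDB library, and handles only the wrappers that appear in the sample event):

```python
import json

# Sample change-stream event as published to the mongo.test.pageviews topic
# (copied from the README example above; resume token elided in the source too).
EVENT = """
{"_id": {"_data": "<resumeToken>"},
 "operationType": "insert",
 "clusterTime": {"$timestamp": {"t": 1563461814, "i": 4}},
 "fullDocument": {"_id": {"$oid": "5d3088b6bafa7829964150f3"},
                  "viewtime": {"$numberLong": "81"},
                  "pageid": "Page_1", "userid": "User_8"},
 "ns": {"db": "test", "coll": "pageviews"},
 "documentKey": {"_id": {"$oid": "5d3088b6bafa7829964150f3"}}}
"""

def unwrap(value):
    """Recursively unwrap a few common Extended JSON type wrappers."""
    if isinstance(value, dict):
        if set(value) == {"$numberLong"}:
            return int(value["$numberLong"])
        if set(value) == {"$oid"}:
            return value["$oid"]  # keep the ObjectId as its hex string
        return {k: unwrap(v) for k, v in value.items()}
    if isinstance(value, list):
        return [unwrap(v) for v in value]
    return value

event = unwrap(json.loads(EVENT))
print(event["operationType"])                  # insert
print(event["ns"]["db"], event["ns"]["coll"])  # test pageviews
print(event["fullDocument"]["viewtime"])       # 81
```

A real consumer would run the same unwrapping over each record pulled from the `mongo.test.pageviews` topic; for full Extended JSON support, the `bson.json_util` module from the official driver is the usual choice.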

docker/docker-compose.yml

Lines changed: 111 additions & 29 deletions

@@ -1,7 +1,7 @@
-version: '3.5'
+version: '3.6'
 services:
   zookeeper:
-    image: confluentinc/cp-zookeeper:5.2.1
+    image: confluentinc/cp-zookeeper:5.2.2
     hostname: zookeeper
     container_name: zookeeper
     ports:
@@ -12,10 +12,10 @@ services:
       ZOOKEEPER_CLIENT_PORT: 2181
       ZOOKEEPER_TICK_TIME: 2000
 
-  kafka:
-    image: confluentinc/cp-kafka:5.2.1
-    hostname: kafka
-    container_name: kafka
+  broker:
+    image: confluentinc/cp-enterprise-kafka:5.2.2
+    hostname: broker
+    container_name: broker
     depends_on:
       - zookeeper
     ports:
@@ -25,17 +25,25 @@ services:
       - localnet
     environment:
       KAFKA_BROKER_ID: 1
-      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
       KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'
-      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092
+      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
+      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:29092,PLAINTEXT_HOST://localhost:9092
+      KAFKA_METRIC_REPORTERS: io.confluent.metrics.reporter.ConfluentMetricsReporter
+      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
+      KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0
+      CONFLUENT_METRICS_REPORTER_BOOTSTRAP_SERVERS: broker:29092
+      CONFLUENT_METRICS_REPORTER_ZOOKEEPER_CONNECT: zookeeper:2181
+      CONFLUENT_METRICS_REPORTER_TOPIC_REPLICAS: 1
+      CONFLUENT_METRICS_ENABLE: 'true'
+      CONFLUENT_SUPPORT_CUSTOMER_ID: 'anonymous'
 
   schema-registry:
-    image: confluentinc/cp-schema-registry:5.2.1
+    image: confluentinc/cp-schema-registry:5.2.2
     hostname: schema-registry
     container_name: schema-registry
     depends_on:
       - zookeeper
-      - kafka
+      - broker
     ports:
       - "8081:8081"
     networks:
@@ -45,54 +53,125 @@ services:
       SCHEMA_REGISTRY_KAFKASTORE_CONNECTION_URL: 'zookeeper:2181'
 
   connect:
-    image: confluentinc/cp-kafka-connect:5.2.1
-    hostname: "connect"
+    image: confluentinc/kafka-connect-datagen:latest
+    build:
+      context: .
+      dockerfile: Dockerfile
+    hostname: connect
+    container_name: connect
+    depends_on:
+      - zookeeper
+      - broker
+      - schema-registry
     ports:
-      - "18083:18083"
+      - "8083:8083"
     networks:
       - localnet
    environment:
-      CONNECT_BOOTSTRAP_SERVERS: "kafka:29092"
-      CONNECT_REST_PORT: 18083
+      CONNECT_BOOTSTRAP_SERVERS: 'broker:29092'
+      CONNECT_REST_ADVERTISED_HOST_NAME: connect
+      CONNECT_REST_PORT: 8083
       CONNECT_GROUP_ID: compose-connect-group
       CONNECT_CONFIG_STORAGE_TOPIC: docker-connect-configs
+      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: 1
+      CONNECT_OFFSET_FLUSH_INTERVAL_MS: 10000
       CONNECT_OFFSET_STORAGE_TOPIC: docker-connect-offsets
+      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: 1
       CONNECT_STATUS_STORAGE_TOPIC: docker-connect-status
+      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: 1
       CONNECT_KEY_CONVERTER: io.confluent.connect.avro.AvroConverter
       CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL: 'http://schema-registry:8081'
       CONNECT_VALUE_CONVERTER: io.confluent.connect.avro.AvroConverter
       CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: 'http://schema-registry:8081'
       CONNECT_INTERNAL_KEY_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
       CONNECT_INTERNAL_VALUE_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
-      CONNECT_REST_ADVERTISED_HOST_NAME: "connect"
       CONNECT_LOG4J_ROOT_LOGLEVEL: "INFO"
       CONNECT_LOG4J_LOGGERS: "org.apache.kafka.connect.runtime.rest=WARN,org.reflections=ERROR,com.mongodb.kafka=DEBUG"
-      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: "1"
-      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: "1"
-      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: "1"
-      CONNECT_PLUGIN_PATH: /usr/local/share/kafka/plugins
+      CONNECT_PLUGIN_PATH: /usr/share/confluent-hub-components
+      CONNECT_ZOOKEEPER_CONNECT: 'zookeeper:2181'
+      # Assumes image is based on confluentinc/kafka-connect-datagen:latest which is pulling 5.2.2 Connect image
+      CLASSPATH: /usr/share/java/monitoring-interceptors/monitoring-interceptors-5.2.2.jar
+      CONNECT_PRODUCER_INTERCEPTOR_CLASSES: "io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor"
+      CONNECT_CONSUMER_INTERCEPTOR_CLASSES: "io.confluent.monitoring.clients.interceptor.MonitoringConsumerInterceptor"
+    command: "bash -c 'if [ ! -d /usr/share/confluent-hub-components/confluentinc-kafka-connect-datagen ]; then echo \"WARNING: Did not find directory for kafka-connect-datagen (did you remember to run: docker-compose up -d --build ?)\"; fi ; /etc/confluent/docker/run'"
     volumes:
-      - ../build/confluent/kafka-connect-mongodb:/usr/local/share/kafka/plugins/
+      - ../build/confluent/kafka-connect-mongodb:/usr/share/confluent-hub-components/kafka-connect-mongodb
+
+  control-center:
+    image: confluentinc/cp-enterprise-control-center:5.2.2
+    hostname: control-center
+    container_name: control-center
     depends_on:
       - zookeeper
-      - kafka
+      - broker
       - schema-registry
+      - connect
+      - ksql-server
+    ports:
+      - "9021:9021"
+    networks:
+      - localnet
+    environment:
+      CONTROL_CENTER_BOOTSTRAP_SERVERS: 'broker:29092'
+      CONTROL_CENTER_ZOOKEEPER_CONNECT: 'zookeeper:2181'
+      CONTROL_CENTER_CONNECT_CLUSTER: 'connect:8083'
+      CONTROL_CENTER_KSQL_URL: "http://ksql-server:8088"
+      CONTROL_CENTER_KSQL_ADVERTISED_URL: "http://localhost:8088"
+      CONTROL_CENTER_SCHEMA_REGISTRY_URL: "http://schema-registry:8081"
+      CONTROL_CENTER_REPLICATION_FACTOR: 1
+      CONTROL_CENTER_INTERNAL_TOPICS_PARTITIONS: 1
+      CONTROL_CENTER_MONITORING_INTERCEPTOR_TOPIC_PARTITIONS: 1
+      CONFLUENT_METRICS_TOPIC_REPLICATION: 1
+      PORT: 9021
+
+  ksql-server:
+    image: confluentinc/cp-ksql-server:5.2.2
+    hostname: ksql-server
+    container_name: ksql-server
+    depends_on:
+      - broker
+      - connect
+    ports:
+      - "8088:8088"
+    networks:
+      - localnet
+    environment:
+      KSQL_CONFIG_DIR: "/etc/ksql"
+      KSQL_LOG4J_OPTS: "-Dlog4j.configuration=file:/etc/ksql/log4j-rolling.properties"
+      KSQL_BOOTSTRAP_SERVERS: "broker:29092"
+      KSQL_HOST_NAME: ksql-server
+      KSQL_APPLICATION_ID: "cp-all-in-one"
+      KSQL_LISTENERS: "http://0.0.0.0:8088"
+      KSQL_CACHE_MAX_BYTES_BUFFERING: 0
+      KSQL_KSQL_SCHEMA_REGISTRY_URL: "http://schema-registry:8081"
+      KSQL_PRODUCER_INTERCEPTOR_CLASSES: "io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor"
+      KSQL_CONSUMER_INTERCEPTOR_CLASSES: "io.confluent.monitoring.clients.interceptor.MonitoringConsumerInterceptor"
+
+  ksql-cli:
+    image: confluentinc/cp-ksql-cli:5.2.2
+    container_name: ksql-cli
+    depends_on:
+      - broker
+      - connect
+      - ksql-server
+    entrypoint: /bin/sh
+    tty: true
 
   rest-proxy:
-    image: confluentinc/cp-kafka-rest:5.2.1
+    image: confluentinc/cp-kafka-rest:5.2.2
     depends_on:
       - zookeeper
-      - kafka
+      - broker
       - schema-registry
     ports:
-      - "8082:8082"
-    networks:
-      - localnet
+      - 8082:8082
     hostname: rest-proxy
     container_name: rest-proxy
+    networks:
+      - localnet
     environment:
       KAFKA_REST_HOST_NAME: rest-proxy
-      KAFKA_REST_BOOTSTRAP_SERVERS: 'kafka:29092'
+      KAFKA_REST_BOOTSTRAP_SERVERS: 'broker:29092'
       KAFKA_REST_LISTENERS: "http://0.0.0.0:8082"
       KAFKA_REST_SCHEMA_REGISTRY_URL: 'http://schema-registry:8081'
 
@@ -108,13 +187,14 @@ services:
       PROXY: "true"
     depends_on:
       - zookeeper
-      - kafka
+      - broker
       - schema-registry
       - rest-proxy
 
   # MongoDB Replica Set
   mongo1:
     image: "mongo:4.0-xenial"
+    container_name: mongo1
     command: --replSet rs0 --smallfiles --oplogSize 128
     volumes:
       - rs1:/data/db
@@ -125,6 +205,7 @@ services:
     restart: always
   mongo2:
     image: "mongo:4.0-xenial"
+    container_name: mongo2
     command: --replSet rs0 --smallfiles --oplogSize 128
     volumes:
       - rs2:/data/db
@@ -135,6 +216,7 @@ services:
     restart: always
   mongo3:
     image: "mongo:4.0-xenial"
+    container_name: mongo3
    command: --replSet rs0 --smallfiles --oplogSize 128
     volumes:
       - rs3:/data/db

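A detail worth noting in the compose diff above: the broker now advertises two listeners, `PLAINTEXT://broker:29092` for other containers on the compose network and `PLAINTEXT_HOST://localhost:9092` for clients running on the host. The sketch below illustrates how such a `KAFKA_ADVERTISED_LISTENERS` value decomposes into the per-listener addresses Kafka hands back to clients (the `parse_listeners` helper is purely illustrative, not part of Kafka or the demo):

```python
# Value copied from the broker service in the compose diff above.
ADVERTISED = "PLAINTEXT://broker:29092,PLAINTEXT_HOST://localhost:9092"

def parse_listeners(value):
    """Parse 'NAME://host:port,...' into {NAME: (host, port)}."""
    out = {}
    for entry in value.split(","):
        name, address = entry.split("://", 1)
        host, port = address.rsplit(":", 1)
        out[name] = (host, int(port))
    return out

listeners = parse_listeners(ADVERTISED)
print(listeners["PLAINTEXT"])       # ('broker', 29092) -> used by other containers
print(listeners["PLAINTEXT_HOST"])  # ('localhost', 9092) -> used from the host
```

A client connects to one listener's address and is told that same address back, which is why a single advertised listener named after the container (`kafka:29092` in the old file) is unreachable from the host: the second listener fixes exactly that.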