
Commit cee9940

Switch to Confluent Platform via Docker
Removed the local Confluent Platform install and command-line instructions.
1 parent 4f99d29 commit cee9940


8 files changed (+268, -109 lines changed)


CONTRIBUTING.md

Lines changed: 55 additions & 64 deletions
@@ -1,5 +1,5 @@
 This guide describes how to develop and contribute pull requests to this connector. The focus is currently on how to
-develop and test the connector, either via a local install of Confluent Platform or of the regular Kafka distribution.
+develop and test the connector, either via a Docker cluster install of Confluent Platform or of the regular Kafka distribution.
 
 Before beginning, you will need to install Java (either version 8, 11, or 17) and also have a MarkLogic instance
 available. It is recommended to use 11 or 17, as Confluent has deprecated Java 8 support in Confluent 7.x and is
@@ -12,7 +12,7 @@ points to your Java installation.
 # Running the test suite
 
 The test suite for the MarkLogic Kafka connector, found at `src/test/resources`, requires that an application first be
-deployed to a MarkLogic instance. This application is deployed via Docker and [ml-gradle](https://github.com/marklogic-community/ml-gradle).
+deployed to a MarkLogic instance. This application is deployed via Docker and [ml-gradle](https://github.com/marklogic-community/ml-gradle).
 
 Note that you do not need to install [Gradle](https://gradle.org/) - the `gradlew` program used below will install the
 appropriate version of Gradle if you do not have it installed already.
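
For reference, a minimal sketch of the workflow this hunk describes: bring up the Docker container for the test MarkLogic instance, deploy the test application, then run the tests. The project-root `docker-compose.yml` (also touched in this commit) and the `mlDeploy` task appear elsewhere in this diff; running the suite via the standard Gradle `test` task is an assumption, not something stated in this hunk.

```
# Start the MarkLogic container defined in the project-root docker-compose.yml
docker-compose up -d

# Deploy the test application via ml-gradle, then run the test suite
./gradlew -i mlDeploy
./gradlew test
```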
@@ -82,74 +82,60 @@ web application.
 
 To try out the MarkLogic Kafka connector via the Confluent Platform, follow the steps below.
 
-## Install Confluent Platform with the MarkLogic Kafka connector
+## Build a Confluent Platform cluster with the MarkLogic Kafka connector
 
-First, [install the Confluent Platform](https://docs.confluent.io/platform/current/installation/installing_cp/zip-tar.html)
-(the "Docker" option has not yet been tested). It is recommended to configure `CONFLUENT_HOME` as described at that
-page, as that simplifies running the `confluent` commands below.
+### Build the Confluent Platform cluster via Docker
 
-To verify that you have Confluent Platform installed successfully, run the following:
-
-    confluent local services status
-
-This should show that each component of Confluent Platform is not running; you should see something like the following
-displayed:
+**Note** - This installs a separate Docker cluster and is not the same as the Docker cluster described in the test
+section at the top of this document.
 
+Use the docker-compose file in "src/test/confluent-platform-example" to build the Confluent Platform Docker cluster
+with the command ```docker-compose -f src/test/confluent-platform-example/docker-compose.yml up -d --build```.
+This file is based on the Confluent files and instructions at
+[Install a Confluent Platform cluster in Docker using a Confluent docker-compose file](https://docs.confluent.io/platform/current/platform-quickstart.html).
+When the setup is complete, you should be able to run
+```docker-compose -f src/test/confluent-platform-example/docker-compose.yml ps``` and see the following results.
 ```
-Connect is [DOWN]
-Control Center is [DOWN]
-Kafka is [DOWN]
-Kafka REST is [DOWN]
-ksqlDB Server is [DOWN]
-Schema Registry is [DOWN]
-ZooKeeper is [DOWN]
+Name              Command                          State  Ports
+--------------------------------------------------------------------------------------------------------------
+broker            /etc/confluent/docker/run        Up     0.0.0.0:9092->9092/tcp,:::9092->9092/tcp,
+                                                          0.0.0.0:9101->9101/tcp,:::9101->9101/tcp
+connect           /etc/confluent/docker/run        Up     0.0.0.0:8083->8083/tcp,:::8083->8083/tcp, 9092/tcp
+control-center    /etc/confluent/docker/run        Up     0.0.0.0:9021->9021/tcp,:::9021->9021/tcp
+ksql-datagen      bash -c echo Waiting for K ...   Up
+ksqldb-cli        /bin/sh                          Up
+ksqldb-server     /etc/confluent/docker/run        Up     0.0.0.0:8088->8088/tcp,:::8088->8088/tcp
+rest-proxy        /etc/confluent/docker/run        Up     0.0.0.0:8082->8082/tcp,:::8082->8082/tcp
+schema-registry   /etc/confluent/docker/run        Up     0.0.0.0:8081->8081/tcp,:::8081->8081/tcp
 ```
 
-The Kafka [Datagen Source Connector](https://www.confluent.io/hub/confluentinc/kafka-connect-datagen) is a convenient
-tool for local development and testing. Install it via the following:
-
-    confluent-hub install confluentinc/kafka-connect-datagen:0.6.0
-
-Then build and install the connector to the Confluent Platform indicated by your $CONFLUENT_HOME environment variable:
-
-    ./gradlew clean installConnectorInConfluent
-
-Note that any time you modify the MarkLogic Kafka connector code, you'll need to repeat the
-`./gradlew clean installConnectorInConfluent` step. Note that `clean` is included to ensure that in case you've changed
-any connector dependencies, old dependencies will not be included in the connector archive.
-
-Next, start Confluent:
+You can now visit http://localhost:9021 to access [Confluent's Control Center](https://docs.confluent.io/platform/current/control-center/index.html) application.
 
-    confluent local services start
-
-To verify that your Confluent installation is running properly, you can run `confluent local services status` and
-see logging similar to this:
-
-```
-Connect is [UP]
-Control Center is [UP]
-Kafka is [UP]
-Kafka REST is [UP]
-ksqlDB Server is [UP]
-Schema Registry is [UP]
-ZooKeeper is [UP]
-```
+Within Control Center, click on "controlcenter.cluster" to access the configuration for the Kafka cluster.
 
-You can now visit http://localhost:9021 to access [Confluent's Control Center](https://docs.confluent.io/platform/current/control-center/index.html)
-application.
+### Build and install the MarkLogic Kafka Connector
+1. Build the connector archive and install it with ```./gradlew installConnectorInConfluent```.
+2. Restart the "connect" server in the Docker "confluent-platform-example" cluster (see the example below).
+3. Verify the connector has loaded properly.
+   1. Click on "Connect" in the left sidebar.
+   2. Click on the "connect-default" cluster.
+   3. Click on the "+ Add connector" tile.
+   4. The "Browse" screen should show several tiles, including "MarkLogicSinkConnector" and "MarkLogicSourceConnector".
 
-Within Control Center, click on "controlcenter.cluster" to access the configuration for the Kafka cluster.
 
+### Install the test application on the MarkLogic server in the Docker cluster
+In the project root directory, run ```./gradlew -i mlDeploy```.
 
 ## Load a Datagen connector instance
 
-To test out the MarkLogic Kafka connector, you should first load an instance of the [Kafka Datagen connector]
-(https://github.com/confluentinc/kafka-connect-datagen). The Datagen connector is a Kafka source connector that can
-generate test data which can then be fed to the MarkLogic Kafka connector. The following Gradle command will automate
-loading an instance of the Datagen connector that will write JSON messages to a `purchases` topic every second:
+### Via Gradle
+```./gradlew -i loadDatagenPurchasesConnector```
 
-    ./gradlew loadDatagenPurchasesConnector
+### Via curl
+```curl -X POST -H "Content-Type: application/json" --data @src/test/resources/confluent/datagen-purchases-source.json http://localhost:8083/connectors```
 
+### Verifying the new connector instance
 In the Control Center GUI, you can verify the Datagen connector instance:
 
 1. Click on "Connect" in the left sidebar
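
To complement step 2 of "Build and install the MarkLogic Kafka Connector" above, here is a minimal sketch of restarting the Connect container and confirming the connector was picked up. It assumes the docker-compose file path shown in this hunk and the standard Kafka Connect REST API on port 8083; it is illustrative and not part of the commit.

```
# Restart the Connect container so it picks up the newly copied connector archive
docker-compose -f src/test/confluent-platform-example/docker-compose.yml restart connect

# Once Connect is back up, the MarkLogic connector classes should appear in the plugin
# list, and any connector instances loaded so far can be listed
curl -s http://localhost:8083/connector-plugins
curl -s http://localhost:8083/connectors
```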
@@ -164,12 +150,13 @@ Additionally, you can examine the data sent by the Datagen connector to the `pur
 
 ## Load a MarkLogic Kafka sink connector instance
 
-Next, load an instance of the MarkLogic Kafka connector that will read data from the `purchases` topic and write
-it to MarkLogic. The `src/test/resources/confluent/marklogic-purchases-sink.json` file defines the connection
-properties for MarkLogic. You can adjust this file to suit your testing needs.
+### Via Gradle
+```./gradlew -i loadMarkLogicPurchasesSinkConnector```
 
-    ./gradlew loadMarkLogicPurchasesSinkConnector
+### Via curl
+```curl -X POST -H "Content-Type: application/json" --data @src/test/resources/confluent/marklogic-purchases-sink.json http://localhost:8083/connectors```
 
+### Verifying the new connector instance
 In the Control Center GUI, you can verify the MarkLogic Kafka connector instance:
 
 1. Click on "Connect" in the left sidebar
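
As a sketch only: once the sink connector has been loaded via Gradle or curl, its state can also be checked directly against the Kafka Connect REST API. The instance name below is assumed to match the "name" field in `src/test/resources/confluent/marklogic-purchases-sink.json`.

```
# Check the status of the sink connector instance (name assumed from the JSON config)
curl -s http://localhost:8083/connectors/marklogic-purchases-sink/status
```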
@@ -179,6 +166,7 @@ In the Control Center GUI, you can verify the MarkLogic Kafka connector instance
 You can then verify that data is being written to MarkLogic by using MarkLogic's qconsole application to inspect the
 contents of the `kafka-test-content` database.
 
+### Via the web application
 You can also manually configure an instance of the sink connector:
 
 1. Click on "Connect" in the left sidebar
@@ -196,14 +184,16 @@ In the list of connectors in Control Center, the connector will initially have a
 After it starts successfully, it will have a status of "Running".
 
 ## Load a MarkLogic Kafka source connector instance
+You can also load an instance of the MarkLogic Kafka source connector that will read rows from the `demo/purchases`
+view that is created via the TDE template at `src/test/ml-schemas/tde/purchases.json`.
 
-You can also load an instance of the MarkLogic Kafka source connector that will read rows from the `demo/purchases`
-view that is created via the TDE template at `src/test/ml-schemas/tde/purchases.json`.
-The `src/test/reosurces/confluent/marklogic-purchases-source.json` file defines the connection properties for MarkLogic.
-You can adjust this file to suit your testing needs.
+### Via Gradle
+```./gradlew -i loadMarkLogicPurchasesSourceConnector```
 
-    ./gradlew loadMarkLogicPurchasesSourceConnector
+### Via curl
+```curl -X POST -H "Content-Type: application/json" --data @src/test/resources/confluent/marklogic-purchases-source.json http://localhost:8083/connectors```
 
+### Verifying the new connector instance
 In the Control Center GUI, you can verify the MarkLogic Kafka connector instance:
 
 1. Click on "Connect" in the left sidebar
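
One practical note, sketched under the assumption that each connector instance name matches its JSON config file: the curl commands above create a new connector instance, so re-running them for a name that already exists will be rejected by Connect. Deleting the old instance first is one way to re-load a changed configuration.

```
# Remove an existing connector instance before re-posting its configuration
curl -s -X DELETE http://localhost:8083/connectors/marklogic-purchases-source

# Then re-run the POST (or the corresponding Gradle task) to load the new configuration
./gradlew -i loadMarkLogicPurchasesSourceConnector
```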
@@ -213,6 +203,7 @@ In the Control Center GUI, you can verify the MarkLogic Kafka connector instance
 You can verify that data is being read from the `demo/purchases` view and sent to the `marklogic-purchases` topic
 by clicking on "Topics" in Confluent Platform and then selecting "marklogic-purchases".
 
+### Via the web application
 You can also manually configure an instance of the source connector:
 
 1. Click on "Connect" in the left sidebar
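
Beyond the Control Center "Topics" page, the REST Proxy exposed on port 8082 in the `docker-compose ps` output above offers a quick way to confirm that the topics exist. This is an illustrative check, not part of the commit.

```
# List topics via the Confluent REST Proxy; the purchases and marklogic-purchases
# topics should appear once the Datagen and source connectors are running
curl -s http://localhost:8082/topics
```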

build.gradle

Lines changed: 16 additions & 41 deletions
@@ -214,73 +214,48 @@ task connectorArchive(type: Zip, dependsOn: connectorArchive_BuildDirectory, gro
     destinationDirectory = file('build/distro')
 }
 
-task installConnectorInConfluent(type: Exec, group: confluentTestingGroup, dependsOn: [connectorArchive]) {
-    description = "Uses 'Confluent-hub' to install the connector in your local Confluent Platform"
-    commandLine "confluent-hub", "install", "--no-prompt", "build/distro/${baseArchiveName}.zip"
-    ignoreExitValue = true
-}
-
-// See https://docs.confluent.io/confluent-cli/current/command-reference/local/confluent_local_destroy.html
-task destroyLocalConfluent(type: Exec, group: confluentTestingGroup) {
-    description = "Destroy the local Confluent Platform instance"
-    commandLine "confluent", "local", "destroy"
-    // Main reason this will fail is because Confluent is not running, which shouldn't cause a failure
-    ignoreExitValue = true
-}
+// Tasks for working with Confluent Platform running locally.
+// See "Testing with Confluent Platform" in CONTRIBUTING.md
 
-// See https://docs.confluent.io/confluent-cli/current/command-reference/local/services/confluent_local_services_start.html
-task startLocalConfluent(type: Exec, group: confluentTestingGroup) {
-    description = "Convenience task for starting a local instance of Confluent Platform"
-    commandLine "confluent", "local", "services", "start"
+task installConnectorInConfluent(type: Copy, dependsOn: connectorArchive, group: confluentTestingGroup) {
+    description = "Copies the connector's archive directory to the Docker volume shared with the Connect server"
+    from "build/connectorArchive"
+    into "src/test/confluent-platform-example/docker/confluent-marklogic-components"
 }
 
 task loadDatagenPurchasesConnector(type: Exec, group: confluentTestingGroup) {
     description = "Load an instance of the Datagen connector into Confluent Platform for sending JSON documents to " +
         "the 'purchases' topic"
-    commandLine "confluent", "local", "services", "connect", "connector", "load", "datagen-purchases-source", "-c",
-        "src/test/resources/confluent/datagen-purchases-source.json"
+    commandLine "curl", "-s", "-X", "POST", "-H", "Content-Type: application/json",
+        "--data", "@src/test/resources/confluent/datagen-purchases-source.json", "http://localhost:8083/connectors"
 }
 
 task loadMarkLogicPurchasesSinkConnector(type: Exec, group: confluentTestingGroup) {
     description = "Load an instance of the MarkLogic Kafka connector into Confluent Platform for writing data to " +
         "MarkLogic from the 'purchases' topic"
-    commandLine "confluent", "local", "services", "connect", "connector", "load", "marklogic-purchases-sink", "-c",
-        "src/test/resources/confluent/marklogic-purchases-sink.json"
+    commandLine "curl", "-s", "-X", "POST", "-H", "Content-Type: application/json",
+        "--data", "@src/test/resources/confluent/marklogic-purchases-sink.json", "http://localhost:8083/connectors"
 }
 
 task loadMarkLogicPurchasesSourceConnector(type: Exec, group: confluentTestingGroup) {
     description = "Load an instance of the MarkLogic Kafka connector into Confluent Platform for reading rows from " +
         "the demo/purchases view"
-    commandLine "confluent", "local", "services", "connect", "connector", "load", "marklogic-purchases-source", "-c",
-        "src/test/resources/confluent/marklogic-purchases-source.json"
+    commandLine "curl", "-s", "-X", "POST", "-H", "Content-Type: application/json",
+        "--data", "@src/test/resources/confluent/marklogic-purchases-source.json", "http://localhost:8083/connectors"
 }
 
 task loadMarkLogicAuthorsSourceConnector(type: Exec, group: confluentTestingGroup) {
     description = "Loads a source connector that retrieves authors from the citations.xml file, which is also used for " +
         "all the automated tests"
-    commandLine "confluent", "local", "services", "connect", "connector", "load", "marklogic-authors-source", "-c",
-        "src/test/resources/confluent/marklogic-authors-source.json"
+    commandLine "curl", "-s", "-X", "POST", "-H", "Content-Type: application/json",
+        "--data", "@src/test/resources/confluent/marklogic-authors-source.json", "http://localhost:8083/connectors"
 }
 
 task loadMarkLogicEmployeesSourceConnector(type: Exec, group: confluentTestingGroup) {
-    commandLine "confluent", "local", "services", "connect", "connector", "load", "marklogic-employees-source", "-c",
-        "src/test/resources/confluent/marklogic-employees-source.json"
+    commandLine "curl", "-s", "-X", "POST", "-H", "Content-Type: application/json",
+        "--data", "@src/test/resources/confluent/marklogic-employees-source.json", "http://localhost:8083/connectors"
 }
 
-task setupLocalConfluent(group: confluentTestingGroup) {
-    description = "Start a local Confluent Platform instance and load the Datagen and MarkLogic connectors"
-}
-
-// Temporarily only loading the source connector to make manual testing easier, will re-enable all of these before 1.8.0
-//setupLocalConfluent.dependsOn startLocalConfluent, loadDatagenPurchasesConnector, loadMarkLogicPurchasesSinkConnector, loadMarkLogicPurchasesSourceConnector
-setupLocalConfluent.dependsOn startLocalConfluent, loadMarkLogicEmployeesSourceConnector
-
-loadDatagenPurchasesConnector.mustRunAfter startLocalConfluent
-loadMarkLogicPurchasesSinkConnector.mustRunAfter startLocalConfluent
-loadMarkLogicPurchasesSourceConnector.mustRunAfter startLocalConfluent
-loadMarkLogicAuthorsSourceConnector.mustRunAfter startLocalConfluent
-loadMarkLogicEmployeesSourceConnector.mustRunAfter startLocalConfluent
-
 task insertAuthors(type: Test) {
     useJUnitPlatform()
     systemProperty "AUTHOR_IDS", authorIds
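
Since the `setupLocalConfluent` aggregate task and its `mustRunAfter` ordering are removed in this hunk, the remaining load tasks are now independent. As a sketch only (this invocation is not defined anywhere in the build), they can simply be run together in a single Gradle command once the Docker cluster and Connect are up.

```
# Load the Datagen source plus the MarkLogic sink and source connector instances in one go
./gradlew loadDatagenPurchasesConnector loadMarkLogicPurchasesSinkConnector loadMarkLogicPurchasesSourceConnector
```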

docker-compose.yml

Lines changed: 1 addition & 0 deletions
@@ -15,6 +15,7 @@ services:
       - ./docker/marklogic/logs:/var/opt/MarkLogic/Logs
     ports:
       - "8000-8002:8000-8002"
+      - "8010-8013:8010-8013"
       - "8018-8019:8018-8019"
 
   # Copied from https://docs.sonarsource.com/sonarqube/latest/setup-and-upgrade/install-the-server/#example-docker-compose-configuration .
