Commit 43b9049

Merge pull request #221 from BillFarber/reorgDocker

MLE-24489 - Moving the docker file

2 parents: f853116 + b5416f4

File tree: 3 files changed (+27 lines, -28 lines)
File renamed without changes.

CONTRIBUTING.md

Lines changed: 23 additions & 24 deletions
````diff
@@ -4,8 +4,7 @@ distribution.
 
 ### Requirements:
 * MarkLogic Server 11+
-* Java version 17 is required to use the Gradle tools.
-Additionally, SonarQube requires the use of Java 17.
+* Java version 17
 
 See [the Confluent compatibility matrix](https://docs.confluent.io/platform/current/installation/versions-interoperability.html#java)
 for more information. After installing your desired version of Java, ensure that the `JAVA_HOME` environment variable
````
````diff
@@ -21,7 +20,7 @@ Note that you do not need to install [Gradle](https://gradle.org/) - the "gradle
 appropriate version of Gradle if you do not have it installed already.
 
 ## Virtual Server Preparation
-The project includes a docker-compose file that includes MarkLogic, SonarQube with a Postgres server, and Confluent
+The project includes a docker-compose file in the repository root that includes MarkLogic, SonarQube with a Postgres server, and Confluent
 Platform servers.
 
 ### Confluent Platform
````
````diff
@@ -33,14 +32,14 @@ The Confluent Platform servers in this docker-compose file are based on the Conf
 [Install a Confluent Platform cluster in Docker using a Confluent docker-compose file](https://docs.confluent.io/platform/current/platform-quickstart.html).
 
 ## Docker Cluster Preparation
-To setup the docker cluster, use the docker-compose file in the "test-app" directory to build the Docker cluster with
+To setup the docker cluster, use the docker-compose file in the repository root to build the Docker cluster with
 the command:
 ```
-docker-compose -f docker-compose.yml up -d --build
+docker-compose up -d --build
 ```
 When the setup is complete, you should be able to run
 ```
-docker-compose -f docker-compose.yml ps
+docker-compose ps
 ```
 and see results similar to the following.
 ```
````
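Since the docs tell contributors to eyeball the `docker-compose ps` output, that check can also be scripted. A minimal sketch in Python, assuming illustrative service names and images (the `marklogic` image shown here is a placeholder, not taken from this repository's compose file):

```python
# Hypothetical helper: scan `docker-compose ps` output and report any
# services whose row does not contain an "Up" status.
def services_not_running(ps_output: str) -> list[str]:
    """Return names of services that do not appear to be running."""
    problems = []
    for line in ps_output.strip().splitlines()[1:]:  # skip the header row
        fields = line.split()
        if fields and "Up" not in line:
            problems.append(fields[0])
    return problems

# Illustrative sample output, not captured from a real cluster.
sample = """NAME              IMAGE                                   STATUS
broker            confluentinc/cp-kafka:7.6.1             Up 2 minutes
marklogic         progressofficial/marklogic-db:latest    Up 2 minutes
schema-registry   confluentinc/cp-schema-registry:7.6.1   Exited (1)
"""

print(services_not_running(sample))  # → ['schema-registry']
```

This only string-matches on "Up", which is a rough heuristic; `docker-compose ps --format json` would be sturdier if your Compose version supports it.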
````diff
@@ -61,14 +60,14 @@ schema-registry confluentinc/cp-schema-registry:7.6.1
 You can now visit several web applications:
 * http://localhost:8000 to access the MarkLogic server.
 * http://localhost:9000 to use the SonarQube server as described in the "Running Sonar Code Analysis"
-section below.
+section below.
 * http://localhost:9021 to access
-[Confluent's Control Center GUI](https://docs.confluent.io/platform/current/control-center/index.html) application.
-Within Control Center, click on "controlcenter.cluster" to access the configuration for the Kafka cluster.
+[Confluent's Control Center GUI](https://docs.confluent.io/platform/current/control-center/index.html) application.
+Within Control Center, click on "controlcenter.cluster" to access the configuration for the Kafka cluster.
 
 ## MarkLogic Preparation
 To prepare the MarkLogic server for automated testing as well as testing with the Confluent Platform, the Data Hub based
-application must be deployed. From the "test-app" directory, follow these steps:
+application must be deployed. From the root directory, follow these steps:
 1. Run `./gradlew hubInit`
 2. Edit gradle-local.properties and set `mlUsername` and `mlPassword`
 3. Run `./gradlew -i mlDeploy`
````
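For step 2 above, a hypothetical `gradle-local.properties` fragment might look like this (the values are placeholders for a real MarkLogic user, not defaults from this repository):

```properties
# Placeholder credentials for the MarkLogic deployment steps.
mlUsername=admin
mlPassword=changeme-admin-password
```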
````diff
@@ -102,7 +101,7 @@ To configure the SonarQube service, perform the following steps:
 9. In the "Provide a token" panel, click on "Generate". Copy the token.
 10. Click the "Continue" button.
 11. Update `systemProp.sonar.token=<Replace With Your Sonar Token>` in `gradle-local.properties` in the root of your
-project.
+project.
 
 To run the SonarQube analysis, run the following Gradle task in the root directory, which will run all the tests with
 code coverage and then generate a quality report with SonarQube:
````
````diff
@@ -218,8 +217,8 @@ contents of the `data-hub-FINAL` database.
 
 ## Debugging the MarkLogic Kafka connector
 
-The main mechanism for debugging an instance of the MarkLogic Kafka connector is by examining logs from the
-connector. You can access those, along with logging from Kafka Connect and all other connectors, by running the
+The main mechanism for debugging an instance of the MarkLogic Kafka connector is by examining logs from the
+connector. You can access those, along with logging from Kafka Connect and all other connectors, by running the
 following:
 
     confluent local services connect log -f
````
````diff
@@ -228,43 +227,43 @@ See [the log command docs](https://docs.confluent.io/confluent-cli/current/comma
 for more information.
 
 You can also customize Confluent logging by [adjusting the log4j file for Kafka Connect](https://docs.confluent.io/platform/current/connect/logging.html#viewing-kconnect-logs).
-For example, to prevent some logging from Kafka Connect and from the Java Client DMSDK, add the following to the
+For example, to prevent some logging from Kafka Connect and from the Java Client DMSDK, add the following to the
 `$CONFLUENT_HOME/etc/kafka/connect-log4j.properties` file:
 
     log4j.logger.org.apache.kafka=WARN
     log4j.logger.com.marklogic.client.datamovement=WARN
 
 
 # Testing with basic Apache Kafka
-The primary reason to test the MarkLogic Kafka connector via a regular Kafka distribution is that the development
-cycle is much faster and more reliable - i.e. you can repeatedly redeploy the connector and restart Kafka Connect to
+The primary reason to test the MarkLogic Kafka connector via a regular Kafka distribution is that the development
+cycle is much faster and more reliable - i.e. you can repeatedly redeploy the connector and restart Kafka Connect to
 test changes, and Kafka Connect will continue to work fine. This is particularly useful when the changes you're testing
 do not require testing the GUI provided by Confluent Control Center.
 
-To get started, these instructions assume that you already have an instance of Apache Kafka installed; the
-[Kafka Quickstart](https://kafka.apache.org/quickstart) instructions provide an easy way of accomplishing this. Perform
+To get started, these instructions assume that you already have an instance of Apache Kafka installed; the
+[Kafka Quickstart](https://kafka.apache.org/quickstart) instructions provide an easy way of accomplishing this. Perform
 step 1 of these instructions before proceeding.
 
 Next, configure your Gradle properties to point to your Kafka installation and deploy the connector there:
 
 1. Configure `kafkaHome` in gradle-local.properties - e.g. `kafkaHome=/Users/myusername/kafka_2.13-2.8.1`
 2. Configure `kafkaMlUsername` and `kafkaMlPassword` in gradle-local.properties, setting these to a MarkLogic user that
-is able to write documents to MarkLogic. These values will be used to populate the
+is able to write documents to MarkLogic. These values will be used to populate the
 `ml.connection.username` and `ml.connection.password` connector properties.
 3. Run `./gradlew clean deploy` to build a jar and copy it and the config property files to your Kafka installation
 
 [Step 2 in the Kafka Quickstart guide](https://kafka.apache.org/quickstart) provides the instructions for starting the
-separate Zookeeper and Kafka server processes. You'll need to run these commands from your Kafka installation
-directory. As of August 2022, those commands are (these seem very unlikely to change and thus are included here for
+separate Zookeeper and Kafka server processes. You'll need to run these commands from your Kafka installation
+directory. As of August 2022, those commands are (these seem very unlikely to change and thus are included here for
 convenience):
 
     bin/zookeeper-server-start.sh config/zookeeper.properties
 
-and
+and
 
     bin/kafka-server-start.sh config/server.properties
 
-Next, start the Kafka connector in standalone mode (also from the Kafka home directory). To run the sink connector,
+Next, start the Kafka connector in standalone mode (also from the Kafka home directory). To run the sink connector,
 use the following command:
 
     bin/connect-standalone.sh config/marklogic-connect-standalone.properties config/marklogic-sink.properties
````
````diff
@@ -278,7 +277,7 @@ You'll see a fair amount of logging from Kafka itself; near the end of the loggi
 `RowManagerSourceTask` to ensure that the connector has started up correctly.
 
 ## Sink Connector Testing
-To test out the sink connector, you can use the following command to enter a CLI that allows you to manually send
+To test out the sink connector, you can use the following command to enter a CLI that allows you to manually send
 messages to the `marklogic` topic that the connector is configured by default to read from:
 
     bin/kafka-console-producer.sh --broker-list localhost:9092 --topic marklogic
````
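The console producer reads one message per line from stdin, so sample messages can also be generated and piped in rather than typed by hand. A sketch in Python; the field names are made up for illustration, and no assumption is made about the document format the connector expects:

```python
import json

# Hypothetical generator of newline-delimited JSON messages suitable for
# piping into kafka-console-producer.sh; field names are illustrative.
def sample_messages(count: int) -> list[str]:
    return [
        json.dumps({"id": i, "body": f"test message {i}"})
        for i in range(count)
    ]

for line in sample_messages(3):
    print(line)  # first line: {"id": 0, "body": "test message 0"}
```

Hypothetical usage, piping into the command shown above:
`python generate_messages.py | bin/kafka-console-producer.sh --broker-list localhost:9092 --topic marklogic`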

test-app/docker-compose.yml renamed to docker-compose.yml

Lines changed: 4 additions & 4 deletions
````diff
@@ -2,10 +2,10 @@
 name: marklogic-kafka-confluent
 services:
 
-  # This compose file is based on:
-  # This guide - https://docs.confluent.io/platform/current/platform-quickstart.html#step-6-uninstall-and-clean-up
-  # This compose file - https://raw.githubusercontent.com/confluentinc/cp-all-in-one/7.6.1-post/cp-all-in-one-kraft/docker-compose.yml
-  # Extended to include a MarkLogic container
+  # This compose file is based on:
+  # This guide - https://docs.confluent.io/platform/current/platform-quickstart.html#step-6-uninstall-and-clean-up
+  # This compose file - https://raw.githubusercontent.com/confluentinc/cp-all-in-one/7.6.1-post/cp-all-in-one-kraft/docker-compose.yml
+  # Extended to include a MarkLogic container
 
   broker:
     image: confluentinc/cp-kafka:7.6.1
````
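The comment block in this hunk notes that Confluent's stock compose file was extended with a MarkLogic container. A hypothetical sketch of what such a service entry could look like; the image tag, ports, and environment variables are assumptions, not copied from this commit:

```yaml
  marklogic:
    # Illustrative MarkLogic service entry, not from this repository.
    image: progressofficial/marklogic-db:latest
    ports:
      - "8000-8002:8000-8002"   # matches the http://localhost:8000 URL in the docs
    environment:
      - MARKLOGIC_INIT=true
      - MARKLOGIC_ADMIN_USERNAME=admin
      - MARKLOGIC_ADMIN_PASSWORD=admin
```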
