
Commit 35303fd

Upgrade to Kafka 4.1.0 and Confluent Platform 8.0

Force upgrade to Spring 6. New files for the new CP stack, and updated .gitignore for those files.

1 parent 72f037d

File tree

11 files changed: +185, -116 lines

.copyrightconfig

Lines changed: 1 addition & 1 deletion
@@ -11,4 +11,4 @@ startyear: 2023
 # - Dotfiles already skipped automatically
 # Enable by removing the leading '# ' from the next line and editing values.
 # filesexcluded: third_party/*, docs/generated/*.md, assets/*.png, scripts/temp_*.py, vendor/lib.js
-filesexcluded: .github/*, README.md, CONTRIBUTING.md, Jenkinsfile, gradle/*, docker-compose.yml, *.gradle, gradle.properties, gradlew, gradlew.bat, **/test/resources/**, docs/**, test-app/docker-compose.yml
+filesexcluded: .github/*, README.md, CONTRIBUTING.md, Jenkinsfile, gradle/*, docker-compose.yml, *.gradle, gradle.properties, gradlew, gradlew.bat, **/test/resources/**, docs/**, test-app/docker-compose.yml, docker/prometheus/*.yml

.gitignore

Lines changed: 2 additions & 1 deletion
@@ -4,7 +4,8 @@
 build
 out
 gradle-local.properties
-docker
+docker/confluent-marklogic-components/marklogic-kafka-marklogic-connector*
+docker/marklogic

 bin
 .vscode

CONTRIBUTING.md

Lines changed: 37 additions & 57 deletions
@@ -1,6 +1,6 @@
 This guide describes how to develop and contribute pull requests to this connector. The focus is currently on how to
 develop and test the connector. There are two methods available - automated and manual. Both methods are performed via a
-Docker stack. The automated tests stack creates MarkLogic, Sonar, and Postgres instance for the automated tests. The
+Docker stack. The automated tests stack creates a MarkLogic instance for the automated tests. The
 manual tests use Confluent Platform in a different Docker stack to allow testing the connector via Confluent Control
 Center with a MarkLogic instance in the same stack.

@@ -22,16 +22,14 @@ Note that you do not need to install [Gradle](https://gradle.org/) - the "gradle
 appropriate version of Gradle if you do not have it installed already.

 ## Docker Cluster Preparation for Automated Testing
-The automated tests require a MarkLogic server, SonarQube server, and Postgres server. The docker-compose file in the
-repository root includes these services. To prepare for running the automated tests, perform the following steps:
+The automated tests require a MarkLogic server. The docker-compose file in the repository root includes these services.
+To prepare for running the automated tests, perform the following steps:
 ```
 docker-compose up -d --build
 ```

-You can now visit these web applications:
+You can now visit this web application:
 * http://localhost:8000 to access the MarkLogic server.
-* http://localhost:9000 to use the SonarQube server as described in the "Running Sonar Code Analysis"
-section below.

 ## MarkLogic Preparation
 To prepare the MarkLogic server for automated testing as well as testing with the Confluent Platform, the Data Hub based
@@ -54,44 +52,11 @@ directory. Note that you must be using Java 17 for this command due to the lates
 Alternatively, you can import this project into an IDE such as IntelliJ and run each of the tests found under
 `src/test/java`.

-## Running Sonar Code Analysis
+## Generating code quality reports with SonarQube

-To configure the SonarQube service, perform the following steps:
+Please see our [internal Wiki page](https://progresssoftware.atlassian.net/wiki/spaces/PM/pages/1763541097/Developer+Experience+SonarQube)
+for information on setting up SonarQube if you have not yet already.

-1. Go to http://localhost:9000 .
-2. Login as admin/admin. SonarQube will ask you to change this password; you can choose whatever you want ("password" works).
-3. Click on "Create a local project".
-4. Enter "marklogic-kafka-connector" for the Project Display Name; use that as the Project Key as well.
-5. Enter "master" as the main branch name.
-6. Click on "Next".
-7. Click on "Use the global setting" and then "Create project".
-8. On the "Analysis Method" page, click on "Locally".
-9. In the "Provide a token" panel, click on "Generate". Copy the token.
-10. Click the "Continue" button.
-11. Update `systemProp.sonar.token=<Replace With Your Sonar Token>` in `gradle-local.properties` in the root directory
-of your project.
-
-To run the SonarQube analysis, run the following Gradle task in the root directory, which will run all the tests with
-code coverage and then generate a quality report with SonarQube:
-
-    ./gradlew test sonar
-
-If you do not update `systemProp.sonar.token` in your `gradle.properties` file, you can specify the token via the
-following:
-
-    ./gradlew test sonar -Dsonar.token=paste your token here
-
-When that completes, you can find the results at http://localhost:9000/dashboard?id=marklogic-kafka-connector
-
-Click on that link. If it's the first time you've run the report, you'll see all issues. If you've run the report
-before, then SonarQube will show "New Code" by default. That's handy, as you can use that to quickly see any issues
-you've introduced on the feature branch you're working on. You can then click on "Overall Code" to see all issues.
-
-Note that if you only need results on code smells and vulnerabilities, you can repeatedly run "./gradlew sonar"
-without having to re-run the tests.
-
-For more assistance with Sonar and Gradle, see the
-[Sonar Gradle plugin docs](https://docs.sonarqube.org/latest/analyzing-source-code/scanners/sonarscanner-for-gradle/).

 # Configuring Local Manual Testing
 This project includes a Docker Compose file that creates a Kafka cluster using Confluent Platform along with a
@@ -101,27 +66,30 @@ application. The instructions below describe how to get started.
 ## Docker Cluster Preparation for Manual Testing
 The docker-compose file in the test-app directory includes these services along with a MarkLogic server.
 ```
-docker-compose --env-file ./.env -f test-app/docker-compose.yml up -d --build
+docker-compose --env-file test-app/.env -f test-app/docker-compose.yml up -d --build
 ```

 When the setup is complete, you should be able to run
 ```
-docker-compose --env-file ./.env -f test-app/docker-compose.yml ps
+docker-compose --env-file test-app/.env -f test-app/docker-compose.yml ps
 ```
 and see results similar to the following.
 ```
-NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS
-broker confluentinc/cp-kafka:7.6.1 "/etc/confluent/dock…" broker 14 minutes ago Up 14 minutes 0.0.0.0:9092->9092/tcp, 0.0.0.0:9101->9101/tcp
-connect cnfldemos/cp-server-connect-datagen:0.6.4-7.6.0 "/etc/confluent/dock…" connect 14 minutes ago Up 14 minutes 0.0.0.0:8083->8083/tcp, 9092/tcp
-control-center confluentinc/cp-enterprise-control-center:7.6.1 "/etc/confluent/dock…" control-center 14 minutes ago Up 14 minutes 0.0.0.0:9021->9021/tcp
-ksql-datagen confluentinc/ksqldb-examples:7.6.1 "bash -c 'echo Waiti…" ksql-datagen 14 minutes ago Up 14 minutes
-ksqldb-cli confluentinc/cp-ksqldb-cli:7.6.1 "/bin/sh" ksqldb-cli 14 minutes ago Up 14 minutes
-ksqldb-server confluentinc/cp-ksqldb-server:7.6.1 "/etc/confluent/dock…" ksqldb-server 14 minutes ago Up 14 minutes 0.0.0.0:8088->8088/tcp
-marklogic marklogicdb/marklogic-db:11.2.0-centos-1.1.2 "/tini -- /usr/local…" marklogic 14 minutes ago Up 14 minutes 25/tcp, 7997-7999/tcp, 0.0.0.0:8000-8002->8000-8002/tcp, 0.0.0.0:8010-8013->8010-8013/tcp, 8003-8009/tcp, 0.0.0.0:8018-8019->8018-8019/tcp
-marklogic-kafka-confluent-postgres-1 postgres:15-alpine "docker-entrypoint.s…" postgres 14 minutes ago Up 14 minutes 5432/tcp
-marklogic-kafka-confluent-sonarqube-1 sonarqube:10.3.0-community "/opt/sonarqube/dock…" sonarqube 14 minutes ago Up 14 minutes 0.0.0.0:9000->9000/tcp
-rest-proxy confluentinc/cp-kafka-rest:7.6.1 "/etc/confluent/dock…" rest-proxy 14 minutes ago Up 14 minutes 0.0.0.0:8082->8082/tcp
-schema-registry confluentinc/cp-schema-registry:7.6.1 "/etc/confluent/dock…" schema-registry 14 minutes ago Up 14 minutes 0.0.0.0:8081->8081/tcp
+NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS
+alertmanager confluentinc/cp-enterprise-alertmanager:2.2.0 "alertmanager-start" alertmanager 51 seconds ago Up 50 seconds 0.0.0.0:9093->9093/tcp, [::]:9093->9093/tcp
+broker confluentinc/cp-server:8.0.0 "/etc/confluent/dock…" broker 51 seconds ago Up 50 seconds 0.0.0.0:9092->9092/tcp, [::]:9092->9092/tcp, 0.0.0.0:9101->9101/tcp, [::]:9101->9101/tcp
+connect cnfldemos/cp-server-connect-datagen:0.6.7-8.0.0 "/etc/confluent/dock…" connect 51 seconds ago Up 49 seconds 0.0.0.0:8083->8083/tcp, [::]:8083->8083/tcp
+control-center confluentinc/cp-enterprise-control-center-next-gen:2.2.0 "/etc/confluent/dock…" control-center 51 seconds ago Up 49 seconds 0.0.0.0:9021->9021/tcp, [::]:9021->9021/tcp
+flink-jobmanager cnfldemos/flink-kafka:1.19.1-scala_2.12-java17 "/docker-entrypoint.…" flink-jobmanager 51 seconds ago Up 50 seconds 0.0.0.0:9081->9081/tcp, [::]:9081->9081/tcp
+flink-sql-client cnfldemos/flink-sql-client-kafka:1.19.1-scala_2.12-java17 "/docker-entrypoint.…" flink-sql-client 51 seconds ago Up 50 seconds 6123/tcp, 8081/tcp
+flink-taskmanager cnfldemos/flink-kafka:1.19.1-scala_2.12-java17 "/docker-entrypoint.…" flink-taskmanager 51 seconds ago Up 50 seconds 6123/tcp, 8081/tcp
+ksql-datagen confluentinc/ksqldb-examples:8.0.0 "bash -c 'echo Waiti…" ksql-datagen 51 seconds ago Up 49 seconds
+ksqldb-cli confluentinc/cp-ksqldb-cli:8.0.0 "/bin/sh" ksqldb-cli 51 seconds ago Up 49 seconds
+ksqldb-server confluentinc/cp-ksqldb-server:8.0.0 "/etc/confluent/dock…" ksqldb-server 51 seconds ago Up 49 seconds 0.0.0.0:8088->8088/tcp, [::]:8088->8088/tcp
+manual-tests-marklogic-kafka-confluent-marklogic-1 ml-docker-db-dev-tierpoint.bed-artifactory.bedford.progress.com/marklogic/marklogic-server-ubi:latest-12 "/tini -- /usr/local…" marklogic 51 seconds ago Up 50 seconds 0.0.0.0:8000-8002->8000-8002/tcp, [::]:8000-8002->8000-8002/tcp, 0.0.0.0:8010-8013->8010-8013/tcp, [::]:8010-8013->8010-8013/tcp, 0.0.0.0:8018-8019->8018-8019/tcp, [::]:8018-8019->8018-8019/tcp
+prometheus confluentinc/cp-enterprise-prometheus:2.2.0 "prometheus-start" prometheus 51 seconds ago Up 50 seconds 0.0.0.0:9090->9090/tcp, [::]:9090->9090/tcp
+rest-proxy confluentinc/cp-kafka-rest:8.0.0 "/etc/confluent/dock…" rest-proxy 51 seconds ago Up 49 seconds 0.0.0.0:8082->8082/tcp, [::]:8082->8082/tcp
+schema-registry confluentinc/cp-schema-registry:8.0.0 "/etc/confluent/dock…" schema-registry 51 seconds ago Up 50 seconds 0.0.0.0:8081->8081/tcp, [::]:8081->8081/tcp
 ```

 You can now visit several web applications:
@@ -139,6 +107,18 @@ The Confluent Platform servers in this docker-compose file are based on the Conf
 [Install a Confluent Platform cluster in Docker using a Confluent docker-compose file](https://docs.confluent.io/platform/current/platform-quickstart.html).


+### MarkLogic Preparation
+To prepare the MarkLogic server for automated testing as well as testing with the Confluent Platform, the Data Hub based
+application must be deployed. From the root directory, follow these steps:
+1. Run `./gradlew hubInit`
+2. Edit gradle-local.properties and set `mlUsername` and `mlPassword`
+3. Run `./gradlew -i mlDeploy`
+
+Note: If you change the version of Data Hub Framework used by this project, you should also delete the following directories:
+* 'test-app/src/main/entity-config'
+* 'test-app/src/main/hub-internal-config'
+
+
 ### Building and Sharing the Connector with the Docker Container
 Using gradle in the root directory, build the connector archive and copy it to a directory shared with the Confluent
 Platform Docker cluster built in that section, using this gradle command in the root directory:
@@ -187,7 +167,7 @@ In the Control Center GUI, you can verify the MarkLogic Kafka connector instance
 3. Click on the "marklogic-purchases-sink" connector

 You can then verify that data is being written to MarkLogic by using MarkLogic's qconsole application to inspect the
-contents of the `data-hub-FINAL` database.
+contents of the `data-hub-FINAL` database. There should be documents with URIs that start with `/purchase/*`.

 ### Load a MarkLogic Kafka source connector instance
 You can also load an instance of the MarkLogic Kafka source connector that will read rows from the `demo/purchases`
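The `docker-compose ... ps` check described in the CONTRIBUTING.md changes above can be scripted. A minimal sketch of pulling a service name out of one line of that output; a captured sample line stands in for live output, since no Docker daemon is assumed here:

```shell
# Hypothetical helper: extract the service name (first column) from a line of
# `docker-compose ps` output like the table shown in the diff above.
sample_line='broker   confluentinc/cp-server:8.0.0   "/etc/confluent/dock…"   broker   51 seconds ago   Up 50 seconds'
name=$(printf '%s\n' "$sample_line" | awk '{print $1}')
echo "$name"
```

Against a live stack you would pipe the real `ps` output through the same `awk` expression to list every running service.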

build.gradle

Lines changed: 21 additions & 6 deletions
@@ -27,10 +27,15 @@ plugins {
 }

 java {
-    sourceCompatibility = 1.8
-    targetCompatibility = 1.8
+    toolchain {
+        languageVersion = JavaLanguageVersion.of(17)
+    }
+    sourceCompatibility = JavaVersion.VERSION_17
+    targetCompatibility = JavaVersion.VERSION_17
 }

+
+
 repositories {
     mavenCentral()
 }
@@ -55,7 +60,7 @@ configurations {
 }

 ext {
-    kafkaVersion = "3.9.1"
+    kafkaVersion = "4.1.0"
 }

 dependencies {
@@ -66,7 +71,7 @@ dependencies {
     // Force DHF to use the latest version of ml-app-deployer, which minimizes security vulnerabilities
     implementation "com.marklogic:ml-app-deployer:5.0.0"

-    implementation "com.fasterxml.jackson.dataformat:jackson-dataformat-csv:2.17.2"
+    implementation "com.fasterxml.jackson.dataformat:jackson-dataformat-csv:2.19.0"

     // Note that in general, the version of the DHF jar must match that of the deployed DHF instance. Different versions
     // may work together, but that behavior is not guaranteed.
@@ -81,7 +86,17 @@ dependencies {
         exclude module: "logback-classic"
     }

-    testImplementation 'com.marklogic:marklogic-junit5:1.5.0'
+    testImplementation('com.marklogic:marklogic-junit5:1.5.0') {
+        // Use the Java Client declared above.
+        exclude module: "marklogic-client-api"
+
+        // Use the Spring dependencies from ml-app-deployer 6 to avoid vulnerabilities in Spring 5.
+        exclude group: "org.springframework"
+    }
+
+    // Add back all required Spring 6 modules for tests, since junit5 and test code need more than just spring-test
+    testImplementation "org.springframework:spring-test:6.2.11"
+    testImplementation "org.springframework:spring-context:6.2.11"

     testImplementation "org.apache.kafka:connect-json:${kafkaVersion}"
     testImplementation kafkaConnectRuntime
@@ -223,5 +238,5 @@ task connectorArchive(type: Zip, dependsOn: connectorArchive_BuildDirectory, gro
 task copyConnectorToDockerVolume(type: Copy, dependsOn: connectorArchive, group: confluentTestingGroup) {
     description = "Copies the connector's archive directory to the Docker volume shared with the Connect server"
     from "build/connectorArchive"
-    into "test-app/docker/confluent-marklogic-components"
+    into "./docker/confluent-marklogic-components"
 }
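The new `into "./docker/confluent-marklogic-components"` target pairs with the `.gitignore` entry this commit adds (`docker/confluent-marklogic-components/marklogic-kafka-marklogic-connector*`). A quick shell sketch of checking that the glob would actually match a built archive; the archive filename here is an assumed example, not taken from the build output:

```shell
# Hypothetical check: does the new .gitignore glob from this commit match a
# connector archive name? `case` treats the unquoted variable as a pattern.
pattern='marklogic-kafka-marklogic-connector*'
file='marklogic-kafka-marklogic-connector-1.0.0.zip'   # assumed example name
case "$file" in
  $pattern) matched=yes ;;
  *) matched=no ;;
esac
echo "$matched"
```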

docker-compose.yml

Lines changed: 0 additions & 36 deletions
@@ -17,39 +17,3 @@ services:
       - "8018-8019:8018-8019"
     cap_drop:
       - NET_RAW
-
-  # Copied from https://docs.sonarsource.com/sonarqube/latest/setup-and-upgrade/install-the-server/#example-docker-compose-configuration .
-  sonarqube:
-    image: sonarqube:10.3.0-community
-    depends_on:
-      - postgres
-    environment:
-      SONAR_JDBC_URL: jdbc:postgresql://postgres:5432/sonar
-      SONAR_JDBC_USERNAME: sonar
-      SONAR_JDBC_PASSWORD: sonar
-    volumes:
-      - sonarqube_data:/opt/sonarqube/data
-      - sonarqube_extensions:/opt/sonarqube/extensions
-      - sonarqube_logs:/opt/sonarqube/logs
-    ports:
-      - "9000:9000"
-    cap_drop:
-      - NET_RAW
-
-  postgres:
-    image: postgres:15-alpine
-    environment:
-      POSTGRES_USER: sonar
-      POSTGRES_PASSWORD: sonar
-    volumes:
-      - postgresql:/var/lib/postgresql
-      - postgresql_data:/var/lib/postgresql/data
-    cap_drop:
-      - NET_RAW
-
-volumes:
-  sonarqube_data:
-  sonarqube_extensions:
-  sonarqube_logs:
-  postgresql:
-  postgresql_data:
Lines changed: 8 additions & 0 deletions
@@ -0,0 +1,8 @@
+global:
+  resolve_timeout: 1m
+  smtp_require_tls: false
+receivers:
+  - name: default
+route:
+  receiver: default
+  routes: []

docker/prometheus/config/prometheus-generated.yml

Whitespace-only changes.

docker/prometheus/config/web-config-am.yml

Whitespace-only changes.

docker/prometheus/config/web-config-prom.yml

Whitespace-only changes.

test-app/.env

Lines changed: 2 additions & 0 deletions
@@ -0,0 +1,2 @@
+MARKLOGIC_IMAGE=ml-docker-db-dev-tierpoint.bed-artifactory.bedford.progress.com/marklogic/marklogic-server-ubi:latest-12
+MARKLOGIC_LOGS_VOLUME=../docker/marklogic/logs
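`docker-compose --env-file test-app/.env ...` substitutes these `KEY=value` pairs into `test-app/docker-compose.yml`. A minimal sketch of reading such a file the same way, using a throwaway copy of the two entries above rather than touching the repo:

```shell
# Sketch: parse a docker-compose-style .env file (one KEY=value per line).
# A temporary copy of the two entries from this commit is used for illustration.
envfile=$(mktemp)
cat > "$envfile" <<'EOF'
MARKLOGIC_IMAGE=ml-docker-db-dev-tierpoint.bed-artifactory.bedford.progress.com/marklogic/marklogic-server-ubi:latest-12
MARKLOGIC_LOGS_VOLUME=../docker/marklogic/logs
EOF
# -f2- keeps everything after the first '=' so values may contain ':' or '='.
image=$(grep '^MARKLOGIC_IMAGE=' "$envfile" | cut -d= -f2-)
logs=$(grep '^MARKLOGIC_LOGS_VOLUME=' "$envfile" | cut -d= -f2-)
echo "$image"
echo "$logs"
rm -f "$envfile"
```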
