This guide covers how to develop and test this project. It assumes that you have cloned this repository to your local
workstation.

You must use Java 11 or higher for developing, testing, and building this project. If you wish to use Sonar as
described below, you must use Java 17 or higher.

# Setup

To begin, you need to deploy the test application in this project to MarkLogic. You can do so either on your own
installation of MarkLogic, or you can use `docker compose` to install MarkLogic, optionally as a 3-node cluster with
a load balancer in front of it.

## Installing MarkLogic with docker compose

If you wish to use `docker compose`, perform the following steps before deploying the test application.

1. [Install Docker](https://docs.docker.com/get-docker/).
2. Ensure that you don't have a MarkLogic instance running locally (if you do, you may run into port conflicts in
   the next step).
3. Run `docker compose up -d --build`.

The above will result in a new MarkLogic instance with a single node.
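
To sanity check the new instance, you can confirm that the container is running (and, assuming the compose file maps
the standard MarkLogic Admin port, visit http://localhost:8001 once the server is ready):

    # Lists the services defined by the compose file along with their current state.
    docker compose ps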

Alternatively, if you would like to test against a 3-node MarkLogic cluster with a load balancer in front of it,
run `docker compose -f docker-compose-3nodes.yaml up -d --build`.

## Deploying the test application

## Generating code quality reports with SonarQube

In order to use SonarQube, you must have used Docker to run this project's `docker-compose.yml` file, the services
in that file must be running, and you must use Java 17 to run the Gradle `sonar` task.

To configure the SonarQube service, perform the following steps:

10. Add `systemProp.sonar.token=your token pasted here` to `gradle-local.properties` in the root of your project, creating
    that file if it does not exist yet (see the example below).

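For reference, a minimal `gradle-local.properties` containing only the Sonar token would look like the following
(the token value shown is just a placeholder):

    systemProp.sonar.token=squ_paste_your_generated_token_here
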
To run SonarQube, run the following Gradle tasks using Java 17, which will run all the tests with code coverage and
then generate a quality report with SonarQube:

    ./gradlew test sonar

Note that if you only need results on code smells and vulnerabilities, you can repeatedly run `./gradlew sonar`
without having to re-run the tests.

You can also force Gradle to run `sonar` even if some tests fail:

    ./gradlew clean test sonar --continue

## Accessing MarkLogic logs in Grafana

This project's `docker-compose-3nodes.yaml` file includes
[Grafana, Loki, and promtail services](https://grafana.com/docs/loki/latest/clients/promtail/), primarily to collect
MarkLogic log files so that they can be viewed and searched via Grafana.

Once you have run `docker compose`, you can access Grafana at http://localhost:3000. Follow these instructions to
access MarkLogic logging data:

1. Click on the hamburger in the upper left hand corner and select "Explore", or simply go to

This will produce a single jar file for the connector in the `./build/libs` directory.

You can then launch PySpark with the connector available via:

    pyspark --jars build/libs/marklogic-spark-connector-2.4-SNAPSHOT.jar

The below command is an example of loading data from the test application deployed via the instructions at the top of
this page.
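
A minimal sketch of such a read, typed into the PySpark shell (this assumes the connector's `marklogic` data source
name and its `spark.marklogic.*` read options; the host, port, credentials, and Optic query below are placeholders to
adjust for your deployment of the test application):

    # `spark` is the SparkSession that the pyspark shell creates for you.
    df = spark.read.format("marklogic") \
        .option("spark.marklogic.client.uri", "spark-test-user:password@localhost:8000") \
        .option("spark.marklogic.read.opticQuery", "op.fromView('example', 'example')") \
        .load()
    # Show a few rows to confirm the connector can read from MarkLogic.
    df.show()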

The Spark master GUI is at <http://localhost:8080>.

Now that you have a Spark cluster running, you just need to tell PySpark to connect to it:

    pyspark --master spark://NYWHYC3G0W:7077 --jars build/libs/marklogic-spark-connector-2.4-SNAPSHOT.jar

You can then run the same commands as shown in the PySpark section above. The Spark master GUI will allow you to
examine details of each of the commands that you run.

You will need the connector jar available, so run `./gradlew clean shadowJar` if you have not already done so.

You can then run a test Python program in this repository via the following (again, change the master address as
needed); note that you run this outside of PySpark, and `spark-submit` is available after having installed PySpark:

    spark-submit --master spark://NYWHYC3G0W:7077 --jars build/libs/marklogic-spark-connector-2.4-SNAPSHOT.jar src/test/python/test_program.py

You can also test a Java program. To do so, first move the `com.marklogic.spark.TestProgram` class from `src/test/java`
to `src/main/java`. Then run `./gradlew clean shadowJar` to rebuild the connector jar. Then run the following:

    spark-submit --master spark://NYWHYC3G0W:7077 --class com.marklogic.spark.TestProgram build/libs/marklogic-spark-connector-2.4-SNAPSHOT.jar

Be sure to move `TestProgram` back to `src/test/java` when you are done.