
Commit 9499319

Merge pull request #391 from marklogic/release/2.5.1
Release/2.5.1
2 parents 2538963 + d038abd commit 9499319

File tree

22 files changed (+87, -186 lines changed)


NOTICE.txt

Lines changed: 3 additions & 3 deletions

@@ -14,7 +14,7 @@ jackson-dataformat-xml 2.15.2 (Apache-2.0)
 jdom2 2.0.6.1 (Apache-2.0)
 jena-arq 4.10.0 (Apache-2.0)
 langchain4j 0.35.0 (Apache-2.0)
-marklogic-client-api 7.0.0 (Apache-2.0)
+marklogic-client-api 7.1.0 (Apache-2.0)
 okhttp 4.12.0 (Apache-2.0)

 Common Licenses
@@ -23,7 +23,7 @@ Apache License 2.0 (Apache-2.0)

 Third-Party Components

-The following is a list of the third-party components used by the MarkLogic® Spark connector 2.5.0 (last updated December 17, 2024):
+The following is a list of the third-party components used by the MarkLogic® Spark connector 2.5.1 (last updated January 6, 2025):

 jackson-dataformat-xml 2.15.2 (Apache-2.0)
 https://repo1.maven.org/maven2/com/fasterxml/jackson/dataformat/jackson-dataformat-xml/
@@ -41,7 +41,7 @@ langchain4j 0.35.0 (Apache-2.0)
 https://repo1.maven.org/maven2/dev/langchain4j/langchain4j/
 For the full text of the Apache-2.0 license, see Apache License 2.0 (Apache-2.0)

-marklogic-client-api 7.0.0 (Apache-2.0)
+marklogic-client-api 7.1.0 (Apache-2.0)
 https://repo1.maven.org/maven2/com/marklogic/marklogic-client-api/
 For the full text of the Apache-2.0 license, see Apache License 2.0 (Apache-2.0)

build.gradle

Lines changed: 1 addition & 1 deletion

@@ -16,7 +16,7 @@ subprojects {
     apply plugin: "jacoco"

     group = "com.marklogic"
-    version "2.5.0"
+    version "2.5-SNAPSHOT"

     java {
         sourceCompatibility = 11

docs/getting-started/jupyter.md

Lines changed: 2 additions & 2 deletions

@@ -32,15 +32,15 @@ connector and also to initialize Spark:

 ```
 import os
-os.environ['PYSPARK_SUBMIT_ARGS'] = '--jars "/path/to/marklogic-spark-connector-2.5.0.jar" pyspark-shell'
+os.environ['PYSPARK_SUBMIT_ARGS'] = '--jars "/path/to/marklogic-spark-connector-2.5.1.jar" pyspark-shell'

 from pyspark.sql import SparkSession
 spark = SparkSession.builder.master("local[*]").appName('My Notebook').getOrCreate()
 spark.sparkContext.setLogLevel("WARN")
 spark
 ```

-The path of `/path/to/marklogic-spark-connector-2.5.0.jar` should be changed to match the location of the connector
+The path of `/path/to/marklogic-spark-connector-2.5.1.jar` should be changed to match the location of the connector
 jar on your filesystem. You are free to customize the `spark` variable in any manner you see fit as well.

 Now that you have an initialized Spark session, you can run any of the examples found in the

docs/getting-started/pyspark.md

Lines changed: 1 addition & 1 deletion

@@ -30,7 +30,7 @@ shell by pressing `ctrl-D`.

 Run PySpark from the directory that you downloaded the connector to per the [setup instructions](setup.md):

-    pyspark --jars marklogic-spark-connector-2.5.0.jar
+    pyspark --jars marklogic-spark-connector-2.5.1.jar

 The `--jars` command line option is PySpark's method for utilizing Spark connectors. Each Spark environment should have
 a similar mechanism for including third party connectors; please see the documentation for your particular Spark

docs/getting-started/setup.md

Lines changed: 2 additions & 2 deletions

@@ -31,10 +31,10 @@ have an instance of MarkLogic running, you can skip step 4 below, but ensure tha
 extracted directory contains valid connection properties for your instance of MarkLogic.

 1. From [this repository's Releases page](https://github.com/marklogic/marklogic-spark-connector/releases), select
-   the latest release and download the `marklogic-spark-getting-started-2.5.0.zip` file.
+   the latest release and download the `marklogic-spark-getting-started-2.5.1.zip` file.
 2. Extract the contents of the downloaded zip file.
 3. Open a terminal window and go to the directory created by extracting the zip file; the directory should have a
-   name of "marklogic-spark-getting-started-2.5.0".
+   name of "marklogic-spark-getting-started-2.5.1".
 4. Run `docker-compose up -d` to start an instance of MarkLogic
 5. Ensure that the `./gradlew` file is executable; depending on your operating system, you may need to run
    `chmod 755 gradlew` to make the file executable.

examples/entity-aggregation/build.gradle

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ repositories {

 dependencies {
     implementation 'org.apache.spark:spark-sql_2.12:3.5.3'
-    implementation "com.marklogic:marklogic-spark-connector:2.5.0"
+    implementation "com.marklogic:marklogic-spark-connector:2.5.1"
     implementation "org.postgresql:postgresql:42.7.4"
 }

examples/getting-started/marklogic-spark-getting-started.ipynb

Lines changed: 1 addition & 1 deletion

@@ -9,7 +9,7 @@
 "source": [
 "# Make the MarkLogic connector available to the underlying PySpark application.\n",
 "import os\n",
-"os.environ['PYSPARK_SUBMIT_ARGS'] = '--jars \"marklogic-spark-connector-2.5.0.jar\" pyspark-shell'\n",
+"os.environ['PYSPARK_SUBMIT_ARGS'] = '--jars \"marklogic-spark-connector-2.5.1.jar\" pyspark-shell'\n",
 "\n",
 "# Define the connection details for the getting-started example application.\n",
 "client_uri = \"spark-example-user:password@localhost:8003\"\n",

examples/java-dependency/build.gradle

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ repositories {

 dependencies {
     implementation 'org.apache.spark:spark-sql_2.12:3.5.3'
-    implementation 'com.marklogic:marklogic-spark-connector:2.5.0'
+    implementation 'com.marklogic:marklogic-spark-connector:2.5.1'
 }

 task runApp(type: JavaExec) {

marklogic-langchain4j/build.gradle

Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
 dependencies {
     compileOnly "com.fasterxml.jackson.core:jackson-databind:2.17.2"
-    api "com.marklogic:marklogic-client-api:7.0.0"
+    api "com.marklogic:marklogic-client-api:7.1.0"

     // Supports splitting documents.
     api "dev.langchain4j:langchain4j:0.35.0"

marklogic-spark-api/src/main/java/com/marklogic/spark/Options.java

Lines changed: 6 additions & 0 deletions

@@ -15,6 +15,12 @@ public abstract class Options {
     public static final String CLIENT_URI = "spark.marklogic.client.uri";
     public static final String CLIENT_USERNAME = "spark.marklogic.client.username";

+    /**
+     * Alias for "spark.marklogic.client.uri", which will be deprecated soon in favor of this better name.
+     * @since 2.5.1
+     */
+    public static final String CLIENT_CONNECTION_STRING = "spark.marklogic.client.connectionString";
+
     public static final String READ_INVOKE = "spark.marklogic.read.invoke";
     public static final String READ_JAVASCRIPT = "spark.marklogic.read.javascript";
     public static final String READ_JAVASCRIPT_FILE = "spark.marklogic.read.javascriptFile";
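The new constant above introduces `spark.marklogic.client.connectionString` as an alias for `spark.marklogic.client.uri`; both carry a `user:password@host:port` value, as shown in the getting-started notebook's `client_uri` example. Below is a minimal sketch of how the value breaks down. The `parse_connection_string` helper is hypothetical, written for illustration only (it is not part of the connector), and the commented Spark reader usage assumes the connector's `marklogic` data source format.

```python
# Hypothetical helper (not part of the connector) showing the structure of the
# "user:password@host:port" value accepted by both option names.
def parse_connection_string(value: str) -> dict:
    """Split a MarkLogic-style connection string into its components."""
    credentials, _, address = value.rpartition("@")  # split at the last "@"
    user, _, password = credentials.partition(":")
    host, _, port = address.partition(":")
    return {"user": user, "password": password, "host": host, "port": int(port)}

# Sketch of reader usage with the new alias (requires Spark, the connector jar,
# and a running MarkLogic instance, so it is shown here as a comment):
# df = (spark.read.format("marklogic")
#       .option("spark.marklogic.client.connectionString",
#               "spark-example-user:password@localhost:8003")
#       .load())

parts = parse_connection_string("spark-example-user:password@localhost:8003")
```

Since the alias maps to the same underlying connection details, existing jobs using `spark.marklogic.client.uri` should keep working unchanged; the new name simply reads more clearly.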
