
Commit 8460e74
Releasing 2.8.0
1 parent 547b4e7

5 files changed: +17 −17 lines


CHANGELOG.md

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@ All notable changes to this project will be documented in this file.
 
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 
-## [UNRELEASED] - YYYY-MM-DD
+## [2.8.0] - 2023-05-24
 
 ### Added
 

README.md

Lines changed: 9 additions & 9 deletions
@@ -119,7 +119,7 @@ The package version has the following semantics: `spark-extension_{SCALA_COMPAT_
 Add this line to your `build.sbt` file:
 
 ```sbt
-libraryDependencies += "uk.co.gresearch.spark" %% "spark-extension" % "2.7.0-3.4"
+libraryDependencies += "uk.co.gresearch.spark" %% "spark-extension" % "2.8.0-3.4"
 ```
 
 ### Maven
@@ -130,7 +130,7 @@ Add this dependency to your `pom.xml` file:
 <dependency>
   <groupId>uk.co.gresearch.spark</groupId>
   <artifactId>spark-extension_2.12</artifactId>
-  <version>2.7.0-3.4</version>
+  <version>2.8.0-3.4</version>
 </dependency>
 ```
 
@@ -139,7 +139,7 @@ Add this dependency to your `pom.xml` file:
 Submit your Spark app with the Spark Extension dependency (version ≥1.1.0) as follows:
 
 ```shell script
-spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.3 [jar]
+spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.8.0-3.3 [jar]
 ```
 
 Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depending on your Spark version.
@@ -149,7 +149,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depe
 Launch a Spark Shell with the Spark Extension dependency (version ≥1.1.0) as follows:
 
 ```shell script
-spark-shell --packages uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.4
+spark-shell --packages uk.co.gresearch.spark:spark-extension_2.12:2.8.0-3.4
 ```
 
 Note: Pick the right Scala version (here 2.12) and Spark version (here 3.4) depending on your Spark Shell version.
@@ -165,7 +165,7 @@ from pyspark.sql import SparkSession
 
 spark = SparkSession \
     .builder \
-    .config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.4") \
+    .config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.8.0-3.4") \
     .getOrCreate()
 ```
 
@@ -176,7 +176,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.4) depe
 Launch the Python Spark REPL with the Spark Extension dependency (version ≥1.1.0) as follows:
 
 ```shell script
-pyspark --packages uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.4
+pyspark --packages uk.co.gresearch.spark:spark-extension_2.12:2.8.0-3.4
 ```
 
 Note: Pick the right Scala version (here 2.12) and Spark version (here 3.4) depending on your PySpark version.
@@ -186,7 +186,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.4) depe
 Run your Python scripts that use PySpark via `spark-submit`:
 
 ```shell script
-spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.4 [script.py]
+spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.8.0-3.4 [script.py]
 ```
 
 Note: Pick the right Scala version (here 2.12) and Spark version (here 3.4) depending on your Spark version.
@@ -200,7 +200,7 @@ Running your Python application on a Spark cluster will still require one of the
 to add the Scala package to the Spark environment.
 
 ```shell script
-pip install pyspark-extension==2.7.0.3.4
+pip install pyspark-extension==2.8.0.3.4
 ```
 
 Note: Pick the right Spark version (here 3.4) depending on your PySpark version.
@@ -210,7 +210,7 @@ Note: Pick the right Spark version (here 3.4) depending on your PySpark version.
 There are plenty of [Data Science notebooks](https://datasciencenotebook.org/) around. To use this library,
 add **a jar dependency** to your notebook using these **Maven coordinates**:
 
-uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.4
+uk.co.gresearch.spark:spark-extension_2.12:2.8.0-3.4
 
 Or [download the jar](https://mvnrepository.com/artifact/uk.co.gresearch.spark/spark-extension) and place it
 on a filesystem where it is accessible by the notebook, and reference that jar file directly.
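The bumps in these hunks all follow the version semantics named in the first hunk header: artifact `spark-extension_{SCALA_COMPAT_VERSION}` with version `{VERSION}-{SPARK_COMPAT_VERSION}`, while the PyPI pin joins the parts with a dot instead. A minimal sketch of that assembly (the helper names are illustrative, not from the repo):

```python
# Illustrative helpers: build the Maven coordinate and pip pin that appear
# in the hunks above from release version, Scala compat, and Spark compat.
def maven_coordinate(version: str, scala_compat: str, spark_compat: str) -> str:
    # Maven/Ivy form: groupId:artifactId_{scala}:version-{spark}
    return f"uk.co.gresearch.spark:spark-extension_{scala_compat}:{version}-{spark_compat}"

def pip_pin(version: str, spark_compat: str) -> str:
    # The PyPI package uses '.' rather than '-' before the Spark compat part
    return f"pyspark-extension=={version}.{spark_compat}"

print(maven_coordinate("2.8.0", "2.12", "3.4"))
# uk.co.gresearch.spark:spark-extension_2.12:2.8.0-3.4
print(pip_pin("2.8.0", "3.4"))
# pyspark-extension==2.8.0.3.4
```

This matches every changed line in the diff: only the `{VERSION}` part moves from 2.7.0 to 2.8.0, while the Scala and Spark compat parts stay as chosen per environment.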

pom.xml

Lines changed: 1 addition & 1 deletion
@@ -2,7 +2,7 @@
   <modelVersion>4.0.0</modelVersion>
   <groupId>uk.co.gresearch.spark</groupId>
   <artifactId>spark-extension_2.13</artifactId>
-  <version>2.8.0-3.4-SNAPSHOT</version>
+  <version>2.8.0-3.4</version>
   <name>Spark Extension</name>
   <description>A library that provides useful extensions to Apache Spark.</description>
   <inceptionYear>2020</inceptionYear>

python/README.md

Lines changed: 5 additions & 5 deletions
@@ -72,7 +72,7 @@ Running your Python application on a Spark cluster will still require one of the
 to add the Scala package to the Spark environment.
 
 ```shell script
-pip install pyspark-extension==2.7.0.3.3
+pip install pyspark-extension==2.8.0.3.3
 ```
 
 Note: Pick the right Spark version (here 3.3) depending on your PySpark version.
@@ -86,7 +86,7 @@ from pyspark.sql import SparkSession
 
 spark = SparkSession \
     .builder \
-    .config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.3") \
+    .config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.8.0-3.3") \
     .getOrCreate()
 ```
 
@@ -97,7 +97,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depe
 Launch the Python Spark REPL with the Spark Extension dependency (version ≥1.1.0) as follows:
 
 ```shell script
-pyspark --packages uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.3
+pyspark --packages uk.co.gresearch.spark:spark-extension_2.12:2.8.0-3.3
 ```
 
 Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depending on your PySpark version.
@@ -107,7 +107,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depe
 Run your Python scripts that use PySpark via `spark-submit`:
 
 ```shell script
-spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.3 [script.py]
+spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.8.0-3.3 [script.py]
 ```
 
 Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depending on your Spark version.
@@ -117,7 +117,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depe
 There are plenty of [Data Science notebooks](https://datasciencenotebook.org/) around. To use this library,
 add **a jar dependency** to your notebook using these **Maven coordinates**:
 
-uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.3
+uk.co.gresearch.spark:spark-extension_2.12:2.8.0-3.3
 
 Or [download the jar](https://mvnrepository.com/artifact/uk.co.gresearch.spark/spark-extension) and place it
 on a filesystem where it is accessible by the notebook, and reference that jar file directly.

python/setup.py

Lines changed: 1 addition & 1 deletion
@@ -17,7 +17,7 @@
 from pathlib import Path
 from setuptools import setup
 
-jar_version = '2.8.0-3.4-SNAPSHOT'
+jar_version = '2.8.0-3.4'
 scala_version = '2.13.8'
 scala_compat_version = '.'.join(scala_version.split('.')[:2])
 spark_compat_version = jar_version.split('-')[1]
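For context, the two `split` expressions in the `setup.py` hunk above are what make dropping the `-SNAPSHOT` suffix necessary before release: the Spark compat version is taken from the segment after the first `-`. A runnable sketch with the released values copied from the diff:

```python
# Values as of the 2.8.0 release (post-change side of the hunk above).
jar_version = '2.8.0-3.4'   # was '2.8.0-3.4-SNAPSHOT' before this commit
scala_version = '2.13.8'

# '2.13.8' -> '2.13': keep only the major.minor components
scala_compat_version = '.'.join(scala_version.split('.')[:2])

# '2.8.0-3.4' -> '3.4': the Spark compat version follows the first '-'
spark_compat_version = jar_version.split('-')[1]

print(scala_compat_version)  # 2.13
print(spark_compat_version)  # 3.4
```

Note that with the pre-release value `'2.8.0-3.4-SNAPSHOT'`, `split('-')[1]` would still yield `'3.4'`, since only the segment between the first and second `-` is taken.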
