Commit 103681d

[3.3][Docs] Update docs for 3.3.1 release (#4293)
1 parent 1e133be commit 103681d

2 files changed: +11 −11 lines


docs/source/delta-storage.md

Lines changed: 3 additions & 3 deletions
````diff
@@ -70,7 +70,7 @@ This section explains how to quickly start reading and writing Delta tables on S
 
 ```bash
 bin/spark-shell \
-  --packages io.delta:delta-spark_2.12:3.3.0,org.apache.hadoop:hadoop-aws:3.3.4 \
+  --packages io.delta:delta-spark_2.12:3.3.1,org.apache.hadoop:hadoop-aws:3.3.4 \
   --conf spark.hadoop.fs.s3a.access.key=<your-s3-access-key> \
   --conf spark.hadoop.fs.s3a.secret.key=<your-s3-secret-key>
 ```
@@ -91,7 +91,7 @@ For efficient listing of <Delta> metadata files on S3, set the configuration `de
 
 ```scala
 bin/spark-shell \
-  --packages io.delta:delta-spark_2.12:3.3.0,org.apache.hadoop:hadoop-aws:3.3.4 \
+  --packages io.delta:delta-spark_2.12:3.3.1,org.apache.hadoop:hadoop-aws:3.3.4 \
   --conf spark.hadoop.fs.s3a.access.key=<your-s3-access-key> \
   --conf spark.hadoop.fs.s3a.secret.key=<your-s3-secret-key> \
   --conf "spark.hadoop.delta.enableFastS3AListFrom=true
@@ -142,7 +142,7 @@ This section explains how to quickly start reading and writing Delta tables on S
 
 ```bash
 bin/spark-shell \
-  --packages io.delta:delta-spark_2.12:3,org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-storage-s3-dynamodb:3.3.0 \
+  --packages io.delta:delta-spark_2.12:3,org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-storage-s3-dynamodb:3.3.1 \
   --conf spark.hadoop.fs.s3a.access.key=<your-s3-access-key> \
   --conf spark.hadoop.fs.s3a.secret.key=<your-s3-secret-key> \
   --conf spark.delta.logStore.s3a.impl=io.delta.storage.S3DynamoDBLogStore \
````
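Every hunk in this file changes only the version field of a Maven coordinate inside the `--packages` flag, which takes comma-separated `groupId:artifactId:version` triples. As an illustration of how those strings are composed (the `packages_arg` helper is hypothetical, not part of any Delta or Spark tooling; the artifact names and versions come from the hunks above):

```python
# Hypothetical helper showing how the --packages values in the diff are built.
# The Maven coordinates are real; this function is illustrative only.
def packages_arg(delta_version="3.3.1", hadoop_aws_version="3.3.4",
                 multi_cluster=False):
    """Build the comma-separated coordinate list for spark-shell --packages."""
    pkgs = [
        f"io.delta:delta-spark_2.12:{delta_version}",
        f"org.apache.hadoop:hadoop-aws:{hadoop_aws_version}",
    ]
    if multi_cluster:
        # The S3 multi-cluster setup additionally loads the DynamoDB LogStore.
        pkgs.append(f"io.delta:delta-storage-s3-dynamodb:{delta_version}")
    return ",".join(pkgs)

print(packages_arg())
# → io.delta:delta-spark_2.12:3.3.1,org.apache.hadoop:hadoop-aws:3.3.4
```

This is why a patch release touches so many lines: the version appears once per coordinate, per snippet, across both docs pages.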

docs/source/quick-start.md

Lines changed: 8 additions & 8 deletions
````diff
@@ -18,13 +18,13 @@ Follow these instructions to set up <Delta> with Spark. You can run the steps in
 
 #. Run as a project: Set up a Maven or SBT project (Scala or Java) with <Delta>, copy the code snippets into a source file, and run the project. Alternatively, you can use the [examples provided in the Github repository](https://github.com/delta-io/delta/tree/master/examples).
 
-.. important:: For all of the following instructions, make sure to install the correct version of Spark or PySpark that is compatible with <Delta> `3.3.0`. See the [release compatibility matrix](releases.md) for details.
+.. important:: For all of the following instructions, make sure to install the correct version of Spark or PySpark that is compatible with <Delta> `3.3.1`. See the [release compatibility matrix](releases.md) for details.
 
 ### Prerequisite: set up Java
 
 As mentioned in the official <AS> installation instructions [here](https://spark.apache.org/docs/latest/index.html#downloading), make sure you have a valid Java version installed (8, 11, or 17) and that Java is configured correctly on your system using either the system `PATH` or `JAVA_HOME` environmental variable.
 
-Windows users should follow the instructions in this [blog](https://phoenixnap.com/kb/install-spark-on-windows-10), making sure to use the correct version of <AS> that is compatible with <Delta> `3.3.0`.
+Windows users should follow the instructions in this [blog](https://phoenixnap.com/kb/install-spark-on-windows-10), making sure to use the correct version of <AS> that is compatible with <Delta> `3.3.1`.
 
 ### Set up interactive shell
 
@@ -35,7 +35,7 @@ To use <Delta> interactively within the Spark SQL, Scala, or Python shell, you n
 Download the [compatible version](releases.md) of <AS> by following instructions from [Downloading Spark](https://spark.apache.org/downloads.html), either using `pip` or by downloading and extracting the archive and running `spark-sql` in the extracted directory.
 
 ```bash
-bin/spark-sql --packages io.delta:delta-spark_2.12:3.3.0 --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
+bin/spark-sql --packages io.delta:delta-spark_2.12:3.3.1 --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
 ```
 
 #### PySpark Shell
@@ -49,15 +49,15 @@ bin/spark-sql --packages io.delta:delta-spark_2.12:3.3.0 --conf "spark.sql.exten
 #. Run PySpark with the <Delta> package and additional configurations:
 
 ```bash
-pyspark --packages io.delta:delta-spark_2.12:3.3.0 --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
+pyspark --packages io.delta:delta-spark_2.12:3.3.1 --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
 ```
 
 #### Spark Scala Shell
 
 Download the [compatible version](releases.md) of <AS> by following instructions from [Downloading Spark](https://spark.apache.org/downloads.html), either using `pip` or by downloading and extracting the archive and running `spark-shell` in the extracted directory.
 
 ```bash
-bin/spark-shell --packages io.delta:delta-spark_2.12:3.3.0 --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
+bin/spark-shell --packages io.delta:delta-spark_2.12:3.3.1 --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
 ```
 
 ### Set up project
@@ -72,7 +72,7 @@ You include <Delta> in your Maven project by adding it as a dependency in your P
 <dependency>
   <groupId>io.delta</groupId>
   <artifactId>delta-spark_2.12</artifactId>
-  <version>3.3.0</version>
+  <version>3.3.1</version>
 </dependency>
 ```
 
@@ -81,12 +81,12 @@ You include <Delta> in your Maven project by adding it as a dependency in your P
 You include <Delta> in your SBT project by adding the following line to your `build.sbt` file:
 
 ```scala
-libraryDependencies += "io.delta" %% "delta-spark" % "3.3.0"
+libraryDependencies += "io.delta" %% "delta-spark" % "3.3.1"
 ```
 
 #### Python
 
-To set up a Python project (for example, for unit testing), you can install <Delta> using `pip install delta-spark==3.3.0` and then configure the SparkSession with the `configure_spark_with_delta_pip()` utility function in <Delta>.
+To set up a Python project (for example, for unit testing), you can install <Delta> using `pip install delta-spark==3.3.1` and then configure the SparkSession with the `configure_spark_with_delta_pip()` utility function in <Delta>.
 
 ```python
 import pyspark
````
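The diff context for the final Python hunk cuts off after `import pyspark`. For the pattern that hunk refers to, here is a minimal sketch, assuming `delta-spark==3.3.1` has been installed with pip. Only `configure_spark_with_delta_pip()` and the two config keys come from the diff; the `DELTA_CONFS` name and the `build_session` helper are illustrative:

```python
# Sketch of the quick-start Python setup described above; assumes
# `pip install delta-spark==3.3.1`. Names below marked illustrative are not
# part of Delta's API.
DELTA_CONFS = {
    # The same two settings passed via --conf in the shell commands earlier.
    "spark.sql.extensions": "io.delta.sql.DeltaSparkSessionExtension",
    "spark.sql.catalog.spark_catalog":
        "org.apache.spark.sql.delta.catalog.DeltaCatalog",
}

def build_session():
    """Illustrative helper: create a Delta-enabled SparkSession."""
    import pyspark
    from delta import configure_spark_with_delta_pip

    builder = pyspark.sql.SparkSession.builder.appName("quickstart")
    for key, value in DELTA_CONFS.items():
        builder = builder.config(key, value)
    # configure_spark_with_delta_pip() adds the delta-spark Maven coordinates
    # matching the pip-installed package, keeping the JVM and Python sides in sync.
    return configure_spark_with_delta_pip(builder).getOrCreate()
```

Pinning the pip package and letting `configure_spark_with_delta_pip()` choose the JARs avoids the version skew the release-note hedge in the hunk above warns about.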
