Commit 22118f9

[SPARK-50987][DOCS] Make `spark-connect-overview.md`'s version strings up-to-date

This PR aims to make `spark-connect-overview.md`'s version strings up-to-date.

- https://apache.github.io/spark/spark-connect-overview.html

**BEFORE**
<img width="477" alt="Screenshot 2025-01-24 at 11 17 03 PM" src="https://github.com/user-attachments/assets/4ee91119-e116-4573-8446-32bf18342ac5" />

**AFTER**
<img width="318" alt="Screenshot 2025-01-24 at 11 17 22 PM" src="https://github.com/user-attachments/assets/9e3f9061-6623-440f-8031-5ee85666675c" />

**BEFORE**
<img width="546" alt="Screenshot 2025-01-24 at 11 17 58 PM" src="https://github.com/user-attachments/assets/dc3ac80b-a5fc-4ea2-bf1d-4025a6ae204f" />

**AFTER**
<img width="552" alt="Screenshot 2025-01-24 at 11 18 35 PM" src="https://github.com/user-attachments/assets/8c5fe8cf-a8b1-4933-a593-3037f356c81a" />

**BEFORE**
<img width="679" alt="Screenshot 2025-01-24 at 11 21 33 PM" src="https://github.com/user-attachments/assets/d4d69efe-2fb4-43ea-be13-0d1bbe251b2c" />

**AFTER**
<img width="674" alt="Screenshot 2025-01-24 at 11 22 29 PM" src="https://github.com/user-attachments/assets/09a413fe-3659-4bba-b37c-609f2d6f16ba" />

This keeps the document up-to-date for Apache Spark 3.5.5/4.0.0+.

Manual review. No

Closes #49665 from dongjoon-hyun/SPARK-50987.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
(cherry picked from commit 21f0512)
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
Parent: 3e01d41

1 file changed: +7 −7 lines


docs/spark-connect-overview.md

Lines changed: 7 additions & 7 deletions
```diff
@@ -279,11 +279,11 @@ The connection may also be programmatically created using _SparkSession#builder_
 
 <div data-lang="python" markdown="1">
 
-First, install PySpark with `pip install pyspark[connect]==3.5.0` or if building a packaged PySpark application/library,
+First, install PySpark with `pip install pyspark[connect]=={{site.SPARK_VERSION_SHORT}}` or if building a packaged PySpark application/library,
 add it your setup.py file as:
 {% highlight python %}
 install_requires=[
-  'pyspark[connect]==3.5.0'
+  'pyspark[connect]=={{site.SPARK_VERSION_SHORT}}'
 ]
 {% endhighlight %}
```
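For context, the Spark docs are built with Jekyll, so the `{{site.SPARK_VERSION_SHORT}}` placeholder introduced above is resolved at build time from the site configuration, which is what keeps the rendered install command current without further edits. A minimal sketch of that Liquid-style substitution (the version value below is illustrative, not taken from this commit):

```python
# Minimal sketch of the Liquid-style substitution Jekyll performs when the
# docs are built: `{{site.SPARK_VERSION_SHORT}}` is replaced with the value
# configured for the site. The version value here is illustrative only.
def render(template: str, site_vars: dict) -> str:
    """Replace {{site.KEY}} placeholders with values from site_vars."""
    for key, value in site_vars.items():
        template = template.replace("{{site.%s}}" % key, value)
    return template

site = {"SPARK_VERSION_SHORT": "3.5.5"}  # normally read from the docs config
line = "pip install pyspark[connect]=={{site.SPARK_VERSION_SHORT}}"
print(render(line, site))  # -> pip install pyspark[connect]==3.5.5
```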
```diff
@@ -330,8 +330,8 @@ Lines with a: 72, lines with b: 39
 To use Spark Connect as part of a Scala application/project, we first need to include the right dependencies.
 Using the `sbt` build system as an example, we add the following dependencies to the `build.sbt` file:
 {% highlight sbt %}
-libraryDependencies += "org.apache.spark" %% "spark-sql-api" % "3.5.0"
-libraryDependencies += "org.apache.spark" %% "spark-connect-client-jvm" % "3.5.0"
+libraryDependencies += "org.apache.spark" %% "spark-sql-api" % "{{site.SPARK_VERSION_SHORT}}"
+libraryDependencies += "org.apache.spark" %% "spark-connect-client-jvm" % "{{site.SPARK_VERSION_SHORT}}"
 {% endhighlight %}
 
 When writing your own code, include the `remote` function with a reference to
```
```diff
@@ -374,9 +374,9 @@ HTTP/2 interface allows for the use of authenticating proxies, which makes
 it possible to secure Spark Connect without having to implement authentication
 logic in Spark directly.
 
-# What is supported in Spark 3.4
+# What is supported
 
-**PySpark**: In Spark 3.4, Spark Connect supports most PySpark APIs, including
+**PySpark**: Since Spark 3.4, Spark Connect supports most PySpark APIs, including
 [DataFrame](api/python/reference/pyspark.sql/dataframe.html),
 [Functions](api/python/reference/pyspark.sql/functions.html), and
 [Column](api/python/reference/pyspark.sql/column.html). However,
```
```diff
@@ -387,7 +387,7 @@ supported in the [API reference](api/python/reference/index.html) documentation.
 Supported APIs are labeled "Supports Spark Connect" so you can check whether the
 APIs you are using are available before migrating existing code to Spark Connect.
 
-**Scala**: In Spark 3.5, Spark Connect supports most Scala APIs, including
+**Scala**: Since Spark 3.5, Spark Connect supports most Scala APIs, including
 [Dataset](api/scala/org/apache/spark/sql/Dataset.html),
 [functions](api/scala/org/apache/spark/sql/functions$.html),
 [Column](api/scala/org/apache/spark/sql/Column.html),
```

0 commit comments