@@ -284,11 +284,11 @@ The connection may also be programmatically created using _SparkSession#builder_

<div data-lang="python" markdown="1">

-First, install PySpark with `pip install pyspark[connect]==3.5.0` or if building a packaged PySpark application/library,
+First, install PySpark with `pip install pyspark[connect]=={{site.SPARK_VERSION_SHORT}}` or if building a packaged PySpark application/library,
add it to your setup.py file as:
{% highlight python %}
install_requires=[
-  'pyspark[connect]==3.5.0'
+  'pyspark[connect]=={{site.SPARK_VERSION_SHORT}}'
]
{% endhighlight %}

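Once PySpark is installed with the `connect` extra, the connection can be exercised from a short script. The following is a minimal sketch, assuming a Spark Connect server is already listening at `sc://localhost` (for example one started with `./sbin/start-connect-server.sh`):

{% highlight python %}
from pyspark.sql import SparkSession

# "sc://localhost" is an assumed endpoint; replace it with the address of
# your own Spark Connect server.
spark = SparkSession.builder.remote("sc://localhost").getOrCreate()

# Run a trivial query through the remote session to verify the connection.
spark.range(5).show()
{% endhighlight %}
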
@@ -335,7 +335,7 @@ Lines with a: 72, lines with b: 39
To use Spark Connect as part of a Scala application/project, we first need to include the right dependencies.
Using the `sbt` build system as an example, we add the following dependencies to the `build.sbt` file:
{% highlight sbt %}
-libraryDependencies += "org.apache.spark" %% "spark-connect-client-jvm" % "3.5.0"
+libraryDependencies += "org.apache.spark" %% "spark-connect-client-jvm" % "{{site.SPARK_VERSION_SHORT}}"
{% endhighlight %}

When writing your own code, include the `remote` function with a reference to
@@ -380,9 +380,9 @@ HTTP/2 interface allows for the use of authenticating proxies, which makes
it possible to secure Spark Connect without having to implement authentication
logic in Spark directly.

-# What is supported in Spark 3.4
+# What is supported

-**PySpark**: In Spark 3.4, Spark Connect supports most PySpark APIs, including
+**PySpark**: Since Spark 3.4, Spark Connect supports most PySpark APIs, including
[DataFrame](api/python/reference/pyspark.sql/dataframe.html),
[Functions](api/python/reference/pyspark.sql/functions.html), and
[Column](api/python/reference/pyspark.sql/column.html). However,
@@ -393,7 +393,7 @@ supported in the [API reference](api/python/reference/index.html) documentation.
Supported APIs are labeled "Supports Spark Connect" so you can check whether the
APIs you are using are available before migrating existing code to Spark Connect.

-**Scala**: In Spark 3.5, Spark Connect supports most Scala APIs, including
+**Scala**: Since Spark 3.5, Spark Connect supports most Scala APIs, including
[Dataset](api/scala/org/apache/spark/sql/Dataset.html),
[functions](api/scala/org/apache/spark/sql/functions$.html),
[Column](api/scala/org/apache/spark/sql/Column.html),