
Commit f81aba2

[SPARK-51718] Update README.md with Spark 4.0.0 RC3
### What changes were proposed in this pull request?

This PR aims to make `README.md` up-to-date with the following.

- Apache Spark 4.0.0 RC3
- New APIs like `filter`, `cache`, `read`, `write`, `mode`, `orc`

### Why are the changes needed?

To provide more examples.

### Does this PR introduce _any_ user-facing change?

No, this is a documentation-only change.

### How was this patch tested?

Manual review. Also, the newly updated example is testable in the following repository.

- https://github.com/dongjoon-hyun/spark-connect-swift-app

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #41 from dongjoon-hyun/SPARK-51718.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
1 parent 6123c8b commit f81aba2

File tree

1 file changed: +16, -4 lines


README.md

Lines changed: 16 additions & 4 deletions
````diff
@@ -6,10 +6,10 @@

 This is an experimental Swift library to show how to connect to a remote Apache Spark Connect Server and run SQL statements to manipulate remote data.

-So far, this library project is tracking the upstream changes like the [Apache Spark](https://spark.apache.org) 4.0.0 RC2 release and [Apache Arrow](https://arrow.apache.org) project's Swift-support.
+So far, this library project is tracking the upstream changes like the [Apache Spark](https://spark.apache.org) 4.0.0 RC3 release and [Apache Arrow](https://arrow.apache.org) project's Swift-support.

 ## Requirement
-- [Apache Spark 4.0.0 RC2 (March 2025)](https://dist.apache.org/repos/dist/dev/spark/v4.0.0-rc2-bin/)
+- [Apache Spark 4.0.0 RC3 (March 2025)](https://dist.apache.org/repos/dist/dev/spark/v4.0.0-rc3-bin/)
 - [Swift 6.0 (2024)](https://swift.org)
 - [gRPC Swift 2.1 (March 2025)](https://github.com/grpc/grpc-swift/releases/tag/2.1.2)
 - [gRPC Swift Protobuf 1.1 (March 2025)](https://github.com/grpc/grpc-swift-protobuf/releases/tag/1.1.0)
@@ -59,7 +59,7 @@ print("Connected to Apache Spark \(await spark.version) Server")

 let statements = [
   "DROP TABLE IF EXISTS t",
-  "CREATE TABLE IF NOT EXISTS t(a INT)",
+  "CREATE TABLE IF NOT EXISTS t(a INT) USING ORC",
   "INSERT INTO t VALUES (1), (2), (3)",
 ]
@@ -68,7 +68,10 @@ for s in statements {
   _ = try await spark.sql(s).count()
 }
 print("SELECT * FROM t")
-try await spark.sql("SELECT * FROM t").show()
+try await spark.sql("SELECT * FROM t").cache().show()
+
+try await spark.range(10).filter("id % 2 == 0").write.mode("overwrite").orc("/tmp/orc")
+try await spark.read.orc("/tmp/orc").show()

 await spark.stop()
 ```
@@ -90,6 +93,15 @@ SELECT * FROM t
 | 1 |
 | 3 |
 +---+
++----+
+| id |
++----+
+| 2 |
+| 6 |
+| 0 |
+| 8 |
+| 4 |
++----+
 ```

 You can find this example in the following repository.
````
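Putting the additions from this commit together, the new `cache`, `filter`, `write.mode(...).orc(...)`, and `read.orc(...)` APIs compose into a short end-to-end flow. The sketch below follows the README example; it assumes a Spark Connect server is already running and reachable by the `SparkSession` builder, so it is an illustration rather than a standalone program.

```swift
import SparkConnect

// Sketch of the APIs added in this commit, assuming a live Spark
// Connect server (e.g. started via `sbin/start-connect-server.sh`).
let spark = try await SparkSession.builder.getOrCreate()

// Cache the query result before displaying it.
try await spark.sql("SELECT * FROM t").cache().show()

// Keep only even ids, write them as ORC, then read the files back.
try await spark.range(10)
  .filter("id % 2 == 0")
  .write.mode("overwrite")
  .orc("/tmp/orc")
try await spark.read.orc("/tmp/orc").show()

await spark.stop()
```

Note that `mode("overwrite")` lets the write succeed on repeated runs; without it, writing to an existing `/tmp/orc` path would fail.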
