### What changes were proposed in this pull request?
This PR aims to use `super-linter` for markdown files.
### Why are the changes needed?
For consistency.
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Pass the CIs with the newly added `linter` GitHub Action test.
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #173 from dongjoon-hyun/SPARK-52293.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
README.md (0 additions & 1 deletion)
@@ -117,4 +117,3 @@ SELECT * FROM t

You can find more complete examples including `Spark SQL REPL`, `Web Server` and `Streaming` applications in the [Examples](https://github.com/apache/spark-connect-swift/tree/main/Examples) directory.

This library also supports the `SPARK_REMOTE` environment variable to specify the [Spark Connect connection string](https://spark.apache.org/docs/latest/spark-connect-overview.html#set-sparkremote-environment-variable) in order to provide more options.
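
For illustration only (not part of this diff), here is a minimal sketch of how a client application might honor that setting; the fallback endpoint and the `remote(_:)` builder call are assumptions based on the project's examples, not something this change adds.

```swift
import Foundation
import SparkConnect

// Read the connection string from SPARK_REMOTE, falling back to an explicit
// value. The fallback endpoint and the `remote(_:)`/`version` calls are
// assumptions for this sketch, not taken from this change.
let remote = ProcessInfo.processInfo.environment["SPARK_REMOTE"] ?? "sc://localhost:15002"
let spark = try await SparkSession.builder.remote(remote).getOrCreate()
print("Connected to Apache Spark \(await spark.version) Server")
await spark.stop()
```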
The basic application example demonstrates fundamental operations with Apache Spark Connect, including:

- Connecting to a Spark server
- Creating and manipulating tables with SQL
- Using DataFrame operations
- Reading and writing data in the ORC format

### Key Features

- SQL execution for table operations
- DataFrame transformations with filter operations
- Data persistence with ORC format
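
As orientation for readers who have not opened the Examples directory, a condensed sketch of what such an application might look like; the table name, the output path, and the exact DataFrame calls (`range`, `filter`, `write.mode(...).orc(...)`) are assumptions drawn from the project's documentation rather than from this diff.

```swift
import SparkConnect

@main
struct BasicExample {
  static func main() async throws {
    let spark = try await SparkSession.builder.getOrCreate()

    // Create and populate a table with SQL (table name `t` is illustrative).
    for statement in [
      "DROP TABLE IF EXISTS t",
      "CREATE TABLE IF NOT EXISTS t(a INT) USING ORC",
      "INSERT INTO t VALUES (1), (2), (3)",
    ] {
      _ = try await spark.sql(statement).count()
    }
    try await spark.sql("SELECT * FROM t").show()

    // DataFrame transformation plus an ORC round-trip (API assumed from the examples).
    try await spark.range(10).filter("id % 2 == 0")
      .write.mode("overwrite").orc("/tmp/example_orc")
    try await spark.read.orc("/tmp/example_orc").show()

    await spark.stop()
  }
}
```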
@@ -40,11 +42,13 @@ swift run

## Spark SQL REPL (Read-Eval-Print Loop) Example

The Spark SQL REPL application example demonstrates interactive execution of ad-hoc Spark SQL queries with Apache Spark Connect, including:

- Connecting to a Spark server
- Receiving ad-hoc Spark SQL queries from users
- Showing the SQL results interactively

### Key Features

- Spark SQL execution for table operations
- User interactions
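
A rough sketch of the interaction loop such a REPL might use, assuming the same `SparkSession`/`sql(...).show()` API as above; the prompt text and exit condition are illustrative, not the example's actual code.

```swift
import SparkConnect

@main
struct SQLREPL {
  static func main() async throws {
    let spark = try await SparkSession.builder.getOrCreate()
    print("Connected. Enter Spark SQL statements (empty line to quit).")
    while true {
      print("spark-sql> ", terminator: "")
      guard let query = readLine(), !query.isEmpty else { break }
      do {
        // Execute the ad-hoc query and show the result interactively.
        try await spark.sql(query).show()
      } catch {
        print("ERROR: \(error)")
      }
    }
    await spark.stop()
  }
}
```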
@@ -66,6 +70,7 @@ swift run

The Pi calculation example shows how to use Spark Connect Swift for computational tasks by calculating an approximation of π (pi) using the Monte Carlo method.

### Key Features

- Command-line argument handling
- Mathematical computations with Spark
- Random number generation
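
To make the method concrete, a sketch of how the estimate could be phrased through Spark SQL; the argument handling and the `range()`/`rand()` query are an assumed formulation, not the example's actual code.

```swift
import SparkConnect

// Monte Carlo estimate of pi: sample points in the unit square via Spark SQL
// and count how many land inside the unit circle (sketch only).
let n = Int(CommandLine.arguments.dropFirst().first ?? "") ?? 1_000_000
let spark = try await SparkSession.builder.getOrCreate()
let inside = try await spark.sql("""
  SELECT * FROM range(\(n))
  WHERE pow(rand() * 2 - 1, 2) + pow(rand() * 2 - 1, 2) <= 1
  """).count()
print("Pi is roughly \(4.0 * Double(inside) / Double(n))")
await spark.stop()
```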
@@ -89,6 +94,7 @@ swift run

The streaming example demonstrates how to process streaming data using the Spark Connect Swift client, specifically for counting words from a network socket stream.

### Key Features

- Stream processing with Spark Connect
- Network socket data source
- Word counting with string operations
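
A heavily hedged sketch of such a word-count pipeline, written on the assumption that the Swift client mirrors the `readStream`/`writeStream` API of the other Spark Connect clients; none of these calls are taken from this diff, and the host, port, and console sink are placeholders.

```swift
import SparkConnect

// Word count over a socket stream. Assumes the Swift client mirrors the
// readStream/writeStream API of the other Spark Connect clients; the host,
// port, and console sink are illustrative only.
let spark = try await SparkSession.builder.getOrCreate()
let lines = try await spark.readStream
  .format("socket")
  .option("host", "localhost")
  .option("port", "9999")
  .load()
let counts = try await lines
  .selectExpr("explode(split(value, ' ')) AS word")
  .groupBy("word")
  .count()
let query = try await counts.writeStream
  .outputMode("complete")
  .format("console")
  .start()
try await query.awaitTermination()
```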
@@ -120,6 +126,7 @@ Type text into the Netcat terminal to see real-time word counting from `Spark Connect Swift`

The web application example showcases how to integrate Spark Connect Swift with a web server using the Vapor framework.

### Key Features

- HTTP server integration with Vapor
- REST API endpoints
- Spark session management within web requests
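
One possible shape for such an integration, shown only as a Vapor 4-style sketch; the route path, response text, and per-request session handling are illustrative choices, not the example's actual design.

```swift
import Vapor
import SparkConnect

// Minimal Vapor app with one route that opens a Spark Connect session,
// reads the server version, and returns it (illustrative design only).
let app = Application(try Environment.detect())
defer { app.shutdown() }

app.get("hello") { (req: Request) async throws -> String in
  let spark = try await SparkSession.builder.getOrCreate()
  let version = await spark.version
  await spark.stop()
  return "Hi, this is powered by the Apache Spark \(version)."
}

try app.run()
```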
@@ -153,6 +160,7 @@ Hi, this is powered by the Apache Spark 4.0.0.%

## Development Environment

All examples include:

- A Dockerfile for containerized execution
- A Package.swift file for Swift Package Manager configuration
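
For reference, a minimal sketch of what such a `Package.swift` might contain; the tools version, platform requirement, target name, and the `branch: "main"` dependency requirement are placeholders, not taken from the repository.

```swift
// swift-tools-version:6.0
import PackageDescription

let package = Package(
  name: "SparkConnectSwiftExample",   // placeholder name
  platforms: [.macOS(.v15)],          // assumption; check the repository for the real minimum
  dependencies: [
    // The branch requirement below is a placeholder, not taken from the examples.
    .package(url: "https://github.com/apache/spark-connect-swift.git", branch: "main")
  ],
  targets: [
    .executableTarget(
      name: "SparkConnectSwiftExample",
      dependencies: [.product(name: "SparkConnect", package: "spark-connect-swift")]
    )
  ]
)
```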