Commit 47b6a5f

[SPARK-51736] Make SparkConnectError and StorageLevel fields public
### What changes were proposed in this pull request?

This PR aims to change the visibility of the following:

- `enum SparkConnectError`
- `StorageLevel` fields

In addition, this PR changes the following test suites to use only `public` APIs of `SparkConnect` by changing the import statement. This will help us validate the visibility change easily.

- `DataFrameTests`
- `DataFrameReaderTests`
- `DataFrameWriterTests`
- `SQLTests`

### Why are the changes needed?

To allow users to use `SparkConnectError` and `StorageLevel`.

### Does this PR introduce _any_ user-facing change?

No. This is a visibility change on the unreleased versions.

### How was this patch tested?

Pass the CIs and manual review.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #45 from dongjoon-hyun/SPARK-51736.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
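To illustrate the kind of downstream code this change enables, here is a hedged, self-contained sketch: it mirrors the now-`public` `SparkConnectError` enum locally (rather than importing the real package), plus a hypothetical `handle` helper showing that client code can only pattern-match on the error type once it is `public`.

```swift
// Local mirror of the enum this commit makes public (not the real package).
public enum SparkConnectError: Error {
  case UnsupportedOperationException
  case InvalidSessionIDException
  case InvalidTypeException
}

// Hypothetical helper: maps a thrown error to a short label. With the enum
// internal, an importing module could not even name the type in a catch.
func handle(_ work: () throws -> Void) -> String {
  do {
    try work()
    return "ok"
  } catch SparkConnectError.UnsupportedOperationException {
    return "unsupported"
  } catch {
    return "other"
  }
}

print(handle { throw SparkConnectError.UnsupportedOperationException })  // unsupported
print(handle { })                                                        // ok
```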
1 parent ca07010 commit 47b6a5f

File tree

6 files changed: +10 −10 lines changed


Sources/SparkConnect/SparkConnectError.swift

Lines changed: 1 addition & 1 deletion
```diff
@@ -18,7 +18,7 @@
 //

 /// A enum for ``SparkConnect`` package errors
-enum SparkConnectError: Error {
+public enum SparkConnectError: Error {
   case UnsupportedOperationException
   case InvalidSessionIDException
   case InvalidTypeException
```

Sources/SparkConnect/StorageLevel.swift

Lines changed: 5 additions & 5 deletions
```diff
@@ -23,19 +23,19 @@
 /// to replicate the `RDD` partitions on multiple nodes.
 public struct StorageLevel: Sendable {
   /// Whether the cache should use disk or not.
-  var useDisk: Bool
+  public var useDisk: Bool

   /// Whether the cache should use memory or not.
-  var useMemory: Bool
+  public var useMemory: Bool

   /// Whether the cache should use off-heap or not.
-  var useOffHeap: Bool
+  public var useOffHeap: Bool

   /// Whether the cached data is deserialized or not.
-  var deserialized: Bool
+  public var deserialized: Bool

   /// The number of replicas.
-  var replication: Int32
+  public var replication: Int32

   init(useDisk: Bool, useMemory: Bool, useOffHeap: Bool, deserialized: Bool, replication: Int32 = 1)
   {
```
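Note that only the stored properties become `public` here; the initializer shown at the end of the hunk stays internal. A hedged, self-contained sketch of the effect (a local mirror of the struct, not the real `SparkConnect` type): callers can now read and mutate the fields of a level they obtain.

```swift
// Local mirror of StorageLevel's shape for illustration only.
public struct StorageLevel: Sendable {
  public var useDisk: Bool
  public var useMemory: Bool
  public var useOffHeap: Bool
  public var deserialized: Bool
  public var replication: Int32

  // The real initializer remains internal after this commit; shown here so
  // the sketch is runnable in one file.
  init(useDisk: Bool, useMemory: Bool, useOffHeap: Bool,
       deserialized: Bool, replication: Int32 = 1) {
    self.useDisk = useDisk
    self.useMemory = useMemory
    self.useOffHeap = useOffHeap
    self.deserialized = deserialized
    self.replication = replication
  }
}

// Public fields let callers inspect a level and, e.g., bump replication.
var level = StorageLevel(useDisk: true, useMemory: true,
                         useOffHeap: false, deserialized: false)
level.replication = 2
print(level.useDisk, level.replication)  // true 2
```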

Tests/SparkConnectTests/DataFrameReaderTests.swift

Lines changed: 1 addition & 1 deletion
```diff
@@ -20,7 +20,7 @@
 import Foundation
 import Testing

-@testable import SparkConnect
+import SparkConnect

 /// A test suite for `DataFrameReader`
 struct DataFrameReaderTests {
```

Tests/SparkConnectTests/DataFrameTests.swift

Lines changed: 1 addition & 1 deletion
```diff
@@ -19,7 +19,7 @@

 import Testing

-@testable import SparkConnect
+import SparkConnect

 /// A test suite for `DataFrame`
 struct DataFrameTests {
```

Tests/SparkConnectTests/DataFrameWriterTests.swift

Lines changed: 1 addition & 1 deletion
```diff
@@ -20,7 +20,7 @@
 import Foundation
 import Testing

-@testable import SparkConnect
+import SparkConnect

 /// A test suite for `DataFrameWriter`
 struct DataFrameWriterTests {
```

Tests/SparkConnectTests/SQLTests.swift

Lines changed: 1 addition & 1 deletion
```diff
@@ -20,7 +20,7 @@
 import Foundation
 import Testing

-@testable import SparkConnect
+import SparkConnect

 /// A test suite for various SQL statements.
 struct SQLTests {
```
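The reason dropping `@testable` validates the visibility change: `@testable import` elevates a module's `internal` declarations so tests can reach them, while a plain `import` exposes only `public` API. A module boundary cannot be shown in one file, so here is a hedged toy model of that rule (the `Visibility` enum and `reachable` function are illustrative, not part of SparkConnect):

```swift
// Toy model of Swift access control as seen from a test target.
enum Visibility {
  case publicAPI    // reachable via `import SparkConnect`
  case internalAPI  // reachable only via `@testable import SparkConnect`
}

func reachable(_ v: Visibility, testableImport: Bool) -> Bool {
  switch v {
  case .publicAPI: return true
  case .internalAPI: return testableImport
  }
}

print(reachable(.internalAPI, testableImport: false))  // false
print(reachable(.publicAPI, testableImport: false))    // true
```

If these test suites compile with a plain `import`, every API they touch must be `public`, which is exactly what the commit asserts.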
