Commit 5c50f68

[SPARK-26811][SQL][FOLLOWUP] fix some documentation
## What changes were proposed in this pull request?

This is a followup of apache#24012, to fix two documentation issues:

1. `SupportsRead` and `SupportsWrite` are no longer internal. They are public interfaces now.
2. `Scan` should link to `BATCH_READ` instead of hardcoding it.

## How was this patch tested?

N/A

Closes apache#24285 from cloud-fan/doc.

Authored-by: Wenchen Fan <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
1 parent: 6c4552c

File tree: 3 files changed, +7 −5 lines
sql/core/src/main/java/org/apache/spark/sql/sources/v2/SupportsRead.java (1 addition, 1 deletion)

@@ -22,7 +22,7 @@
 import org.apache.spark.sql.util.CaseInsensitiveStringMap;

 /**
- * An internal base interface of mix-in interfaces for readable {@link Table}. This adds
+ * A mix-in interface of {@link Table}, to indicate that it's readable. This adds
  * {@link #newScanBuilder(CaseInsensitiveStringMap)} that is used to create a scan for batch,
  * micro-batch, or continuous processing.
  */

sql/core/src/main/java/org/apache/spark/sql/sources/v2/SupportsWrite.java (1 addition, 1 deletion)

@@ -22,7 +22,7 @@
 import org.apache.spark.sql.util.CaseInsensitiveStringMap;

 /**
- * An internal base interface of mix-in interfaces for writable {@link Table}. This adds
+ * A mix-in interface of {@link Table}, to indicate that it's writable. This adds
  * {@link #newWriteBuilder(CaseInsensitiveStringMap)} that is used to create a write
  * for batch or streaming.
  */

sql/core/src/main/java/org/apache/spark/sql/sources/v2/reader/Scan.java (5 additions, 3 deletions)

@@ -24,6 +24,7 @@
 import org.apache.spark.sql.sources.v2.SupportsContinuousRead;
 import org.apache.spark.sql.sources.v2.SupportsMicroBatchRead;
 import org.apache.spark.sql.sources.v2.Table;
+import org.apache.spark.sql.sources.v2.TableCapability;

 /**
  * A logical representation of a data source scan. This interface is used to provide logical

@@ -32,8 +33,8 @@
  * This logical representation is shared between batch scan, micro-batch streaming scan and
  * continuous streaming scan. Data sources must implement the corresponding methods in this
  * interface, to match what the table promises to support. For example, {@link #toBatch()} must be
- * implemented, if the {@link Table} that creates this {@link Scan} returns BATCH_READ support in
- * its {@link Table#capabilities()}.
+ * implemented, if the {@link Table} that creates this {@link Scan} returns
+ * {@link TableCapability#BATCH_READ} support in its {@link Table#capabilities()}.
  * </p>
  */
 @Evolving

@@ -61,7 +62,8 @@ default String description() {
 /**
  * Returns the physical representation of this scan for batch query. By default this method throws
  * exception, data sources must overwrite this method to provide an implementation, if the
- * {@link Table} that creates this returns batch read support in its {@link Table#capabilities()}.
+ * {@link Table} that creates this scan returns {@link TableCapability#BATCH_READ} in its
+ * {@link Table#capabilities()}.
  *
  * @throws UnsupportedOperationException
  */
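The contract the corrected Javadoc describes is: a `Table` advertises `BATCH_READ` in its `capabilities()`, and the `Scan` it creates must then override `toBatch()`, which otherwise throws by default. The sketch below illustrates this pattern with simplified stand-in types (hypothetical names, not the actual Spark DSv2 interfaces, which live in `spark-sql` and carry more methods):

```java
import java.util.EnumSet;
import java.util.Set;

public class CapabilitySketch {
    // Stand-in for org.apache.spark.sql.sources.v2.TableCapability (simplified).
    enum TableCapability { BATCH_READ, BATCH_WRITE }

    // Stand-in for the Scan contract: toBatch() throws by default,
    // mirroring the behavior the corrected Javadoc documents.
    interface Scan {
        default String toBatch() {
            throw new UnsupportedOperationException("BATCH_READ not supported");
        }
    }

    // Stand-in for Table: declares what the table promises to support.
    interface Table {
        Set<TableCapability> capabilities();
    }

    // Stand-in for the SupportsRead mix-in: adds a way to create a Scan.
    interface SupportsRead extends Table {
        Scan newScanBuilder();
    }

    // A table that promises BATCH_READ, so its Scan must override toBatch().
    static class MyTable implements SupportsRead {
        public Set<TableCapability> capabilities() {
            return EnumSet.of(TableCapability.BATCH_READ);
        }
        public Scan newScanBuilder() {
            return new Scan() {
                @Override
                public String toBatch() { return "batch-scan"; }
            };
        }
    }

    public static void main(String[] args) {
        MyTable table = new MyTable();
        // Only call toBatch() when the table actually advertises BATCH_READ.
        if (table.capabilities().contains(TableCapability.BATCH_READ)) {
            System.out.println(table.newScanBuilder().toBatch());
        }
    }
}
```

The point of the doc fix is exactly this coupling: the capability advertised by `Table#capabilities()` determines which `Scan` methods must be overridden, so the Javadoc should link `TableCapability#BATCH_READ` rather than mention it as loose text.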
