Commit 39d4a38

Authored by cj-zhukov and Sergey Zhukov

Consolidate external dependency examples (#18142) (#18747)
## Which issue does this PR close?

Part of #18142.

## Rationale for this change

This PR consolidates all the `external_dependency` examples (dataframe_to_s3, query_aws_s3) into a single example binary. We agreed on this pattern and can now apply it to the remaining examples.

---------

Co-authored-by: Sergey Zhukov <[email protected]>
1 parent (2db3aea) · commit 39d4a38

File tree: 5 files changed, +116 −24 lines
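Per the module docs added in the new `main.rs` below, the consolidated binary is selected with `--example external_dependency` plus a subcommand naming the sub-example, for instance:

```bash
# run one of the consolidated S3 examples (subcommands from the new main.rs)
cargo run --example external_dependency -- query_aws_s3
cargo run --example external_dependency -- dataframe_to_s3
```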

datafusion-examples/Cargo.toml

Lines changed: 0 additions & 8 deletions

```diff
@@ -35,14 +35,6 @@ rust-version = { workspace = true }
 [lints]
 workspace = true
 
-[[example]]
-name = "dataframe_to_s3"
-path = "examples/external_dependency/dataframe-to-s3.rs"
-
-[[example]]
-name = "query_aws_s3"
-path = "examples/external_dependency/query-aws-s3.rs"
-
 [dev-dependencies]
 arrow = { workspace = true }
 # arrow_schema is required for record_batch! macro :sad:
```
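Removing these `[[example]]` entries works because Cargo auto-discovers examples: a subdirectory of `examples/` containing a `main.rs` is built as a single example named after the directory. Assuming the layout introduced by this commit, the discovered structure looks like:

```text
datafusion-examples/examples/
└── external_dependency/    # auto-discovered as example "external_dependency"
    ├── main.rs             # entry point; dispatches on the subcommand
    ├── dataframe_to_s3.rs  # mod dataframe_to_s3
    └── query_aws_s3.rs     # mod query_aws_s3
```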

datafusion-examples/README.md

Lines changed: 3 additions & 3 deletions

```diff
@@ -60,8 +60,8 @@ cargo run --example dataframe -- dataframe
 - [`examples/custom_data_source/custom_datasource.rs`](examples/custom_data_source/custom_datasource.rs): Run queries against a custom datasource (TableProvider)
 - [`examples/custom_data_source/custom_file_casts.rs`](examples/custom_data_source/custom_file_casts.rs): Implement custom casting rules to adapt file schemas
 - [`examples/custom_data_source/custom_file_format.rs`](examples/custom_data_source/custom_file_format.rs): Write data to a custom file format
-- [`dataframe-to-s3.rs`](examples/external_dependency/dataframe-to-s3.rs): Run a query using a DataFrame against a parquet file from s3 and writing back to s3
-- [`examples/dataframe/dataframe.rs`](examples/dataframe/dataframe.rs): Run a query using a DataFrame API against parquet files, csv files, and in-memory data, including multiple subqueries. Also demonstrates the various methods to write out a DataFrame to a table, parquet file, csv file, and json file.
+- [`examples/external_dependency/dataframe_to_s3.rs`](examples/external_dependency/dataframe_to_s3.rs): Run a query using a DataFrame against a parquet file from s3 and writing back to s3
+- [`dataframe.rs`](examples/dataframe.rs): Run a query using a DataFrame API against parquet files, csv files, and in-memory data, including multiple subqueries. Also demonstrates the various methods to write out a DataFrame to a table, parquet file, csv file, and json file.
 - [`examples/builtin_functions/date_time`](examples/builtin_functions/date_time.rs): Examples of date-time related functions and queries
 - [`examples/custom_data_source/default_column_values.rs`](examples/custom_data_source/default_column_values.rs): Implement custom default value handling for missing columns using field metadata and PhysicalExprAdapter
 - [`examples/dataframe/deserialize_to_struct.rs`](examples/dataframe/deserialize_to_struct.rs): Convert query results (Arrow ArrayRefs) into Rust structs
@@ -83,7 +83,7 @@ cargo run --example dataframe -- dataframe
 - [`examples/query_planning/planner_api.rs`](examples/query_planning/planner_api.rs) APIs to manipulate logical and physical plans
 - [`examples/query_planning/pruning.rs`](examples/query_planning/pruning.rs): Use pruning to rule out files based on statistics
 - [`examples/query_planning/thread_pools.rs`](examples/query_planning/thread_pools.rs): Demonstrates TrackConsumersPool for memory tracking and debugging with enhanced error messages and shows how to implement memory-aware ExecutionPlan with memory reservation and spilling
-- [`query-aws-s3.rs`](examples/external_dependency/query-aws-s3.rs): Configure `object_store` and run a query against files stored in AWS S3
+- [`examples/external_dependency/query_aws_s3.rs`](examples/external_dependency/query_aws_s3.rs): Configure `object_store` and run a query against files stored in AWS S3
 - [`examples/data_io/query_http_csv.rs`](examples/data_io/query_http_csv.rs): Configure `object_store` and run a query against files via HTTP
 - [`examples/builtin_functions/regexp.rs`](examples/builtin_functions/regexp.rs): Examples of using regular expression functions
 - [`examples/data_io/remote_catalog.rs`](examples/data_io/remote_catalog.rs): Examples of interfacing with a remote catalog (e.g. over a network)
```

datafusion-examples/examples/external_dependency/dataframe-to-s3.rs renamed to datafusion-examples/examples/external_dependency/dataframe_to_s3.rs

Lines changed: 13 additions & 7 deletions

```diff
@@ -15,6 +15,8 @@
 // specific language governing permissions and limitations
 // under the License.
 
+//! See `main.rs` for how to run it.
+
 use std::env;
 use std::sync::Arc;
 
@@ -28,14 +30,18 @@ use datafusion::prelude::*;
 use object_store::aws::AmazonS3Builder;
 use url::Url;
 
-/// This example demonstrates querying data from AmazonS3 and writing
-/// the result of a query back to AmazonS3
-#[tokio::main]
-async fn main() -> Result<()> {
+/// This example demonstrates querying data from Amazon S3 and writing
+/// the result of a query back to Amazon S3.
+///
+/// The following environment variables must be defined:
+///
+/// - AWS_ACCESS_KEY_ID
+/// - AWS_SECRET_ACCESS_KEY
+pub async fn dataframe_to_s3() -> Result<()> {
     // create local execution context
     let ctx = SessionContext::new();
 
-    //enter region and bucket to which your credentials have GET and PUT access
+    // enter region and bucket to which your credentials have GET and PUT access
     let region = "<bucket-region-here>";
     let bucket_name = "<bucket-name-here>";
 
@@ -66,13 +72,13 @@ async fn main() -> Result<()> {
         .write_parquet(&out_path, DataFrameWriteOptions::new(), None)
         .await?;
 
-    //write as JSON to s3
+    // write as JSON to s3
     let json_out = format!("s3://{bucket_name}/json_out");
     df.clone()
         .write_json(&json_out, DataFrameWriteOptions::new(), None)
         .await?;
 
-    //write as csv to s3
+    // write as csv to s3
     let csv_out = format!("s3://{bucket_name}/csv_out");
     df.write_csv(&csv_out, DataFrameWriteOptions::new(), None)
         .await?;
```
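The hunks above elide the middle of the function. As a rough sketch of the round trip this example performs (register the bucket with the `SessionContext`, read parquet, write back), not the committed code, and with placeholder bucket and object names:

```rust
use std::{env, sync::Arc};

use datafusion::dataframe::DataFrameWriteOptions;
use datafusion::error::Result;
use datafusion::prelude::*;
use object_store::aws::AmazonS3Builder;
use url::Url;

// Hedged sketch of the elided middle of dataframe_to_s3(); the paths
// below are illustrative placeholders, not the committed values.
async fn s3_round_trip(ctx: &SessionContext, region: &str, bucket_name: &str) -> Result<()> {
    // build the store from the AWS_* environment variables named in the doc comment
    let s3 = AmazonS3Builder::new()
        .with_bucket_name(bucket_name)
        .with_region(region)
        .with_access_key_id(env::var("AWS_ACCESS_KEY_ID").unwrap())
        .with_secret_access_key(env::var("AWS_SECRET_ACCESS_KEY").unwrap())
        .build()?;

    // route every s3://<bucket> URL through this store
    let s3_url = Url::parse(&format!("s3://{bucket_name}")).unwrap();
    ctx.register_object_store(&s3_url, Arc::new(s3));

    // read a parquet file and write the query result back as parquet
    let path = format!("s3://{bucket_name}/some_file.parquet"); // placeholder object key
    let df = ctx.read_parquet(path.as_str(), ParquetReadOptions::default()).await?;
    let out_path = format!("s3://{bucket_name}/parquet_out");
    df.write_parquet(&out_path, DataFrameWriteOptions::new(), None)
        .await?;
    Ok(())
}
```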
datafusion-examples/examples/external_dependency/main.rs (new file)

Lines changed: 93 additions & 0 deletions

```diff
@@ -0,0 +1,93 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements. See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership. The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License. You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied. See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
+//! # These are using data from Amazon S3 examples
+//!
+//! These examples demonstrate how to work with data from Amazon S3.
+//!
+//! ## Usage
+//! ```bash
+//! cargo run --example external_dependency -- [dataframe_to_s3|query_aws_s3]
+//! ```
+//!
+//! Each subcommand runs a corresponding example:
+//! - `dataframe_to_s3` — run a query using a DataFrame against a parquet file from AWS S3 and writing back to AWS S3
+//! - `query_aws_s3` — configure `object_store` and run a query against files stored in AWS S3
+
+mod dataframe_to_s3;
+mod query_aws_s3;
+
+use std::str::FromStr;
+
+use datafusion::error::{DataFusionError, Result};
+
+enum ExampleKind {
+    DataframeToS3,
+    QueryAwsS3,
+}
+
+impl AsRef<str> for ExampleKind {
+    fn as_ref(&self) -> &str {
+        match self {
+            Self::DataframeToS3 => "dataframe_to_s3",
+            Self::QueryAwsS3 => "query_aws_s3",
+        }
+    }
+}
+
+impl FromStr for ExampleKind {
+    type Err = DataFusionError;
+
+    fn from_str(s: &str) -> Result<Self> {
+        match s {
+            "dataframe_to_s3" => Ok(Self::DataframeToS3),
+            "query_aws_s3" => Ok(Self::QueryAwsS3),
+            _ => Err(DataFusionError::Execution(format!("Unknown example: {s}"))),
+        }
+    }
+}
+
+impl ExampleKind {
+    const ALL: [Self; 2] = [Self::DataframeToS3, Self::QueryAwsS3];
+
+    const EXAMPLE_NAME: &str = "external_dependency";
+
+    fn variants() -> Vec<&'static str> {
+        Self::ALL.iter().map(|x| x.as_ref()).collect()
+    }
+}
+
+#[tokio::main]
+async fn main() -> Result<()> {
+    let usage = format!(
+        "Usage: cargo run --example {} -- [{}]",
+        ExampleKind::EXAMPLE_NAME,
+        ExampleKind::variants().join("|")
+    );
+
+    let arg = std::env::args().nth(1).ok_or_else(|| {
+        eprintln!("{usage}");
+        DataFusionError::Execution("Missing argument".to_string())
+    })?;
+
+    match arg.parse::<ExampleKind>()? {
+        ExampleKind::DataframeToS3 => dataframe_to_s3::dataframe_to_s3().await?,
+        ExampleKind::QueryAwsS3 => query_aws_s3::query_aws_s3().await?,
+    }
+
+    Ok(())
+}
```
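The commit message calls this dispatch layout a pattern to apply to the remaining example groups. As a hedged sketch of what that extension looks like (the `NewExample` variant and `new_example` name are hypothetical, not part of this commit), a third sub-example only touches the enum, its trait impls, `ALL`, and the `match` in `main`:

```rust
// Hypothetical third sub-example; "new_example" is illustrative only.
enum ExampleKind {
    DataframeToS3,
    QueryAwsS3,
    NewExample,
}

impl AsRef<str> for ExampleKind {
    fn as_ref(&self) -> &str {
        match self {
            Self::DataframeToS3 => "dataframe_to_s3",
            Self::QueryAwsS3 => "query_aws_s3",
            Self::NewExample => "new_example",
        }
    }
}

impl ExampleKind {
    // ALL must grow with the enum so the generated usage string stays complete
    const ALL: [Self; 3] = [Self::DataframeToS3, Self::QueryAwsS3, Self::NewExample];
}
// ...plus matching arms in FromStr::from_str and in main()'s match.
```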

datafusion-examples/examples/external_dependency/query-aws-s3.rs renamed to datafusion-examples/examples/external_dependency/query_aws_s3.rs

Lines changed: 7 additions & 6 deletions

```diff
@@ -15,21 +15,22 @@
 // specific language governing permissions and limitations
 // under the License.
 
+//! See `main.rs` for how to run it.
+
 use datafusion::error::Result;
 use datafusion::prelude::*;
 use object_store::aws::AmazonS3Builder;
 use std::env;
 use std::sync::Arc;
 use url::Url;
 
-/// This example demonstrates querying data in an S3 bucket.
+/// This example demonstrates querying data in a public S3 bucket
+/// (the NYC TLC open dataset: `s3://nyc-tlc`).
 ///
 /// The following environment variables must be defined:
-///
-/// - AWS_ACCESS_KEY_ID
-/// - AWS_SECRET_ACCESS_KEY
-#[tokio::main]
-async fn main() -> Result<()> {
+/// - `AWS_ACCESS_KEY_ID`
+/// - `AWS_SECRET_ACCESS_KEY`
+pub async fn query_aws_s3() -> Result<()> {
     let ctx = SessionContext::new();
 
     // the region must be set to the region where the bucket exists until the following
```
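The diff is truncated here. As a sketch of the shape of the rest of the example, grounded only in the imports and doc comment above (the object key and SQL are placeholders, and `from_env()` is a swapped-in alternative to setting the region explicitly):

```rust
use std::sync::Arc;

use datafusion::error::Result;
use datafusion::prelude::*;
use object_store::aws::AmazonS3Builder;
use url::Url;

// Hedged sketch, not the committed query_aws_s3(); names are illustrative.
pub async fn query_aws_s3_sketch() -> Result<()> {
    let ctx = SessionContext::new();
    let bucket_name = "nyc-tlc";

    // from_env() reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY (assumption:
    // the region is also supplied via the environment)
    let s3 = AmazonS3Builder::from_env()
        .with_bucket_name(bucket_name)
        .build()?;

    // route every s3://<bucket> URL through this store
    let s3_url = Url::parse(&format!("s3://{bucket_name}")).unwrap();
    ctx.register_object_store(&s3_url, Arc::new(s3));

    // register a file from the bucket as a table and query it with SQL;
    // the object key below is a placeholder, not the committed one
    let path = format!("s3://{bucket_name}/csv_backup/yellow_tripdata_2022-02.csv");
    ctx.register_csv("trips", path.as_str(), CsvReadOptions::default())
        .await?;
    ctx.sql("SELECT count(*) FROM trips").await?.show().await?;
    Ok(())
}
```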
