Merged
Changes from 22 of 32 commits
92894b5
Refactor schema, config, dataframe, and expression classes to use RwL…
kosiew Sep 26, 2025
7030cec
Add error handling to CaseBuilder methods to preserve builder state
kosiew Sep 26, 2025
dba5c6a
Refactor to use parking_lot for interior mutability in schema, config…
kosiew Sep 26, 2025
cfc9f2c
Add concurrency tests for SqlSchema, Config, and DataFrame
kosiew Sep 26, 2025
03a1022
Add tests for CaseBuilder to ensure builder state is preserved on suc…
kosiew Sep 26, 2025
d6cdfe3
Add test for independent handles in CaseBuilder to verify behavior
kosiew Sep 26, 2025
1755937
Fix CaseBuilder to preserve state correctly in when() method
kosiew Sep 26, 2025
b6ce4ae
Refactor to use named constant for boolean literals in test_expr.py
kosiew Sep 26, 2025
fd504a1
fix ruff errors
kosiew Sep 26, 2025
4b01772
Refactor to introduce type aliases for cached batches in dataframe.rs
kosiew Sep 26, 2025
1c73410
Cherry pick from #1252
ntjohnson1 Sep 25, 2025
d5914c2
Add most expr - cherry pick from #1252
ntjohnson1 Sep 25, 2025
fe3ad12
Add source root - cherry pick #1252
ntjohnson1 Sep 25, 2025
509850e
Fix license comment formatting in config.rs
kosiew Sep 28, 2025
c95e8b1
Refactor caching logic to use a local variable for IPython environmen…
kosiew Sep 29, 2025
799e8fb
Add test for ensuring exposed pyclasses default to frozen
kosiew Sep 29, 2025
6de60bc
Add PyO3 class mutability guidelines reference to contributor guide
kosiew Sep 29, 2025
b213bd4
Mark boolean expression classes as frozen for immutability
kosiew Sep 29, 2025
64faca2
Refactor PyCaseBuilder methods to eliminate redundant take/store logic
kosiew Sep 29, 2025
5caec09
Refactor PyConfig methods to improve readability by encapsulating con…
kosiew Sep 29, 2025
a905154
Resolve patch apply conflicts for CaseBuilder concurrency improvements
kosiew Sep 29, 2025
428839d
Resolve Config optimization conflicts for improved read/write concurr…
kosiew Sep 29, 2025
34a6078
Refactor PyConfig get methods for improved readability and performance
kosiew Sep 29, 2025
09d9ab8
Refactor test_expr.py to replace positional boolean literals with nam…
kosiew Oct 1, 2025
2df2f5f
fix ruff errors
kosiew Oct 1, 2025
2c76271
Add license header to test_pyclass_frozen.py for compliance
kosiew Oct 1, 2025
8a52e23
Alternate approach to case expression
timsaucer Oct 4, 2025
1b97b41
Replace case builter with keeping the expressions and then applying a…
timsaucer Oct 4, 2025
fc27bd5
Update unit tests
timsaucer Oct 4, 2025
d247d64
Refactor case and when functions to utilize PyCaseBuilder for improve…
kosiew Oct 6, 2025
9536e02
Update src/expr/conditional_expr.rs
timsaucer Oct 6, 2025
6e384ba
Merge remote-tracking branch 'timsaucer/tsaucer/case_expr_clone_worka…
kosiew Oct 7, 2025
1 change: 1 addition & 0 deletions Cargo.lock

(Generated file; diff not rendered by default.)

1 change: 1 addition & 0 deletions Cargo.toml
@@ -51,6 +51,7 @@ futures = "0.3"
object_store = { version = "0.12.3", features = ["aws", "gcp", "azure", "http"] }
url = "2"
log = "0.4.27"
parking_lot = "0.12"

[build-dependencies]
prost-types = "0.13.1" # keep in line with `datafusion-substrait`
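
The new parking_lot dependency backs the interior-mutability refactor described in the
commit list above. As a rough, self-contained sketch of the pattern (the struct and
method names below are illustrative, not taken from datafusion-python), shared state is
wrapped in an Arc<parking_lot::RwLock<_>> so a frozen #[pyclass] can still be read and
updated through &self methods:

use std::sync::Arc;

use parking_lot::RwLock;
use pyo3::prelude::*;

// Illustrative only: a frozen wrapper whose shared state lives behind a
// parking_lot::RwLock, so methods can mutate it through `&self`.
#[pyclass(frozen)]
#[derive(Clone)]
struct SharedCounter {
    inner: Arc<RwLock<u64>>,
}

#[pymethods]
impl SharedCounter {
    #[new]
    fn new() -> Self {
        Self {
            inner: Arc::new(RwLock::new(0)),
        }
    }

    fn increment(&self) -> u64 {
        // parking_lot locks do not return a Result, so no unwrap is needed.
        let mut guard = self.inner.write();
        *guard += 1;
        *guard
    }

    fn value(&self) -> u64 {
        *self.inner.read()
    }
}
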
61 changes: 61 additions & 0 deletions docs/source/contributor-guide/ffi.rst
@@ -137,6 +137,67 @@ and you want to create a sharable FFI counterpart, you could write:
let my_provider = MyTableProvider::default();
let ffi_provider = FFI_TableProvider::new(Arc::new(my_provider), false, None);

.. _ffi_pyclass_mutability:

PyO3 class mutability guidelines
--------------------------------

PyO3 bindings should present immutable wrappers whenever a struct stores shared or
interior-mutable state. In practice this means that any ``#[pyclass]`` containing an
``Arc<RwLock<_>>`` or a similar synchronization primitive must opt into ``#[pyclass(frozen)]``
unless there is a compelling reason not to.

The :mod:`datafusion` configuration helpers illustrate the preferred pattern. The
``PyConfig`` class in :file:`src/config.rs` stores an ``Arc<RwLock<ConfigOptions>>`` and is
explicitly frozen so callers interact with configuration state through provided methods
instead of mutating the container directly:

.. code-block:: rust

#[pyclass(name = "Config", module = "datafusion", subclass, frozen)]
#[derive(Clone)]
pub(crate) struct PyConfig {
config: Arc<RwLock<ConfigOptions>>,
}
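
For illustration only, methods on such a frozen wrapper take ``&self`` and acquire the
lock internally, so callers never need a mutable reference to the Python object. The
methods below are a hypothetical sketch (assuming ``parking_lot::RwLock``), not the
actual ``PyConfig`` API:

.. code-block:: rust

    #[pymethods]
    impl PyConfig {
        /// Hypothetical getter: hold a read lock only for the lookup.
        fn batch_size(&self) -> usize {
            self.config.read().execution.batch_size
        }

        /// Hypothetical setter: a write lock lets a frozen class update
        /// shared state through `&self`.
        fn set_batch_size(&self, value: usize) {
            self.config.write().execution.batch_size = value;
        }
    }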

The same approach applies to execution contexts. ``PySessionContext`` in
:file:`src/context.rs` stays frozen even though it shares mutable state internally via
``SessionContext``. This ensures PyO3 tracks borrows correctly while Python-facing APIs
clone the inner ``SessionContext`` or return new wrappers instead of mutating the
existing instance in place:

.. code-block:: rust

#[pyclass(frozen, name = "SessionContext", module = "datafusion", subclass)]
#[derive(Clone)]
pub struct PySessionContext {
pub ctx: SessionContext,
}
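
Again purely as a sketch (the real ``PySessionContext`` methods differ), an API on a
frozen context wrapper clones the internally shared ``SessionContext`` and returns a new
wrapper rather than mutating ``self`` in place:

.. code-block:: rust

    #[pymethods]
    impl PySessionContext {
        /// Hypothetical helper: hand back a separate Python handle that
        /// shares the same underlying session state.
        fn clone_handle(&self) -> Self {
            // SessionContext is Arc-backed internally, so cloning is cheap
            // and the frozen wrapper never needs `&mut self`.
            Self { ctx: self.ctx.clone() }
        }
    }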

Occasionally a type must remain mutable, for example when PyO3 attribute setters need to
update fields directly. In these rare cases, add an inline justification so reviewers and
future contributors understand why ``frozen`` cannot be enabled. ``DataTypeMap`` in
:file:`src/common/data_type.rs` includes such a comment because PyO3 still needs to track
field updates:

.. code-block:: rust

// TODO: This looks like this needs pyo3 tracking so leaving unfrozen for now
#[derive(Debug, Clone)]
#[pyclass(name = "DataTypeMap", module = "datafusion.common", subclass)]
pub struct DataTypeMap {
#[pyo3(get, set)]
pub arrow_type: PyDataType,
#[pyo3(get, set)]
pub python_type: PythonType,
#[pyo3(get, set)]
pub sql_type: SqlType,
}

When reviewers encounter a mutable ``#[pyclass]`` without a comment, they should request
an explanation or ask that ``frozen`` be added. Keeping these wrappers frozen by default
helps avoid subtle bugs stemming from PyO3's interior mutability tracking.

If you are interfacing with a library that provides the above ``FFI_TableProvider`` and
you need to turn it back into a ``TableProvider``, you can convert it into a
``ForeignTableProvider``, which implements the ``TableProvider`` trait.
4 changes: 4 additions & 0 deletions docs/source/contributor-guide/introduction.rst
@@ -26,6 +26,10 @@ We welcome and encourage contributions of all kinds, such as:
In addition to submitting new PRs, we have a healthy tradition of community members reviewing each other’s PRs.
Doing so is a great way to help the community as well as get more familiar with Rust and the relevant codebases.

Before opening a pull request that touches PyO3 bindings, please review the
:ref:`PyO3 class mutability guidelines <ffi_pyclass_mutability>` so you can flag missing
``#[pyclass(frozen)]`` annotations during development and review.

How to develop
--------------

125 changes: 125 additions & 0 deletions python/tests/test_concurrency.py
@@ -0,0 +1,125 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

from __future__ import annotations

from concurrent.futures import ThreadPoolExecutor

import pyarrow as pa
from datafusion import Config, SessionContext, col, lit
from datafusion import functions as f
from datafusion.common import SqlSchema


def _run_in_threads(fn, count: int = 8) -> None:
with ThreadPoolExecutor(max_workers=count) as executor:
futures = [executor.submit(fn, i) for i in range(count)]
for future in futures:
# Propagate any exception raised in the worker thread.
future.result()


def test_concurrent_access_to_shared_structures() -> None:
"""Exercise SqlSchema, Config, and DataFrame concurrently."""

schema = SqlSchema("concurrency")
config = Config()
ctx = SessionContext()

batch = pa.record_batch([pa.array([1, 2, 3], type=pa.int32())], names=["value"])
df = ctx.create_dataframe([[batch]])

config_key = "datafusion.execution.batch_size"
expected_rows = batch.num_rows

def worker(index: int) -> None:
schema.name = f"concurrency-{index}"
assert schema.name.startswith("concurrency-")
# Exercise getters that use internal locks.
assert isinstance(schema.tables, list)
assert isinstance(schema.views, list)
assert isinstance(schema.functions, list)

config.set(config_key, str(1024 + index))
assert config.get(config_key) is not None
# Access the full config map to stress lock usage.
assert config_key in config.get_all()

batches = df.collect()
assert sum(batch.num_rows for batch in batches) == expected_rows

_run_in_threads(worker, count=12)


def test_config_set_during_get_all() -> None:
"""Ensure config writes proceed while another thread reads all entries."""

config = Config()
key = "datafusion.execution.batch_size"

def reader() -> None:
for _ in range(200):
# get_all should not hold the lock while converting to Python objects
config.get_all()

def writer() -> None:
for index in range(200):
config.set(key, str(1024 + index))

with ThreadPoolExecutor(max_workers=2) as executor:
reader_future = executor.submit(reader)
writer_future = executor.submit(writer)
reader_future.result(timeout=10)
writer_future.result(timeout=10)

assert config.get(key) is not None


def test_case_builder_reuse_from_multiple_threads() -> None:
"""Ensure the case builder can be safely reused across threads."""

ctx = SessionContext()
values = pa.array([0, 1, 2, 3, 4], type=pa.int32())
df = ctx.create_dataframe([[pa.record_batch([values], names=["value"])]])

base_builder = f.case(col("value"))

def add_case(i: int) -> None:
base_builder.when(lit(i), lit(f"value-{i}"))

_run_in_threads(add_case, count=8)

with ThreadPoolExecutor(max_workers=2) as executor:
otherwise_future = executor.submit(base_builder.otherwise, lit("default"))
case_expr = otherwise_future.result()

result = df.select(case_expr.alias("label")).collect()
assert sum(batch.num_rows for batch in result) == len(values)

predicate_builder = f.when(col("value") == lit(0), lit("zero"))

def add_predicate(i: int) -> None:
predicate_builder.when(col("value") == lit(i + 1), lit(f"value-{i + 1}"))

_run_in_threads(add_predicate, count=4)

with ThreadPoolExecutor(max_workers=2) as executor:
end_future = executor.submit(predicate_builder.end)
predicate_expr = end_future.result()

result = df.select(predicate_expr.alias("label")).collect()
assert sum(batch.num_rows for batch in result) == len(values)
102 changes: 102 additions & 0 deletions python/tests/test_expr.py
@@ -16,6 +16,7 @@
# under the License.

import re
from concurrent.futures import ThreadPoolExecutor
from datetime import datetime, timezone

import pyarrow as pa
@@ -53,6 +54,10 @@
ensure_expr_list,
)

# Avoid passing boolean literals positionally (FBT003). Use a named constant
# so linters don't see a bare True/False literal in a function call.
_TRUE = True
Member commented:

I think this reduces code readability. Instead we can whitelist functions like lit

Contributor Author replied:

whitelist functions like lit

I'll implement this.

@pytest.fixture
def test_ctx():
Expand Down Expand Up @@ -200,6 +205,103 @@ def traverse_logical_plan(plan):
assert not variant.negated()


def test_case_builder_error_preserves_builder_state():
case_builder = functions.when(lit(_TRUE), lit(1))

with pytest.raises(Exception) as exc_info:
case_builder.otherwise(lit("bad"))

err_msg = str(exc_info.value)
assert "multiple data types" in err_msg
assert "CaseBuilder has already been consumed" not in err_msg

with pytest.raises(Exception) as exc_info:
case_builder.end()

err_msg = str(exc_info.value)
assert "multiple data types" in err_msg
assert "CaseBuilder has already been consumed" not in err_msg


def test_case_builder_success_preserves_builder_state():
ctx = SessionContext()
df = ctx.from_pydict({"flag": [False]}, name="tbl")

case_builder = functions.when(col("flag"), lit("true"))

expr_default_one = case_builder.otherwise(lit("default-1")).alias("result")
result_one = df.select(expr_default_one).collect()
assert result_one[0].column(0).to_pylist() == ["default-1"]

expr_default_two = case_builder.otherwise(lit("default-2")).alias("result")
result_two = df.select(expr_default_two).collect()
assert result_two[0].column(0).to_pylist() == ["default-2"]

expr_end_one = case_builder.end().alias("result")
end_one = df.select(expr_end_one).collect()
assert end_one[0].column(0).to_pylist() == ["default-2"]

expr_end_two = case_builder.end().alias("result")
end_two = df.select(expr_end_two).collect()
assert end_two[0].column(0).to_pylist() == ["default-2"]


def test_case_builder_when_handles_are_independent():
ctx = SessionContext()
df = ctx.from_pydict(
{
"flag": [True, False, False, False],
"value": [1, 15, 25, 5],
},
name="tbl",
)

base_builder = functions.when(col("flag"), lit("flag-true"))

first_builder = base_builder.when(col("value") > lit(10), lit("gt10"))
second_builder = base_builder.when(col("value") > lit(20), lit("gt20"))

first_builder = first_builder.when(lit(_TRUE), lit("final-one"))

expr_first = first_builder.otherwise(lit("fallback-one")).alias("first")
expr_second = second_builder.otherwise(lit("fallback-two")).alias("second")

result = df.select(expr_first, expr_second).collect()[0]

assert result.column(0).to_pylist() == [
"flag-true",
"gt10",
"gt10",
"final-one",
]
assert result.column(1).to_pylist() == [
"flag-true",
"gt10",
"gt10",
"fallback-two",
]


def test_case_builder_when_thread_safe():
case_builder = functions.when(lit(_TRUE), lit(1))

def build_expr(value: int) -> bool:
builder = case_builder.when(lit(_TRUE), lit(value))
builder.otherwise(lit(value))
return True

with ThreadPoolExecutor(max_workers=8) as executor:
futures = [executor.submit(build_expr, idx) for idx in range(16)]
results = [future.result() for future in futures]

assert all(results)

# Ensure the shared builder remains usable after concurrent `when` calls.
follow_up_builder = case_builder.when(lit(_TRUE), lit(42))
assert isinstance(follow_up_builder, type(case_builder))
follow_up_builder.otherwise(lit(7))


def test_expr_getitem() -> None:
ctx = SessionContext()
data = {