Commit d16cffa

Add Python 3.14 support (duckdb#116)
Adding 3.14 support:

- Catching import errors for pyarrow.
- Moving the check for whether a given object is a Union from the deprecated `typing._UnionGenericAlias` to `isinstance(obj, types.UnionType)` on Python >= 3.10, with `typing.get_origin(obj) is typing.Union` as a fallback. This meant running the import-cache code generation scripts, which needed to output a new header (and had a small bug).
- Moving from the [deprecated](https://pandas.pydata.org/docs/user_guide/indexing.html#why-does-assignment-fail-when-using-chained-indexing) [pandas chained assignment](https://pandas.pydata.org/docs/user_guide/copy_on_write.html#copy-on-write-chained-assignment), which will never work with Copy-on-Write enabled in pandas (the default from pandas 3.0 onwards), to a delete-and-write approach for `tz_localize` columns in pandas. Apart from the chained assignment issue, the tz is now also explicitly part of the column's datatype, and pandas doesn't like it when we change the timezone: it throws a `FutureWarning: Setting an item of incompatible dtype is deprecated and will raise in a future error of pandas. Value '<DatetimeArray>`.
- Removing py313 from the "fast" sanity check of the packaging workflow.
- Adding `--quiet` to `uv export` in packaging_wheels. The output has been very useful for setting the env markers correctly for all dependencies (which is much better now than it was in-tree), but it's a bit noisy. It was already set for sdists, so it may as well be set here as well.
2 parents cb1a499 + 3ee4f02 commit d16cffa
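The Union-detection change described in the commit message can be sketched as follows. This is a minimal sketch, not the actual duckdb code; the function name `is_union` is hypothetical:

```python
import sys
import types
import typing


def is_union(obj) -> bool:
    """Return True if obj is a Union type annotation.

    Sketch of the approach from the commit: on Python >= 3.10, PEP 604
    unions like `int | str` are instances of types.UnionType; other
    typing.Union[...] forms are detected via typing.get_origin().
    """
    if sys.version_info >= (3, 10) and isinstance(obj, types.UnionType):
        return True
    return typing.get_origin(obj) is typing.Union


print(is_union(typing.Union[int, str]))  # True
print(is_union(typing.Optional[int]))    # True (Optional[int] is Union[int, None])
print(is_union(int))                     # False
```

The `isinstance` branch matters because, on 3.10 through 3.13, `typing.get_origin(int | str)` returns `types.UnionType` rather than `typing.Union`, so the fallback alone would miss PEP 604 unions.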


46 files changed (+215, -151 lines)

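The delete-and-write approach for `tz_localize` columns mentioned in the commit message can be illustrated like this. This is a minimal sketch with a hypothetical column name and data, not the duckdb code itself:

```python
import pandas as pd

df = pd.DataFrame({"ts": pd.to_datetime(["2024-01-01 12:00", "2024-01-02 12:00"])})

# Chained assignment like `df["ts"].dt.tz_localize("UTC")` followed by an
# in-place write is deprecated and becomes a silent no-op under
# Copy-on-Write. Instead, build the localized column first, delete the old
# column, and write the new one, replacing the dtype (which now carries
# the timezone) wholesale.
localized = df["ts"].dt.tz_localize("UTC")
del df["ts"]
df["ts"] = localized

print(df["ts"].dtype)  # datetime64[ns, UTC]
```

Because the timezone is part of the column's dtype, writing a whole new column avoids the `FutureWarning` about setting an item of incompatible dtype.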
.github/workflows/cleanup_pypi.yml

Lines changed: 2 additions & 2 deletions
```diff
@@ -50,9 +50,9 @@ jobs:
           exit 1

       - name: Install Astral UV
-        uses: astral-sh/setup-uv@v6
+        uses: astral-sh/setup-uv@v7
         with:
-          version: "0.7.14"
+          version: "0.9.0"

       - name: Run Cleanup
         env:
```

.github/workflows/code_quality.yml

Lines changed: 2 additions & 2 deletions
```diff
@@ -29,9 +29,9 @@ jobs:
           persist-credentials: false

       - name: Install Astral UV
-        uses: astral-sh/setup-uv@v6
+        uses: astral-sh/setup-uv@v7
         with:
-          version: "0.7.14"
+          version: "0.9.0"
           python-version: 3.9

       - name: pre-commit (cache)
```

.github/workflows/coverage.yml

Lines changed: 3 additions & 3 deletions
```diff
@@ -68,9 +68,9 @@ jobs:
         sudo apt-get -y install ccache

       - name: Install Astral UV and enable the cache
-        uses: astral-sh/setup-uv@v6
+        uses: astral-sh/setup-uv@v7
         with:
-          version: "0.7.14"
+          version: "0.9.0"
           python-version: 3.9
           enable-cache: true
           cache-suffix: -${{ github.workflow }}
@@ -79,7 +79,7 @@ jobs:
         shell: bash
         run: |
           if [[ "${{ inputs.testsuite }}" == "all" ]]; then
-            uv run coverage run -m pytest ./tests --ignore=./tests/stubs
+            uv run coverage run -m pytest ./tests
           elif [[ "${{ inputs.testsuite }}" == "fast" ]]; then
             uv run coverage run -m pytest ./tests/fast
           else
```

.github/workflows/packaging_sdist.yml

Lines changed: 3 additions & 3 deletions
```diff
@@ -56,9 +56,9 @@ jobs:
         run: echo "OVERRIDE_GIT_DESCRIBE=${{ inputs.set-version }}" >> $GITHUB_ENV

       - name: Install Astral UV
-        uses: astral-sh/setup-uv@v6
+        uses: astral-sh/setup-uv@v7
         with:
-          version: "0.7.14"
+          version: "0.9.0"
           python-version: 3.11

       - name: Build sdist
@@ -80,7 +80,7 @@ jobs:
           # run tests
           tests_root="${{ github.workspace }}/tests"
           tests_dir="${tests_root}${{ inputs.testsuite == 'fast' && '/fast' || '/' }}"
-          uv run --verbose pytest $tests_dir --verbose --ignore=${tests_root}/stubs
+          uv run --verbose pytest -c ${{ github.workspace }}/pyproject.toml $tests_dir

       - id: versioning
         run: |
```

.github/workflows/packaging_wheels.yml

Lines changed: 19 additions & 6 deletions
```diff
@@ -30,7 +30,7 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        python: [ cp39, cp310, cp311, cp312, cp313 ]
+        python: [ cp39, cp310, cp311, cp312, cp313, cp314 ]
         platform:
           - { os: windows-2025, arch: amd64, cibw_system: win }
           - { os: ubuntu-24.04, arch: x86_64, cibw_system: manylinux }
@@ -44,16 +44,29 @@ jobs:
           - { minimal: true, python: cp310 }
           - { minimal: true, python: cp311 }
           - { minimal: true, python: cp312 }
+          - { minimal: true, python: cp313 }
           - { minimal: true, platform: { arch: universal2 } }
     runs-on: ${{ matrix.platform.os }}
     env:
+      ### cibuildwheel configuration
+      #
+      # This is somewhat brittle, so be careful with changes. Some notes for our future selves (and others):
+      # - cibw will change its cwd to a temp dir and create a separate venv for testing. It then installs the wheel it
+      #   built into that venv, and run the CIBW_TEST_COMMAND. We have to install all dependencies ourselves, and make
+      #   sure that the pytest config in pyproject.toml is available.
+      # - CIBW_BEFORE_TEST installs the test dependencies by exporting them into a pylock.toml. At the time of writing,
+      #   `uv sync --no-install-project` had problems correctly resolving dependencies using resolution environments
+      #   across all platforms we build for. This might be solved in newer uv versions.
+      # - CIBW_TEST_COMMAND specifies pytest conf from pyproject.toml. --confcutdir is needed to prevent pytest from
+      #   traversing the full filesystem, which produces an error on Windows.
+      # - CIBW_TEST_SKIP we always skip tests for *-macosx_universal2 builds, because we run tests for arm64 and x86_64.
       CIBW_TEST_SKIP: ${{ inputs.testsuite == 'none' && '*' || '*-macosx_universal2' }}
       CIBW_TEST_SOURCES: tests
       CIBW_BEFORE_TEST: >
-        uv export --only-group test --no-emit-project --output-file pylock.toml --directory {project} &&
+        uv export --only-group test --no-emit-project --quiet --output-file pylock.toml --directory {project} &&
         uv pip install -r pylock.toml
       CIBW_TEST_COMMAND: >
-        uv run -v pytest ${{ inputs.testsuite == 'fast' && './tests/fast' || './tests' }} --verbose --ignore=./tests/stubs
+        uv run -v pytest --confcutdir=. --rootdir . -c {project}/pyproject.toml ${{ inputs.testsuite == 'fast' && './tests/fast' || './tests' }}

     steps:
       - name: Checkout DuckDB Python
@@ -78,14 +91,14 @@ jobs:
         run: echo "CIBW_ENVIRONMENT=OVERRIDE_GIT_DESCRIBE=${{ inputs.set-version }}" >> $GITHUB_ENV

       # Install Astral UV, which will be used as build-frontend for cibuildwheel
-      - uses: astral-sh/setup-uv@v6
+      - uses: astral-sh/setup-uv@v7
         with:
-          version: "0.7.14"
+          version: "0.9.0"
           enable-cache: false
           cache-suffix: -${{ matrix.python }}-${{ matrix.platform.cibw_system }}_${{ matrix.platform.arch }}

       - name: Build${{ inputs.testsuite != 'none' && ' and test ' || ' ' }}wheels
-        uses: pypa/cibuildwheel@v3.0
+        uses: pypa/cibuildwheel@v3.2
         env:
           CIBW_ARCHS: ${{ matrix.platform.arch == 'amd64' && 'AMD64' || matrix.platform.arch }}
           CIBW_BUILD: ${{ matrix.python }}-${{ matrix.platform.cibw_system }}_${{ matrix.platform.arch }}
```

cmake/compiler_launcher.cmake

Lines changed: 2 additions & 4 deletions
```diff
@@ -8,8 +8,7 @@ include(CMakeParseArguments)
 # Function to look for ccache and sccache to speed up builds, if available
 # ────────────────────────────────────────────
 function(setup_compiler_launcher_if_available)
-  if(NOT DEFINED CMAKE_C_COMPILER_LAUNCHER AND NOT DEFINED
-     ENV{CMAKE_C_COMPILER_LAUNCHER})
+  if(NOT DEFINED CMAKE_C_COMPILER_LAUNCHER)
     find_program(COMPILER_LAUNCHER NAMES ccache sccache)
     if(COMPILER_LAUNCHER)
       message(STATUS "Using ${COMPILER_LAUNCHER} as C compiler launcher")
@@ -19,8 +18,7 @@ function(setup_compiler_launcher_if_available)
     endif()
   endif()

-  if(NOT DEFINED CMAKE_CXX_COMPILER_LAUNCHER
-     AND NOT DEFINED ENV{CMAKE_CXX_COMPILER_LAUNCHER})
+  if(NOT DEFINED CMAKE_CXX_COMPILER_LAUNCHER)
     find_program(COMPILER_LAUNCHER NAMES ccache sccache)
     if(COMPILER_LAUNCHER)
       message(STATUS "Using ${COMPILER_LAUNCHER} as C++ compiler launcher")
```

pyproject.toml

Lines changed: 7 additions & 6 deletions
```diff
@@ -47,7 +47,7 @@ all = [ # users can install duckdb with 'duckdb[all]', which will install this l
     "fsspec", # used in duckdb.filesystem
     "numpy", # used in duckdb.experimental.spark and in duckdb.fetchnumpy()
     "pandas", # used for pandas dataframes all over the place
-    "pyarrow", # used for pyarrow support
+    "pyarrow; python_version < '3.14'", # used for pyarrow support
     "adbc-driver-manager", # for the adbc driver
 ]

@@ -226,13 +226,14 @@ stubdeps = [ # dependencies used for typehints in the stubs
     "fsspec",
     "pandas",
     "polars",
-    "pyarrow",
+    "pyarrow; python_version < '3.14'",
 ]
 test = [ # dependencies used for running tests
     "adbc-driver-manager",
     "pytest",
     "pytest-reraise",
     "pytest-timeout",
+    "pytest-timestamper",
     "mypy",
     "coverage",
     "gcovr",
@@ -248,8 +249,8 @@ test = [ # dependencies used for running tests
     "urllib3",
     "fsspec>=2022.11.0",
     "pandas>=2.0.0",
-    "pyarrow>=18.0.0",
-    "torch>=2.2.2; sys_platform != 'darwin' or platform_machine != 'x86_64' or python_version < '3.13'",
+    "pyarrow>=18.0.0; python_version < '3.14'",
+    "torch>=2.2.2; python_version < '3.14' and ( sys_platform != 'darwin' or platform_machine != 'x86_64' or python_version < '3.13' )",
     "tensorflow==2.14.0; sys_platform == 'darwin' and python_version < '3.12'",
     "tensorflow-cpu>=2.14.0; sys_platform == 'linux' and platform_machine != 'aarch64' and python_version < '3.12'",
     "tensorflow-cpu>=2.14.0; sys_platform == 'win32' and python_version < '3.12'",
@@ -265,7 +266,7 @@ scripts = [ # dependencies used for running scripts
     "pandas",
     "pcpp",
     "polars",
-    "pyarrow",
+    "pyarrow; python_version < '3.14'",
     "pytz"
 ]
 pypi = [ # dependencies used by the pypi cleanup script
@@ -302,7 +303,7 @@ dev = [ # tooling like uv will install this automatically when syncing the envir

 [tool.pytest.ini_options]
 minversion = "6.0"
-addopts = "-ra -q"
+addopts = "-ra --verbose"
 testpaths = ["tests"]
 filterwarnings = [
     "error",
```

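The `python_version < '3.14'` clauses added above are PEP 508 environment markers, evaluated by the installer (pip or uv) against the target interpreter at install time. A minimal illustration of what the comparison means; the helper `version_lt` is hypothetical, not installer code:

```python
import sys

# PEP 508 defines python_version as "major.minor" of the interpreter.
python_version = f"{sys.version_info.major}.{sys.version_info.minor}"


def version_lt(a: str, b: str) -> bool:
    # Versions compare numerically, component by component:
    # "3.9" < "3.14" must hold even though it fails as a string comparison.
    return tuple(map(int, a.split("."))) < tuple(map(int, b.split(".")))


# Whether "pyarrow; python_version < '3.14'" would install pyarrow here:
print(version_lt(python_version, "3.14"))
```

This is why the marker keeps pyarrow out of 3.14 environments (where no compatible wheel was available) without affecting older interpreters.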
scripts/cache_data.json

Lines changed: 14 additions & 1 deletion
```diff
@@ -622,7 +622,8 @@
         "full_path": "typing",
         "name": "typing",
         "children": [
-            "typing._UnionGenericAlias"
+            "typing.Union",
+            "typing.get_origin"
         ]
     },
     "typing._UnionGenericAlias": {
@@ -793,5 +794,17 @@
         "full_path": "pyarrow.decimal128",
         "name": "decimal128",
         "children": []
+    },
+    "typing.get_origin": {
+        "type": "attribute",
+        "full_path": "typing.get_origin",
+        "name": "get_origin",
+        "children": []
+    },
+    "typing.Union": {
+        "type": "attribute",
+        "full_path": "typing.Union",
+        "name": "Union",
+        "children": []
     }
 }
```

scripts/generate_import_cache_cpp.py

Lines changed: 5 additions & 5 deletions
```diff
@@ -151,10 +151,10 @@ def to_string(self):

 //! Note: This class is generated using scripts.
 //! If you need to add a new object to the cache you must:
-//! 1. adjust tools/pythonpkg/scripts/imports.py
-//! 2. run python3 tools/pythonpkg/scripts/generate_import_cache_json.py
-//! 3. run python3 tools/pythonpkg/scripts/generate_import_cache_cpp.py
-//! 4. run make format-main (the generator doesn't respect the formatting rules ;))
+//! 1. adjust scripts/imports.py
+//! 2. run python scripts/generate_import_cache_json.py
+//! 3. run python scripts/generate_import_cache_cpp.py
+//! 4. run pre-commit to fix formatting errors

 namespace duckdb {{
 {self.get_classes()}
@@ -230,7 +230,7 @@ def get_root_modules(files: list[ModuleFile]):


 def get_module_file_path_includes(files: list[ModuleFile]):
-    template = '#include "duckdb_python/import_cache/modules/{}'
+    template = '#include "duckdb_python/import_cache/modules/{}"'
     return "\n".join(template.format(f.file_name) for f in files)
```

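The "small bug" in the generator fixed above is a single missing character: the include template lacked its closing quote, so every generated `#include` directive was unterminated. A quick reproduction, using hypothetical module file names:

```python
# Hypothetical generated-module file names for illustration.
files = ["pandas_module.hpp", "typing_module.hpp"]

broken = '#include "duckdb_python/import_cache/modules/{}'   # old template
fixed = '#include "duckdb_python/import_cache/modules/{}"'   # fixed template

print(broken.format(files[0]))  # ends without a closing quote: invalid C++
print(fixed.format(files[0]))   # well-formed include directive
```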
scripts/imports.py

Lines changed: 2 additions & 1 deletion
```diff
@@ -128,7 +128,8 @@

 import typing

-typing._UnionGenericAlias
+typing.Union
+typing.get_origin

 import uuid
```