Changes from all commits (27 commits)
b0a7878  Core: Respect partition evolution in inspect.partitions (#2845)  (010Soham, Dec 24, 2025)
59cdf33  fix: Add Cython build step to Makefile (#2869)  (geruh, Dec 28, 2025)
fa03e08  feat: Add models for rest scan planning (#2861)  (geruh, Dec 29, 2025)
a8033c1  feat: Allow servers to express supported endpoints with ConfigRespons…  (geruh, Dec 29, 2025)
4a2d205  Build: Bump pytest-checkdocs from 2.13.0 to 2.14.0 (#2875)  (dependabot[bot], Dec 29, 2025)
aa87147  Build: Bump prek from 0.2.23 to 0.2.25 (#2873)  (dependabot[bot], Dec 29, 2025)
1f7b51f  Build: Bump pyparsing from 3.3.0 to 3.3.1 (#2872)  (dependabot[bot], Dec 29, 2025)
1b69a25  fix: Validate SetStatisticsUpdate correctly (#2866)  (ragnard, Dec 30, 2025)
4ef55d3  chore: Use `SnapshotRefType` Enum instead of hard-coded strings (#2880)  (jayceslesar, Jan 4, 2026)
2d8397e  feat: Add `snapshot_properties` to upsert operation (#2829)  (somasays, Jan 5, 2026)
dea8078  infra: Add Python 3.13 support (#2863)  (kevinjqliu, Jan 5, 2026)
41b790c  Build: Bump cython from 3.2.3 to 3.2.4 (#2884)  (dependabot[bot], Jan 5, 2026)
53d45ed  Build: Bump google-auth from 2.45.0 to 2.46.0 (#2883)  (dependabot[bot], Jan 5, 2026)
4ac3188  Build: Bump pypa/cibuildwheel from 3.3.0 to 3.3.1 (#2882)  (dependabot[bot], Jan 5, 2026)
d0cbc19  infra: auto update docker image (#2885)  (kevinjqliu, Jan 6, 2026)
70c63be  infra: use Iceberg 1.10.1 (#2886)  (kevinjqliu, Jan 6, 2026)
8aa8515  infra: spark 4 already comes with spark connect jar (#2887)  (kevinjqliu, Jan 6, 2026)
7b84d10  Clean up logging: move exception tracebacks to debug level (#2867)  (kevinjqliu, Jan 6, 2026)
4547e91  cli: add log level param (#2868)  (kevinjqliu, Jan 6, 2026)
85c9cb6  Fix live reload for `make docs-serve` (#2889)  (kevinjqliu, Jan 6, 2026)
3855f64  Core: Improve error for null/unknown schema types in table creation (…  (010Soham, Jan 8, 2026)
ce31fc9  chore: Remove unused [tool.black] and [tool.pycln] config (#2891)  (geruh, Jan 12, 2026)
b0880c8  feat: Add Set Current Snapshot to ManageSnapshots API (#2871)  (geruh, Jan 12, 2026)
50e6c6a  try 0.8.0rc1  (kevinjqliu, Dec 22, 2025)
9e2fde4  align datafusion version with iceberg-rust  (kevinjqliu, Dec 23, 2025)
0d0a003  Remove deprecated datafusion APIs  (Fokko, Nov 17, 2025)
4769c82  try 0.8.8rc2  (kevinjqliu, Jan 12, 2026)
5 changes: 3 additions & 2 deletions .github/workflows/pypi-build-artifacts.yml
@@ -45,6 +45,7 @@ jobs:
 3.10
 3.11
 3.12
+3.13

 - name: Install UV
   uses: astral-sh/setup-uv@v7
@@ -61,14 +62,14 @@
 if: startsWith(matrix.os, 'ubuntu')

 - name: Build wheels
-  uses: pypa/cibuildwheel@v3.3.0
+  uses: pypa/cibuildwheel@v3.3.1
   with:
     output-dir: wheelhouse
     config-file: "pyproject.toml"
   env:
     # Ignore 32 bit architectures
     CIBW_ARCHS: "auto64"
-    CIBW_PROJECT_REQUIRES_PYTHON: ">=3.10"
+    CIBW_PROJECT_REQUIRES_PYTHON: ">=3.10,<3.14"
     CIBW_TEST_REQUIRES: "pytest==7.4.2 moto==5.0.1"
     CIBW_TEST_COMMAND: "pytest {project}/tests/avro/test_decoder.py"
     # Ignore tests for pypy since not all dependencies are compiled for it
4 changes: 2 additions & 2 deletions .github/workflows/python-ci.yml
@@ -47,7 +47,7 @@ jobs:
 runs-on: ubuntu-latest
 strategy:
   matrix:
-    python: ['3.10', '3.11', '3.12']
+    python: ['3.10', '3.11', '3.12', '3.13']

 steps:
   - uses: actions/checkout@v6
@@ -71,7 +71,7 @@
 runs-on: ubuntu-latest
 strategy:
   matrix:
-    python: ['3.10', '3.11', '3.12']
+    python: ['3.10', '3.11', '3.12', '3.13']

 steps:
   - uses: actions/checkout@v6
5 changes: 3 additions & 2 deletions .github/workflows/svn-build-artifacts.yml
@@ -45,6 +45,7 @@ jobs:
 3.10
 3.11
 3.12
+3.13

 - name: Install UV
   uses: astral-sh/setup-uv@v7
@@ -56,14 +57,14 @@
 if: startsWith(matrix.os, 'ubuntu')

 - name: Build wheels
-  uses: pypa/cibuildwheel@v3.3.0
+  uses: pypa/cibuildwheel@v3.3.1
   with:
     output-dir: wheelhouse
     config-file: "pyproject.toml"
   env:
     # Ignore 32 bit architectures
     CIBW_ARCHS: "auto64"
-    CIBW_PROJECT_REQUIRES_PYTHON: ">=3.10"
+    CIBW_PROJECT_REQUIRES_PYTHON: ">=3.10,<3.14"
     CIBW_TEST_REQUIRES: "pytest==7.4.2 moto==5.0.1"
     CIBW_TEST_COMMAND: "pytest {project}/tests/avro/test_decoder.py"
     # Ignore tests for pypy since not all dependencies are compiled for it
6 changes: 3 additions & 3 deletions Makefile
@@ -70,7 +70,7 @@ setup-venv: ## Create virtual environment
 	uv venv $(PYTHON_ARG)

 install-dependencies: setup-venv ## Install all dependencies including extras
-	uv sync $(PYTHON_ARG) --all-extras
+	uv sync $(PYTHON_ARG) --all-extras --reinstall

 install: install-uv install-dependencies ## Install uv and dependencies

@@ -100,7 +100,7 @@ test-integration: test-integration-setup test-integration-exec test-integration-
 test-integration-setup: ## Start Docker services for integration tests
 	docker compose -f dev/docker-compose-integration.yml kill
 	docker compose -f dev/docker-compose-integration.yml rm -f
-	docker compose -f dev/docker-compose-integration.yml up -d --wait
+	docker compose -f dev/docker-compose-integration.yml up -d --build --wait
 	uv run $(PYTHON_ARG) python dev/provision.py

 test-integration-exec: ## Run integration tests (excluding provision)
@@ -148,7 +148,7 @@ docs-install: ## Install docs dependencies (included in default groups)
 	uv sync $(PYTHON_ARG) --group docs

 docs-serve: ## Serve local docs preview (hot reload)
-	uv run $(PYTHON_ARG) mkdocs serve -f mkdocs/mkdocs.yml
+	uv run $(PYTHON_ARG) mkdocs serve -f mkdocs/mkdocs.yml --livereload

 docs-build: ## Build the static documentation site
 	uv run $(PYTHON_ARG) mkdocs build -f mkdocs/mkdocs.yml --strict
4 changes: 1 addition & 3 deletions dev/spark/Dockerfile
@@ -18,9 +18,8 @@ ARG BASE_IMAGE_SPARK_VERSION=4.0.1
 FROM apache/spark:${BASE_IMAGE_SPARK_VERSION}

 # Dependency versions - keep these compatible
-ARG ICEBERG_VERSION=1.10.0
+ARG ICEBERG_VERSION=1.10.1
 ARG ICEBERG_SPARK_RUNTIME_VERSION=4.0_2.13
-ARG SPARK_VERSION=4.0.1
 ARG HADOOP_VERSION=3.4.1
 ARG SCALA_VERSION=2.13
 ARG AWS_SDK_VERSION=2.24.6
@@ -43,7 +42,6 @@ RUN mkdir -p /home/iceberg/spark-events && \

 # Required JAR dependencies
 ENV JARS_TO_DOWNLOAD="\
-org/apache/spark/spark-connect_${SCALA_VERSION}/${SPARK_VERSION}/spark-connect_${SCALA_VERSION}-${SPARK_VERSION}.jar \
 org/apache/iceberg/iceberg-spark-runtime-${ICEBERG_SPARK_RUNTIME_VERSION}/${ICEBERG_VERSION}/iceberg-spark-runtime-${ICEBERG_SPARK_RUNTIME_VERSION}-${ICEBERG_VERSION}.jar \
 org/apache/iceberg/iceberg-aws-bundle/${ICEBERG_VERSION}/iceberg-aws-bundle-${ICEBERG_VERSION}.jar \
 org/apache/hadoop/hadoop-aws/${HADOOP_VERSION}/hadoop-aws-${HADOOP_VERSION}.jar \
2 changes: 1 addition & 1 deletion mkdocs/docs/api.md
@@ -1967,7 +1967,7 @@ iceberg_table.append(data)

 # Register the table with DataFusion
 ctx = SessionContext()
-ctx.register_table_provider("test", iceberg_table)
+ctx.register_table("test", iceberg_table)

 # Query the table using DataFusion SQL
 ctx.table("test").show()
23 changes: 22 additions & 1 deletion mkdocs/docs/contributing.md
@@ -118,7 +118,7 @@ make lint

 In addition to manually running `make lint`, you can install the pre-commit hooks in your local repo with `prek install`. By doing this, linting is run automatically every time you make a commit.

-You can bump the integrations to the latest version using `prek auto-update`. This will check if there is a newer version of `{black,mypy,isort,...}` and update the yaml.
+You can bump the integrations to the latest version using `prek auto-update`. This will check if there is a newer version of `{ruff,mypy,...}` and update the yaml.

 ## Cleaning

@@ -258,6 +258,27 @@ Which will warn:
 Deprecated in 0.1.0, will be removed in 0.2.0. The old_property is deprecated. Please use the something_else property instead.
 ```

+### Logging
+
+PyIceberg uses Python's standard logging module. You can control the logging level using either:
+
+**CLI option:**
+
+```bash
+pyiceberg --log-level DEBUG describe my_table
+```
+
+**Environment variable:**
+
+```bash
+export PYICEBERG_LOG_LEVEL=DEBUG
+pyiceberg describe my_table
+```
+
+Valid log levels are: `DEBUG`, `INFO`, `WARNING` (default), `ERROR`, `CRITICAL`.
+
+Debug logging is particularly useful for troubleshooting issues with FileIO implementations, catalog connections, and other integration points.
+
 ### Type annotations

 For the type annotation the types from the `Typing` package are used.
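The environment-variable path in the new Logging docs can be sketched with the standard `logging` module. This is an illustrative sketch only: the helper name `configure_logging` and the exact way PyIceberg reads `PYICEBERG_LOG_LEVEL` are assumptions, not the library's actual wiring.

```python
import logging
import os


def configure_logging(default: str = "WARNING") -> int:
    """Resolve a log level from PYICEBERG_LOG_LEVEL, falling back to WARNING.

    Hypothetical helper: PyIceberg's real implementation may differ.
    """
    name = os.environ.get("PYICEBERG_LOG_LEVEL", default).upper()
    # Level names map to integer constants on the logging module itself.
    level = getattr(logging, name, None)
    if not isinstance(level, int):
        raise ValueError(f"Invalid log level: {name}")
    logging.basicConfig(level=level)
    return level


os.environ["PYICEBERG_LOG_LEVEL"] = "DEBUG"
print(logging.getLevelName(configure_logging()))  # DEBUG
```

An unknown name (say `VERBOSE`) raises `ValueError` rather than silently falling back, which mirrors how most CLIs validate a `--log-level` flag.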
12 changes: 6 additions & 6 deletions pyiceberg/catalog/__init__.py
@@ -285,8 +285,8 @@ def delete_files(io: FileIO, files_to_delete: set[str], file_type: str) -> None:
     for file in files_to_delete:
         try:
             io.delete(file)
-        except OSError as exc:
-            logger.warning(msg=f"Failed to delete {file_type} file {file}", exc_info=exc)
+        except OSError:
+            logger.warning(f"Failed to delete {file_type} file {file}", exc_info=logger.isEnabledFor(logging.DEBUG))


 def delete_data_files(io: FileIO, manifests_to_delete: list[ManifestFile]) -> None:
@@ -305,8 +305,8 @@ def delete_data_files(io: FileIO, manifests_to_delete: list[ManifestFile]) -> No
             if not deleted_files.get(path, False):
                 try:
                     io.delete(path)
-                except OSError as exc:
-                    logger.warning(msg=f"Failed to delete data file {path}", exc_info=exc)
+                except OSError:
+                    logger.warning(f"Failed to delete data file {path}", exc_info=logger.isEnabledFor(logging.DEBUG))
                 deleted_files[path] = True


@@ -319,8 +319,8 @@ def _import_catalog(name: str, catalog_impl: str, properties: Properties) -> Cat
         module = importlib.import_module(module_name)
         class_ = getattr(module, class_name)
         return class_(name, **properties)
-    except ModuleNotFoundError as exc:
-        logger.warning(f"Could not initialize Catalog: {catalog_impl}", exc_info=exc)
+    except ModuleNotFoundError:
+        logger.warning(f"Could not initialize Catalog: {catalog_impl}", exc_info=logger.isEnabledFor(logging.DEBUG))
     return None
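The `exc_info=logger.isEnabledFor(logging.DEBUG)` pattern above works because `exc_info` accepts a boolean: `True` attaches the active exception's traceback, `False` logs the bare message. A self-contained sketch of the behavior (the buffer-backed logger setup and file names are illustrative, not PyIceberg code):

```python
import io
import logging


def make_logger(level: int) -> tuple[logging.Logger, io.StringIO]:
    """Create an isolated logger that writes to an in-memory buffer."""
    stream = io.StringIO()
    logger = logging.getLogger(f"demo-{level}")
    logger.setLevel(level)
    logger.propagate = False  # keep output out of the root logger
    handler = logging.StreamHandler(stream)
    handler.setFormatter(logging.Formatter("%(levelname)s %(message)s"))
    logger.addHandler(handler)
    return logger, stream


def delete_file(logger: logging.Logger, path: str) -> None:
    """Simulate a failed delete; the traceback is attached only at DEBUG."""
    try:
        raise OSError("permission denied")
    except OSError:
        # Boolean exc_info: include the traceback only when debug is on.
        logger.warning(f"Failed to delete file {path}", exc_info=logger.isEnabledFor(logging.DEBUG))


# At WARNING level the message is terse...
warn_logger, warn_out = make_logger(logging.WARNING)
delete_file(warn_logger, "s3://bucket/data.parquet")
print("Traceback" in warn_out.getvalue())  # False

# ...while at DEBUG level the full traceback is included.
debug_logger, debug_out = make_logger(logging.DEBUG)
delete_file(debug_logger, "s3://bucket/data.parquet")
print("Traceback" in debug_out.getvalue())  # True
```

This keeps routine cleanup noise out of production logs while preserving full tracebacks for debugging, matching the "move exception tracebacks to debug level" commit in this PR.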