Merged

Commits (17)
9f28b10  fix: enable auto_mkdir for local filesystem in StorageBackend (dimitri-yatsenko, Jan 8, 2026)
9737dee  fix(Top): allow order_by=None to inherit existing ordering (#1242) (dimitri-yatsenko, Jan 8, 2026)
c2a2eae  refactor: improve API consistency for jobs and schema.drop (dimitri-yatsenko, Jan 8, 2026)
cabdb74  refactor(delete): replace force_parts/force_masters with part_integrity (dimitri-yatsenko, Jan 8, 2026)
dacf4ac  ci: add MySQL/MinIO services to GitHub Actions workflow (dimitri-yatsenko, Jan 8, 2026)
ae5fd68  style: format user_tables.py (dimitri-yatsenko, Jan 8, 2026)
fbc4cad  ci: use docker-compose for test services (dimitri-yatsenko, Jan 8, 2026)
d778e4f  ci: install graphviz for ERD tests (dimitri-yatsenko, Jan 8, 2026)
4f4a924  fix(jobs): use MySQL server time consistently for all scheduling (dimitri-yatsenko, Jan 8, 2026)
344de9b  fix(jobs): use NOW(3) to match CURRENT_TIMESTAMP(3) precision (dimitri-yatsenko, Jan 8, 2026)
2100487  refactor(jobs): always use NOW(3) + INTERVAL for scheduled_time (dimitri-yatsenko, Jan 8, 2026)
1fdfb3e  ci: use pixi for CI workflow (dimitri-yatsenko, Jan 8, 2026)
307983a  docs: update developer guide to use pixi as primary toolchain (dimitri-yatsenko, Jan 8, 2026)
272fcb5  ci: disable locked mode for pixi install (dimitri-yatsenko, Jan 8, 2026)
b8645f8  fix(pixi): add test extras to feature-specific pypi-dependencies (dimitri-yatsenko, Jan 8, 2026)
27391c7  feat: add mypy type checking to pre-commit (dimitri-yatsenko, Jan 8, 2026)
f195110  feat: add unit tests to pre-commit hooks (dimitri-yatsenko, Jan 8, 2026)
64 changes: 41 additions & 23 deletions .github/workflows/test.yaml
```diff
@@ -1,37 +1,55 @@
 name: Test

 on:
   push:
     branches:
-      - "**" # every branch
-      - "!gh-pages" # exclude gh-pages branch
-      - "!stage*" # exclude branches beginning with stage
+      - "**"
+      - "!gh-pages"
+      - "!stage*"
     paths:
-      - "src/datajoint"
-      - "tests"
+      - "src/datajoint/**"
+      - "tests/**"
       - "pyproject.toml"
+      - "pixi.lock"
       - ".github/workflows/test.yaml"
   pull_request:
     branches:
-      - "**" # every branch
-      - "!gh-pages" # exclude gh-pages branch
-      - "!stage*" # exclude branches beginning with stage
+      - "**"
+      - "!gh-pages"
+      - "!stage*"
     paths:
-      - "src/datajoint"
-      - "tests"
+      - "src/datajoint/**"
+      - "tests/**"
       - "pyproject.toml"
+      - "pixi.lock"
       - ".github/workflows/test.yaml"

 jobs:
   test:
     runs-on: ubuntu-latest
-    strategy:
-      matrix:
-        py_ver: ["3.10", "3.11", "3.12", "3.13"]
-        mysql_ver: ["8.0"]
     steps:
       - uses: actions/checkout@v4
-      - name: Set up Python ${{matrix.py_ver}}
-        uses: actions/setup-python@v5
+
+      - name: Set up pixi
+        uses: prefix-dev/[email protected]
         with:
-          python-version: ${{matrix.py_ver}}
-      - name: Integration test
-        env:
-          MYSQL_VER: ${{matrix.mysql_ver}}
-        run: |
-          pip install -e ".[test]"
-          pytest --cov-report term-missing --cov=datajoint tests
+          cache: true
+          locked: false
+
+      - name: Run tests
+        run: pixi run -e test test-cov
+
+  # Unit tests run without containers (faster feedback)
+  unit-tests:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v4
+
+      - name: Set up pixi
+        uses: prefix-dev/[email protected]
+        with:
+          cache: true
+          locked: false
+
+      - name: Run unit tests
+        run: pixi run -e test pytest tests/unit -v
```
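Both jobs invoke pixi tasks that can be reproduced verbatim on a local machine, assuming the repository's pixi environments are installed:

```bash
# Full test job: run the suite with coverage in the test environment
pixi run -e test test-cov

# Unit-test job: fast feedback without containers
pixi run -e test pytest tests/unit -v
```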
24 changes: 21 additions & 3 deletions .pre-commit-config.yaml
```diff
@@ -36,6 +36,24 @@ repos:
     hooks:
       # lint github actions workflow yaml
       - id: actionlint
-
-## Suggest to add pytest hook that runs unit test | Prerequisite: split unit/integration test
-## https://github.com/datajoint/datajoint-python/issues/1211
+  - repo: https://github.com/pre-commit/mirrors-mypy
+    rev: v1.14.1
+    hooks:
+      - id: mypy
+        files: ^src/datajoint/
+        additional_dependencies:
+          - pydantic
+          - pydantic-settings
+          - types-PyMySQL
+          - types-tqdm
+          - pandas-stubs
+          - numpy
+  - repo: local
+    hooks:
+      - id: unit-tests
+        name: unit tests
+        entry: pytest tests/unit/ -v --tb=short
+        language: system
+        pass_filenames: false
+        always_run: true
+        stages: [pre-commit]
```
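The two new hooks can also be exercised individually by id, which is handy when iterating on type errors or a failing unit test:

```bash
pre-commit run mypy --all-files        # static type checks over src/datajoint/
pre-commit run unit-tests --all-files  # fast unit suite (always_run, so file filters don't apply)
```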
73 changes: 40 additions & 33 deletions README.md
````diff
@@ -100,23 +100,25 @@ Scientific data includes both structured metadata and large data objects (time s
 ### Prerequisites

 - [Docker](https://docs.docker.com/get-docker/) (Docker daemon must be running)
-- Python 3.10+
+- [pixi](https://pixi.sh) (recommended) or Python 3.10+

-### Quick Start
+### Quick Start with pixi (Recommended)

+[pixi](https://pixi.sh) manages all dependencies including Python, graphviz, and test tools:
+
 ```bash
-# Clone and install
+# Clone the repo
 git clone https://github.com/datajoint/datajoint-python.git
 cd datajoint-python
-pip install -e ".[test]"

-# Run all tests (containers start automatically via testcontainers)
-pytest tests/
+# Install dependencies and run tests (containers managed by testcontainers)
+pixi run test

-# Install and run pre-commit hooks
-pip install pre-commit
-pre-commit install
-pre-commit run --all-files
+# Run with coverage
+pixi run test-cov
+
+# Run pre-commit hooks
+pixi run pre-commit run --all-files
 ```

 ### Running Tests

@@ -126,16 +128,30 @@ Tests use [testcontainers](https://testcontainers.com/) to automatically manage

 ```bash
 # Run all tests (recommended)
-pytest tests/
+pixi run test

 # Run with coverage report
-pytest --cov-report term-missing --cov=datajoint tests/
+pixi run test-cov

+# Run only unit tests (no containers needed)
+pixi run -e test pytest tests/unit/
+
 # Run specific test file
-pytest tests/integration/test_blob.py -v
+pixi run -e test pytest tests/integration/test_blob.py -v
 ```

-# Run only unit tests (no containers needed)
-pytest tests/unit/
+**macOS Docker Desktop users:** If tests fail to connect to Docker, set `DOCKER_HOST`:
+```bash
+export DOCKER_HOST=unix://$HOME/.docker/run/docker.sock
+```
+
+### Alternative: Using pip
+
+If you prefer pip over pixi:
+
+```bash
+pip install -e ".[test]"
+pytest tests/
+```

 ### Alternative: External Containers

@@ -147,7 +163,8 @@ For development/debugging, you may prefer persistent containers that survive tes
 docker compose up -d db minio

 # Run tests using external containers
-DJ_USE_EXTERNAL_CONTAINERS=1 pytest tests/
+DJ_USE_EXTERNAL_CONTAINERS=1 pixi run test
+# Or with pip: DJ_USE_EXTERNAL_CONTAINERS=1 pytest tests/

 # Stop containers when done
 docker compose down

@@ -161,31 +178,21 @@ Run tests entirely in Docker (no local Python needed):
 docker compose --profile test up djtest --build
 ```

-### Alternative: Using pixi
-
-[pixi](https://pixi.sh) users can run tests with:
-
-```bash
-pixi install   # First time setup
-pixi run test  # Runs tests (testcontainers manages containers)
-```
-
 ### Pre-commit Hooks

 Pre-commit hooks run automatically on `git commit` to check code quality.
 **All hooks must pass before committing.**

 ```bash
 # Install hooks (first time only)
-pip install pre-commit
-pre-commit install
+pixi run pre-commit install
+# Or with pip: pip install pre-commit && pre-commit install

 # Run all checks manually
-pre-commit run --all-files
+pixi run pre-commit run --all-files

 # Run specific hook
-pre-commit run ruff --all-files
-pre-commit run codespell --all-files
+pixi run pre-commit run ruff --all-files
 ```

 Hooks include:

@@ -196,9 +203,9 @@ Hooks include:

 ### Before Submitting a PR

-1. **Run all tests**: `pytest tests/`
-2. **Run pre-commit**: `pre-commit run --all-files`
-3. **Check coverage**: `pytest --cov-report term-missing --cov=datajoint tests/`
+1. **Run all tests**: `pixi run test`
+2. **Run pre-commit**: `pixi run pre-commit run --all-files`
+3. **Check coverage**: `pixi run test-cov`

 ### Environment Variables
````
117 changes: 117 additions & 0 deletions RELEASE_MEMO.md
# DataJoint 2.0 Release Memo

## PyPI Release Process

### Steps

1. **Run "Manual Draft Release" workflow** on GitHub Actions
2. **Edit the draft release**:
- Change release name to `Release 2.0.0`
- Change tag to `v2.0.0`
3. **Publish the release**
4. Automation will:
- Update `version.py` to `2.0.0`
- Build and publish to PyPI
- Create PR to merge version update back to master

### Version Note

The release drafter computes the version from the previous tag (`v0.14.6`), so it would generate `0.14.7` or `0.15.0`. You must **manually edit** the release name to include `2.0.0`.

The regex on line 42 of `post_draft_release_published.yaml` extracts the version number from the release name:
```bash
VERSION=$(echo "${{ github.event.release.name }}" | grep -oP '\d+\.\d+\.\d+')
```
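
For example, a release named `Release 2.0.0` yields:

```bash
echo "Release 2.0.0" | grep -oP '\d+\.\d+\.\d+'
# prints: 2.0.0
```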

---

## Conda-Forge Release Process

DataJoint has a [conda-forge feedstock](https://github.com/conda-forge/datajoint-feedstock).

### How Conda-Forge Updates Work

Conda-forge has **automated bots** that detect new PyPI releases and create PRs automatically:

1. **You publish to PyPI** (via the GitHub release workflow)
2. **regro-cf-autotick-bot** detects the new version within ~24 hours
3. **Bot creates a PR** to the feedstock with updated version and hash
4. **Maintainers review and merge** (you're listed as a maintainer)
5. **Package builds automatically** for all platforms
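
To check for the bot's PR without leaving the terminal (an optional convenience; assumes the [GitHub CLI](https://cli.github.com) is installed):

```bash
gh pr list --repo conda-forge/datajoint-feedstock
```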

### Manual Update (if bot doesn't trigger)

If the bot doesn't create a PR, manually update the feedstock:

1. **Fork** [conda-forge/datajoint-feedstock](https://github.com/conda-forge/datajoint-feedstock)

2. **Edit `recipe/meta.yaml`**:
```yaml
{% set version = "2.0.0" %}

package:
  name: datajoint
  version: {{ version }}

source:
  url: https://pypi.io/packages/source/d/datajoint/datajoint-{{ version }}.tar.gz
  sha256: <NEW_SHA256_HASH>

build:
  number: 0  # Reset to 0 for new version
```

3. **Get the SHA256 hash**:
```bash
curl -sL https://pypi.org/pypi/datajoint/2.0.0/json | jq -r '.urls[] | select(.packagetype=="sdist") | .digests.sha256'
```

4. **Update license** (important for 2.0!):
```yaml
about:
  license: Apache-2.0  # Changed from LGPL-2.1-only
  license_file: LICENSE
```

5. **Submit PR** to the feedstock

### Action Items for 2.0 Release

1. **First**: Publish to PyPI via GitHub release (name it "Release 2.0.0")
2. **Wait**: ~24 hours for conda-forge bot to detect
3. **Check**: [datajoint-feedstock PRs](https://github.com/conda-forge/datajoint-feedstock/pulls) for auto-PR
4. **Review**: Ensure license changed from LGPL to Apache-2.0
5. **Merge**: As maintainer, approve and merge the PR

### Timeline

| Step | When |
|------|------|
| PyPI release | Day 0 |
| Bot detects & creates PR | Day 0-1 |
| Review & merge PR | Day 1-2 |
| Conda-forge package available | Day 1-2 |

### Verification

After release:
```bash
conda search datajoint -c conda-forge
# Should show 2.0.0
```

---

## Maintainers

- @datajointbot
- @dimitri-yatsenko
- @drewyangdev
- @guzman-raphael
- @ttngu207

## Links

- [datajoint-feedstock on GitHub](https://github.com/conda-forge/datajoint-feedstock)
- [datajoint on Anaconda.org](https://anaconda.org/conda-forge/datajoint)
- [datajoint on PyPI](https://pypi.org/project/datajoint/)
6 changes: 3 additions & 3 deletions docs/src/archive/manipulation/delete.md
```diff
@@ -26,6 +26,6 @@ Entities in a [part table](../design/tables/master-part.md) are usually removed
 consequence of deleting the master table.

 To enforce this workflow, calling `delete` directly on a part table produces an error.
-In some cases, it may be necessary to override this behavior.
-To remove entities from a part table without calling `delete` master, use the argument `force_parts=True`.
-To include the corresponding entries in the master table, use the argument `force_masters=True`.
+In some cases, it may be necessary to override this behavior using the `part_integrity` parameter:
+- `part_integrity="ignore"`: Remove entities from a part table without deleting from master (breaks integrity).
+- `part_integrity="cascade"`: Delete from parts and also cascade up to delete the corresponding master entries.
```
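A minimal sketch of the new calls, using hypothetical table and restriction names (`Session.Trial`, `key`); only the two `part_integrity` values are taken from the updated docs:

```python
# Calling delete directly on a part table raises an error by default,
# because part entries are expected to be removed via their master.

# Remove matching part entries only, leaving masters in place
# (explicitly breaks part-master integrity):
(Session.Trial & key).delete(part_integrity="ignore")

# Remove matching part entries and cascade up to delete their masters:
(Session.Trial & key).delete(part_integrity="cascade")
```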