Commit 46333e0

Merge PR #1312: DataJoint 2.0 - Jobs 2.0, CI, part_integrity, and more
feat: dj.Top order inheritance, part_integrity parameter, and storage fixes
2 parents 0e81f05 + f195110 commit 46333e0

File tree

18 files changed: +603 −158 lines changed


.github/workflows/test.yaml

Lines changed: 41 additions & 23 deletions

```diff
@@ -1,37 +1,55 @@
 name: Test
+
 on:
   push:
     branches:
-      - "**" # every branch
-      - "!gh-pages" # exclude gh-pages branch
-      - "!stage*" # exclude branches beginning with stage
+      - "**"
+      - "!gh-pages"
+      - "!stage*"
     paths:
-      - "src/datajoint"
-      - "tests"
+      - "src/datajoint/**"
+      - "tests/**"
+      - "pyproject.toml"
+      - "pixi.lock"
+      - ".github/workflows/test.yaml"
   pull_request:
     branches:
-      - "**" # every branch
-      - "!gh-pages" # exclude gh-pages branch
-      - "!stage*" # exclude branches beginning with stage
+      - "**"
+      - "!gh-pages"
+      - "!stage*"
     paths:
-      - "src/datajoint"
-      - "tests"
+      - "src/datajoint/**"
+      - "tests/**"
+      - "pyproject.toml"
+      - "pixi.lock"
+      - ".github/workflows/test.yaml"
+
 jobs:
   test:
     runs-on: ubuntu-latest
-    strategy:
-      matrix:
-        py_ver: ["3.10", "3.11", "3.12", "3.13"]
-        mysql_ver: ["8.0"]
     steps:
       - uses: actions/checkout@v4
-      - name: Set up Python ${{matrix.py_ver}}
-        uses: actions/setup-python@v5
+
+      - name: Set up pixi
+        uses: prefix-dev/[email protected]
         with:
-          python-version: ${{matrix.py_ver}}
-      - name: Integration test
-        env:
-          MYSQL_VER: ${{matrix.mysql_ver}}
-        run: |
-          pip install -e ".[test]"
-          pytest --cov-report term-missing --cov=datajoint tests
+          cache: true
+          locked: false
+
+      - name: Run tests
+        run: pixi run -e test test-cov
+
+  # Unit tests run without containers (faster feedback)
+  unit-tests:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v4
+
+      - name: Set up pixi
+        uses: prefix-dev/[email protected]
+        with:
+          cache: true
+          locked: false
+
+      - name: Run unit tests
+        run: pixi run -e test pytest tests/unit -v
```

.pre-commit-config.yaml

Lines changed: 21 additions & 3 deletions

```diff
@@ -36,6 +36,24 @@ repos:
     hooks:
       # lint github actions workflow yaml
       - id: actionlint
-
-## Suggest to add pytest hook that runs unit test | Prerequisite: split unit/integration test
-## https://github.com/datajoint/datajoint-python/issues/1211
+  - repo: https://github.com/pre-commit/mirrors-mypy
+    rev: v1.14.1
+    hooks:
+      - id: mypy
+        files: ^src/datajoint/
+        additional_dependencies:
+          - pydantic
+          - pydantic-settings
+          - types-PyMySQL
+          - types-tqdm
+          - pandas-stubs
+          - numpy
+  - repo: local
+    hooks:
+      - id: unit-tests
+        name: unit tests
+        entry: pytest tests/unit/ -v --tb=short
+        language: system
+        pass_filenames: false
+        always_run: true
+        stages: [pre-commit]
```
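The new local `unit-tests` hook runs pytest on `tests/unit/` at every commit. As a sketch, this is the kind of fast, container-free test it would pick up; the module name and the toy parser below are purely illustrative, not from the DataJoint codebase:

```python
# Hypothetical tests/unit/test_definition.py — fast, container-free,
# so it is suitable for the pre-commit `unit-tests` hook.
def primary_key_names(definition: str) -> list[str]:
    """Toy parser: attribute names above the '---' divider of a table definition."""
    names = []
    for line in definition.strip().splitlines():
        line = line.strip()
        if line.startswith("---"):
            break  # everything below the divider is a secondary attribute
        if line and not line.startswith("#"):
            names.append(line.split(":")[0].strip())
    return names

def test_primary_key_names():
    definition = """
    subject_id : int
    ---
    name : varchar(64)
    """
    assert primary_key_names(definition) == ["subject_id"]
```

Because `language: system` and `pass_filenames: false` are set, the hook runs the full unit suite in the developer's environment regardless of which files changed.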

README.md

Lines changed: 40 additions & 33 deletions

````diff
@@ -100,23 +100,25 @@ Scientific data includes both structured metadata and large data objects (time s
 ### Prerequisites
 
 - [Docker](https://docs.docker.com/get-docker/) (Docker daemon must be running)
-- Python 3.10+
+- [pixi](https://pixi.sh) (recommended) or Python 3.10+
 
-### Quick Start
+### Quick Start with pixi (Recommended)
+
+[pixi](https://pixi.sh) manages all dependencies including Python, graphviz, and test tools:
 
 ```bash
-# Clone and install
+# Clone the repo
 git clone https://github.com/datajoint/datajoint-python.git
 cd datajoint-python
-pip install -e ".[test]"
 
-# Run all tests (containers start automatically via testcontainers)
-pytest tests/
+# Install dependencies and run tests (containers managed by testcontainers)
+pixi run test
 
-# Install and run pre-commit hooks
-pip install pre-commit
-pre-commit install
-pre-commit run --all-files
+# Run with coverage
+pixi run test-cov
+
+# Run pre-commit hooks
+pixi run pre-commit run --all-files
 ```
 
 ### Running Tests
@@ -126,16 +128,30 @@ Tests use [testcontainers](https://testcontainers.com/) to automatically manage
 
 ```bash
 # Run all tests (recommended)
-pytest tests/
+pixi run test
 
 # Run with coverage report
-pytest --cov-report term-missing --cov=datajoint tests/
+pixi run test-cov
+
+# Run only unit tests (no containers needed)
+pixi run -e test pytest tests/unit/
 
 # Run specific test file
-pytest tests/integration/test_blob.py -v
+pixi run -e test pytest tests/integration/test_blob.py -v
+```
 
-# Run only unit tests (no containers needed)
-pytest tests/unit/
+**macOS Docker Desktop users:** If tests fail to connect to Docker, set `DOCKER_HOST`:
+```bash
+export DOCKER_HOST=unix://$HOME/.docker/run/docker.sock
+```
+
+### Alternative: Using pip
+
+If you prefer pip over pixi:
+
+```bash
+pip install -e ".[test]"
+pytest tests/
 ```
 
 ### Alternative: External Containers
@@ -147,7 +163,8 @@ For development/debugging, you may prefer persistent containers that survive tes
 docker compose up -d db minio
 
 # Run tests using external containers
-DJ_USE_EXTERNAL_CONTAINERS=1 pytest tests/
+DJ_USE_EXTERNAL_CONTAINERS=1 pixi run test
+# Or with pip: DJ_USE_EXTERNAL_CONTAINERS=1 pytest tests/
 
 # Stop containers when done
 docker compose down
@@ -161,31 +178,21 @@ Run tests entirely in Docker (no local Python needed):
 docker compose --profile test up djtest --build
 ```
 
-### Alternative: Using pixi
-
-[pixi](https://pixi.sh) users can run tests with:
-
-```bash
-pixi install  # First time setup
-pixi run test  # Runs tests (testcontainers manages containers)
-```
-
 ### Pre-commit Hooks
 
 Pre-commit hooks run automatically on `git commit` to check code quality.
 **All hooks must pass before committing.**
 
 ```bash
 # Install hooks (first time only)
-pip install pre-commit
-pre-commit install
+pixi run pre-commit install
+# Or with pip: pip install pre-commit && pre-commit install
 
 # Run all checks manually
-pre-commit run --all-files
+pixi run pre-commit run --all-files
 
 # Run specific hook
-pre-commit run ruff --all-files
-pre-commit run codespell --all-files
+pixi run pre-commit run ruff --all-files
 ```
 
 Hooks include:
@@ -196,9 +203,9 @@ Hooks include:
 
 ### Before Submitting a PR
 
-1. **Run all tests**: `pytest tests/`
-2. **Run pre-commit**: `pre-commit run --all-files`
-3. **Check coverage**: `pytest --cov-report term-missing --cov=datajoint tests/`
+1. **Run all tests**: `pixi run test`
+2. **Run pre-commit**: `pixi run pre-commit run --all-files`
+3. **Check coverage**: `pixi run test-cov`
 
 ### Environment Variables
````
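The `DOCKER_HOST` value in the README's macOS tip is just the Docker Desktop per-user socket path expressed as a `unix://` URL. A small sketch of how that string is assembled (the helper name is ours, purely illustrative):

```python
import os

def docker_host_for(home: str) -> str:
    # Docker Desktop on macOS exposes a per-user socket under ~/.docker/run
    return "unix://" + os.path.join(home, ".docker", "run", "docker.sock")

print(docker_host_for("/Users/alice"))  # → unix:///Users/alice/.docker/run/docker.sock
```

If that socket does not exist on your machine, Docker Desktop may be configured differently, and the default `DOCKER_HOST` should be left alone.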
RELEASE_MEMO.md

Lines changed: 117 additions & 0 deletions (new file)

# DataJoint 2.0 Release Memo

## PyPI Release Process

### Steps

1. **Run "Manual Draft Release" workflow** on GitHub Actions
2. **Edit the draft release**:
   - Change release name to `Release 2.0.0`
   - Change tag to `v2.0.0`
3. **Publish the release**
4. Automation will:
   - Update `version.py` to `2.0.0`
   - Build and publish to PyPI
   - Create PR to merge version update back to master

### Version Note

The release drafter computes the version from the previous tag (`v0.14.6`), so it would generate `0.14.7` or `0.15.0`. You must **manually edit** the release name to include `2.0.0`.

The regex on line 42 of `post_draft_release_published.yaml` extracts the version from the release name:

```bash
VERSION=$(echo "${{ github.event.release.name }}" | grep -oP '\d+\.\d+\.\d+')
```
27+
28+
## Conda-Forge Release Process
29+
30+
DataJoint has a [conda-forge feedstock](https://github.com/conda-forge/datajoint-feedstock).
31+
32+
### How Conda-Forge Updates Work
33+
34+
Conda-forge has **automated bots** that detect new PyPI releases and create PRs automatically:
35+
36+
1. **You publish to PyPI** (via the GitHub release workflow)
37+
2. **regro-cf-autotick-bot** detects the new version within ~24 hours
38+
3. **Bot creates a PR** to the feedstock with updated version and hash
39+
4. **Maintainers review and merge** (you're listed as a maintainer)
40+
5. **Package builds automatically** for all platforms
41+
42+
### Manual Update (if bot doesn't trigger)
43+
44+
If the bot doesn't create a PR, manually update the feedstock:
45+
46+
1. **Fork** [conda-forge/datajoint-feedstock](https://github.com/conda-forge/datajoint-feedstock)
47+
48+
2. **Edit `recipe/meta.yaml`**:
49+
```yaml
50+
{% set version = "2.0.0" %}
51+
52+
package:
53+
name: datajoint
54+
version: {{ version }}
55+
56+
source:
57+
url: https://pypi.io/packages/source/d/datajoint/datajoint-{{ version }}.tar.gz
58+
sha256: <NEW_SHA256_HASH>
59+
60+
build:
61+
number: 0 # Reset to 0 for new version
62+
```
63+
64+
3. **Get the SHA256 hash**:
65+
```bash
66+
curl -sL https://pypi.org/pypi/datajoint/2.0.0/json | jq -r '.urls[] | select(.packagetype=="sdist") | .digests.sha256'
67+
```
68+
4. **Update license** (important for 2.0!):

   ```yaml
   about:
     license: Apache-2.0  # Changed from LGPL-2.1-only
     license_file: LICENSE
   ```

5. **Submit PR** to the feedstock

### Action Items for 2.0 Release

1. **First**: Publish to PyPI via GitHub release (name it "Release 2.0.0")
2. **Wait**: ~24 hours for conda-forge bot to detect
3. **Check**: [datajoint-feedstock PRs](https://github.com/conda-forge/datajoint-feedstock/pulls) for auto-PR
4. **Review**: Ensure license changed from LGPL to Apache-2.0
5. **Merge**: As maintainer, approve and merge the PR

### Timeline

| Step | When |
|------|------|
| PyPI release | Day 0 |
| Bot detects & creates PR | Day 0-1 |
| Review & merge PR | Day 1-2 |
| Conda-forge package available | Day 1-2 |

### Verification

After release:

```bash
conda search datajoint -c conda-forge
# Should show 2.0.0
```

---

## Maintainers

- @datajointbot
- @dimitri-yatsenko
- @drewyangdev
- @guzman-raphael
- @ttngu207

## Links

- [datajoint-feedstock on GitHub](https://github.com/conda-forge/datajoint-feedstock)
- [datajoint on Anaconda.org](https://anaconda.org/conda-forge/datajoint)
- [datajoint on PyPI](https://pypi.org/project/datajoint/)

docs/src/archive/manipulation/delete.md

Lines changed: 3 additions & 3 deletions

```diff
@@ -26,6 +26,6 @@ Entities in a [part table](../design/tables/master-part.md) are usually removed
 consequence of deleting the master table.
 
 To enforce this workflow, calling `delete` directly on a part table produces an error.
-In some cases, it may be necessary to override this behavior.
-To remove entities from a part table without calling `delete` master, use the argument `force_parts=True`.
-To include the corresponding entries in the master table, use the argument `force_masters=True`.
+In some cases, it may be necessary to override this behavior using the `part_integrity` parameter:
+- `part_integrity="ignore"`: Remove entities from a part table without deleting from master (breaks integrity).
+- `part_integrity="cascade"`: Delete from parts and also cascade up to delete the corresponding master entries.
```
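The two `part_integrity` modes can be illustrated with a toy in-memory model: plain dicts keyed by primary key stand in for the master and part tables. This is a sketch of the semantics only, not the DataJoint `delete` API:

```python
# Toy model of part_integrity semantics (not the DataJoint API).
def delete_part(master: dict, part: dict, keys, part_integrity: str = "error"):
    if part_integrity == "error":
        # default: part tables must be deleted through their master
        raise RuntimeError("delete part tables via their master")
    for k in keys:
        part.pop(k, None)             # both modes remove the part entries
        if part_integrity == "cascade":
            master.pop(k, None)       # cascade up to the master entry
    # "ignore" leaves the master rows in place, breaking integrity

master = {1: "m1", 2: "m2"}
part = {1: "p1", 2: "p2"}
delete_part(master, part, [1], part_integrity="cascade")
print(master, part)  # → {2: 'm2'} {2: 'p2'}
```

With `part_integrity="ignore"` the same call would leave `master` untouched, producing a master entry with no corresponding part rows, which is exactly the integrity breakage the docs warn about.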
