[ENG-8221][ENG-8038] less database data; part 1 #877
Merged: aaxelb merged 2 commits into CenterForOpenScience:develop from aaxelb:feature/eng-8038--migrate-expirations on Jun 16, 2025.
New file (51 lines) — a management command:

```python
import datetime
import time

from django.db.models import OuterRef

from trove.util.django import pk_chunked

from share import models as share_db
from share.management.commands import BaseShareCommand
from trove import models as trove_db


class Command(BaseShareCommand):
    # copy all non-null values from `RawDatum.expiration_date` to `SupplementaryIndexcardRdf.expiration_date`
    # (while being overly cautious to avoid joins on `RawDatum` or `SourceUniqueIdentifier`)
    # meant to be run after trove migration 0008_expiration_dates, before share.RawDatum is deleted

    def add_arguments(self, parser):
        parser.add_argument('--chunk-size', type=int, default=666)
        parser.add_argument('--today', type=datetime.date.fromisoformat, default=datetime.date.today())
        parser.add_argument('--continue-after', type=str, default=None)

    def handle(self, *args, chunk_size: int, today: datetime.date, continue_after, **kwargs):
        _before = time.perf_counter()
        _total_updated = 0
        _raw_qs = (
            share_db.RawDatum.objects.latest_for_each_suid()
            .filter(expiration_date__gt=today)  # ignore the expired (and the non-expiring)
        )
        if continue_after is not None:
            _raw_qs = _raw_qs.filter(pk__gt=continue_after)
        for _raw_pk_chunk in pk_chunked(_raw_qs, chunk_size):
            _supp_qs = trove_db.SupplementaryIndexcardRdf.objects.filter(
                from_raw_datum_id__in=_raw_pk_chunk,
                expiration_date__isnull=True,  # avoid overwriting non-null values
            )
            _updated_count = _supp_qs.update(
                expiration_date=share_db.RawDatum.objects.filter(
                    id=OuterRef('from_raw_datum_id'),
                ).values('expiration_date'),
            )
            _total_updated += _updated_count
            _last_pk = _raw_pk_chunk[-1]
            _elapsed = time.perf_counter() - _before
            self.stdout.write(
                f'{_elapsed:.2f}: migrated {_updated_count} of {len(_raw_pk_chunk)} --continue-after={_last_pk}',
            )
        _total_seconds = time.perf_counter() - _before
        self.stdout.write(
            self.style.SUCCESS(f'done! migrated {_total_updated} in {_total_seconds}s'),
        )
```
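For reference, a hedged sketch of invoking the command from Python. The command's module name isn't shown in this diff, so `copy_expiration_dates` below is a hypothetical placeholder:

```python
import datetime

from django.core.management import call_command

# 'copy_expiration_dates' is a hypothetical name; the actual command module
# isn't visible in this diff. Keyword options are handed to handle() directly,
# so pass already-parsed values (int, date, str).
call_command(
    'copy_expiration_dates',
    chunk_size=500,
    today=datetime.date(2025, 6, 16),
    continue_after='12345',  # resume after the last pk printed by a prior run
)
```

Since each progress line echoes `--continue-after=<last pk>`, an interrupted run can be resumed from the last chunk that completed.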
New file (36 lines) — trove migration `0008_expiration_dates`:

```python
# Generated by Django 3.2.25 on 2025-06-09 15:42

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('trove', '0007_rawdata_fks_do_nothing'),
    ]

    operations = [
        migrations.AddField(
            model_name='archivedindexcardrdf',
            name='expiration_date',
            field=models.DateField(blank=True, help_text='An (optional) date when this description will no longer be valid.', null=True),
        ),
        migrations.AddField(
            model_name='latestindexcardrdf',
            name='expiration_date',
            field=models.DateField(blank=True, help_text='An (optional) date when this description will no longer be valid.', null=True),
        ),
        migrations.AddField(
            model_name='supplementaryindexcardrdf',
            name='expiration_date',
            field=models.DateField(blank=True, help_text='An (optional) date when this description will no longer be valid.', null=True),
        ),
        migrations.AddIndex(
            model_name='latestindexcardrdf',
            index=models.Index(fields=['expiration_date'], name='trove_lates_expirat_92ac89_idx'),
        ),
        migrations.AddIndex(
            model_name='supplementaryindexcardrdf',
            index=models.Index(fields=['expiration_date'], name='trove_suppl_expirat_3ea6e1_idx'),
        ),
    ]
```
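The two new indexes suggest that expiry will be enforced by filtering on `expiration_date`; a minimal sketch of such a query, assuming standard ORM usage (no such sweep appears in this PR):

```python
import datetime

from trove import models as trove_db

# a null expiration_date means "never expires" (the field is optional),
# so an expiry sweep only matches rows dated at or before today;
# the new expiration_date indexes keep this filter cheap
_today = datetime.date.today()
_expired = trove_db.LatestIndexcardRdf.objects.filter(expiration_date__lte=_today)
```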
New file (29 lines) — the `trove.util.django` module, home of the `pk_chunked` helper imported above:

```python
from __future__ import annotations
from collections.abc import Iterator


__all__ = ('pk_chunked',)


def pk_chunked(queryset, chunksize: int) -> Iterator[list]:
    '''pk_chunked: get primary key values, in chunks, for the given queryset

    yields non-empty lists of primary keys up to `chunksize` long
    '''
    _ordered_qs = queryset.order_by('pk')
    _prior_end_pk = None
    while True:  # for each chunk:
        _qs = (
            _ordered_qs
            if _prior_end_pk is None
            else _ordered_qs.filter(pk__gt=_prior_end_pk)
        )
        # load primary key values only
        _pks = list(_qs.values_list('pk', flat=True)[:chunksize])
        if not _pks:
            break  # done
        _end_pk = _pks[-1]
        if (_prior_end_pk is not None) and (_end_pk <= _prior_end_pk):
            raise RuntimeError(f'sentinel pks not ascending?? got {_end_pk} after {_prior_end_pk}')
        _prior_end_pk = _end_pk
        yield _pks
```
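A hedged usage sketch: `pk_chunked` pairs naturally with `pk__in` bulk operations, as the management command above demonstrates. Any queryset whose rows can be ordered by `pk` works:

```python
from trove import models as trove_db
from trove.util.django import pk_chunked

# keyset pagination: each chunk re-queries with pk__gt on the prior chunk's
# last pk, so memory stays bounded and no OFFSET scans are involved
for _pk_chunk in pk_chunked(trove_db.SupplementaryIndexcardRdf.objects.all(), 1000):
    _chunk_qs = trove_db.SupplementaryIndexcardRdf.objects.filter(pk__in=_pk_chunk)
    ...  # some bulk operation on _chunk_qs
```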
Rewrote this loop slightly to be non-modifying, and did some benchmarking. A little slow to start due to `_raw_qs` and the size of the `RawDatum` table, but appears workable. `EXPLAIN ANALYZE`'d the slowest query, and found no apparent way to speed it up.
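For anyone re-running the benchmark, Django's standard `QuerySet.explain()` (with `analyze=True` on PostgreSQL) can regenerate the plan. A sketch mirroring the command's `_raw_qs`:

```python
import datetime

from share import models as share_db

_raw_qs = (
    share_db.RawDatum.objects.latest_for_each_suid()
    .filter(expiration_date__gt=datetime.date.today())
)
# EXPLAIN ANALYZE the pk-only, LIMITed query that pk_chunked actually issues
print(_raw_qs.values_list('pk', flat=True)[:666].explain(analyze=True))
```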