feat: Add incremental re-hashing API for large models #562
Open: edonadei wants to merge 6 commits into sigstore:main from edonadei:main
Conversation
…tures

This method enables reading a manifest from a signature file without performing cryptographic verification. This is the foundation for incremental re-hashing, where we need to know which files were previously signed to determine which files need re-hashing.

The method:
- Reads and parses the Sigstore bundle JSON format
- Extracts the DSSE envelope payload
- Decodes the base64-encoded payload
- Validates manifest integrity (root digest matches resources)
- Returns a Manifest object

Includes comprehensive tests covering:
- Valid manifest extraction
- Rejection of inconsistent manifests
- Error handling for missing files, invalid JSON, and missing envelopes

Related to issue sigstore#160 - API for incremental model re-hashing

Signed-off-by: Emrick Donadei <[email protected]>
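The extraction steps above can be sketched in a few lines. This is a minimal illustration, not the PR's `Manifest.from_signature()` implementation: it assumes the Sigstore bundle JSON field names (`dsseEnvelope`, `payload`) and skips the integrity validation and Manifest construction the commit describes.

```python
import base64
import json


def manifest_payload_from_bundle(bundle_json: str) -> dict:
    """Decode the in-toto payload from a Sigstore bundle, without verifying.

    The DSSE envelope in a Sigstore bundle carries a base64-encoded
    in-toto statement; no signature check is performed here.
    """
    bundle = json.loads(bundle_json)
    envelope = bundle.get("dsseEnvelope")
    if envelope is None:
        raise ValueError("bundle has no DSSE envelope")
    payload = base64.b64decode(envelope["payload"])
    return json.loads(payload)
```

The real method would additionally check that the statement's root digest matches the listed resources before returning a Manifest object.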
Implements the core incremental hashing logic that compares the current model state against an existing manifest and only re-hashes changed files.

Key features:
- Reuses digests for unchanged files from the previous manifest
- Hashes new files not in the previous signature
- Handles modified files via the files_to_hash parameter
- Handles file deletions automatically (omits them from the new manifest)
- Uses the same parallel hashing as the standard file serializer

The algorithm:
1. Scan the current model directory for all files
2. Build the set of files to rehash from the files_to_hash parameter
3. For each current file:
   - If not in the old manifest: hash it (new file)
   - If in the files_to_hash list: hash it (modified file)
   - Otherwise: reuse its digest from the old manifest (unchanged)
4. Deleted files are automatically excluded (not on disk)
5. Return a manifest with a mix of reused and new digests

Usage for incremental signing (e.g., a 500GB model where a 1KB README changed):

    # Get changed files from git
    changed = subprocess.check_output(['git', 'diff', '--name-only', 'HEAD'])
    files_to_hash = [model_path / f for f in changed.decode().split()]

    # Only re-hash the changed file(s)
    serializer.serialize(model_path, files_to_hash=files_to_hash)

This provides significant performance improvements: only the changed 1KB is re-hashed instead of all 500GB.

Includes comprehensive tests covering:
- No changes: all digests reused
- New file added: only the new file is hashed
- Modified file: only the modified file is re-hashed
- File deleted (auto): removed from the manifest
- File deleted (in files_to_hash): safely ignored
- Mixed changes: all scenarios working together

Related to issue sigstore#160 - API for incremental model re-hashing

Signed-off-by: Emrick Donadei <[email protected]>
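The five-step algorithm above reduces to one decision per file on disk. Here is a self-contained sketch of that rule, assuming SHA-256 and plain dicts; the actual IncrementalSerializer additionally hashes in parallel and builds Manifest objects rather than dicts.

```python
import hashlib
from pathlib import Path


def incremental_digests(model_dir, old_digests, files_to_hash):
    """Return {relative_path: digest}, reusing old digests where possible.

    old_digests maps relative paths to digests from the previous manifest;
    files_to_hash lists paths known to have changed (e.g. from `git diff`).
    """
    root = Path(model_dir)
    rehash = {Path(p) for p in files_to_hash}
    new_digests = {}
    for path in sorted(p for p in root.rglob("*") if p.is_file()):
        rel = str(path.relative_to(root))
        if rel not in old_digests or path in rehash:
            # New or modified file: hash its contents.
            new_digests[rel] = hashlib.sha256(path.read_bytes()).hexdigest()
        else:
            # Unchanged file: reuse the digest from the previous manifest.
            new_digests[rel] = old_digests[rel]
    # Deleted files drop out automatically: they are no longer on disk,
    # so the loop never copies their old digests forward.
    return new_digests
```

Note that a deleted file listed in files_to_hash is harmless here for the same reason: the loop only visits files that exist.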
Integrates the IncrementalSerializer into the high-level hashing API,
making it accessible through the Config class.
Usage:
# Extract manifest from previous signature
old_manifest = Manifest.from_signature(Path("model.sig.old"))
# Configure incremental hashing
config = hashing.Config().use_incremental_serialization(
old_manifest,
hashing_algorithm="sha256"
)
# Get changed files and hash them
changed_files = [model_path / "README.md"]
new_manifest = config.hash(model_path, files_to_hash=changed_files)
This method follows the same pattern as use_file_serialization() and
use_shard_serialization(), providing a consistent API for users.
The configuration:
- Accepts an existing manifest to compare against
- Supports all the same hashing algorithms (SHA256, BLAKE2, BLAKE3)
- Supports the same parameters (chunk_size, max_workers, etc.)
- Returns Self for method chaining
Related to issue sigstore#160 - API for incremental model re-hashing
Signed-off-by: Emrick Donadei <[email protected]>
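The "Returns Self for method chaining" pattern mentioned above can be shown with a toy Config. The class and method names mirror the commit message, but this is a hypothetical sketch, not the real model_signing Config:

```python
class Config:
    """Toy sketch of the serialization-config pattern described above."""

    def __init__(self):
        self._serialization = None

    def use_file_serialization(self, hashing_algorithm="sha256"):
        self._serialization = ("file", hashing_algorithm)
        return self  # returning self enables method chaining

    def use_incremental_serialization(self, old_manifest,
                                      hashing_algorithm="sha256"):
        # Same shape as use_file_serialization(), plus the previous
        # manifest to compare against.
        self._serialization = ("incremental", old_manifest, hashing_algorithm)
        return self
```

Because every use_*_serialization() method returns the same instance, callers can chain configuration calls in one expression, which is the consistency property the commit aims for.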
Provides high-level convenience functions for incremental model signing
that combine all the pieces: manifest extraction, incremental hashing,
and signing.
Two levels of API:
1. Simple function API:
sign_incremental(
model_path="huge-model/",
old_signature_path="model.sig.old",
new_signature_path="model.sig.new",
files_to_hash=["huge-model/README.md"]
)
2. Configurable class API:
Config().use_elliptic_key_signer(private_key="key").sign_incremental(
model_path="huge-model/",
old_signature_path="model.sig.old",
new_signature_path="model.sig.new",
files_to_hash=["huge-model/README.md"]
)
Both APIs:
- Extract manifest from old signature automatically
- Configure incremental hashing
- Hash only changed/new files
- Sign the new manifest
- Write the new signature
Also added set_allow_symlinks() method to IncrementalSerializer to
maintain compatibility with the hashing Config class, which calls this
method before serialization.
This makes it trivial for users to incrementally sign large models
where only a few files changed, avoiding hours of re-hashing.
Related to issue sigstore#160 - API for incremental model re-hashing
Signed-off-by: Emrick Donadei <[email protected]>
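The four steps both APIs share (extract, hash incrementally, sign, write) can be sketched as a pipeline. All callables here are injected and hypothetical; the PR's real sign_incremental() wires in Manifest.from_signature, the IncrementalSerializer, and a signer instead:

```python
from pathlib import Path


def sign_incremental(model_path, old_signature_path, new_signature_path,
                     files_to_hash, extract_manifest, hash_model, sign):
    """Illustrative flow only: extract -> incremental hash -> sign -> write."""
    # Step 1: recover the previous digests without verifying the signature.
    old_manifest = extract_manifest(Path(old_signature_path))
    # Step 2: hash only new/changed files, reusing old digests elsewhere.
    new_manifest = hash_model(Path(model_path), old_manifest,
                              [Path(p) for p in files_to_hash])
    # Steps 3-4: sign the new manifest and write the new signature.
    Path(new_signature_path).write_text(sign(new_manifest))
    return new_manifest
```

The class API in the commit message layers signer configuration (e.g. use_elliptic_key_signer) on top of the same four steps.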
@mihaimaruseac if you can take a look at this: I tried to follow up on the last discussions from #160 in that thread (from 2024, so they're old) and to implement a solution. It's a bit long and probably imperfect, but I'm open to feedback.
Amazing! Will take a look this week (JupyterCon)
- Fix SIM118: Use 'key in dict' instead of 'key in dict.keys()'
- Fix E501: Break long lines to stay under 80 characters
- Fix F401: Remove unused pytest import from incremental_test.py
- Fix F401: Remove unused json import from manifest_test.py

All critical lint errors resolved.

Signed-off-by: Emrick Donadei <[email protected]>
Auto-format code with ruff to match the project's formatting standards:
- Adjust line breaking for long expressions
- Format function call arguments consistently
- Apply consistent parentheses placement

No functional changes, only formatting.

Signed-off-by: Emrick Donadei <[email protected]>
Motivation
This PR implements incremental model re-hashing to solve a critical performance problem when signing large ML models. Currently, when a user makes a small change to a large model (e.g., updating README.md in a 500GB model), the entire model must be re-hashed before re-signing, which can take hours. This makes it impractical to update documentation or configuration files in large models.
This PR adds a Python API that reuses digests from previous signatures for unchanged files, only re-hashing files that were added or modified. For a 500GB model with a 1KB documentation update, this reduces re-hashing time from hours to seconds (~500,000x speedup).
Changes
- Manifest.from_signature() - Extracts a manifest from existing signature files without cryptographic verification, enabling digest reuse
- IncrementalSerializer - Core implementation that compares the current model state against an existing manifest and only re-hashes changed files
- Config.use_incremental_serialization() - Integrates the incremental serializer into the hashing API
- sign_incremental() - High-level convenience API that combines extraction + incremental hashing + signing

Design Decision
Following the discussion in #160, this implementation uses a user-driven approach where changed files are specified via the files_to_hash parameter (e.g., from git diff). This was chosen over automatic change detection because:
Test Coverage:
Future Work: Based on maintainer feedback, these could be added in follow-up PRs:
Questions for Maintainers:
Testing this PR