Skip to content

Conversation

Collaborator

@gpeacock gpeacock commented Jan 21, 2026

Our current placeholders for unsigned signatures are not valid CBOR, so they generate parsing errors if you try to validate them. That wasn't a problem when the placeholder was only temporary, but since we want to support unsigned archives, it would be nice to be able to validate one of these and get a signature error rather than a CBOR parse error.

I'm not sure this is the answer. The better answer may be to always sign with some kind of temporary cert, but then we would need a cert in the codebase or in the default settings.

Reader::into_builder now preserves the reader context in the generated builder.
There is no longer a need to disable verify_after_reading before calling from_archive.
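
As an illustration, here is a minimal sketch of the archive round trip this simplifies, assuming the c2pa crate's Builder::to_archive and Builder::from_archive methods (the signatures shown are from memory and may differ from this branch):

```rust
use std::io::Cursor;

use c2pa::{Builder, Result};

// Hypothetical round trip: stash work in progress in an archive, then resume it.
// Because the archive now contains a valid (but unsigned) COSE placeholder, it
// can be reloaded without first disabling `verify_after_reading` in settings.
fn archive_round_trip(mut builder: Builder) -> Result<Builder> {
    let mut archive = Cursor::new(Vec::new());
    builder.to_archive(&mut archive)?;
    archive.set_position(0);
    Builder::from_archive(&mut archive)
}
```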
This changes sign_claim_placeholder() to create properly formatted but unsigned COSE_Sign1 structures instead of raw placeholder bytes. This eliminates CBOR parsing errors when loading archives.

Key changes:
- sign_claim_placeholder() now returns Result<Vec<u8>>
- Creates valid COSE_Sign1 with proper headers and detached payload
- Iteratively calculates the signature field size to match the target total size (see the sketch below)
- Ensures placeholder and real signature are exactly the same size
- Updated 8 call sites to handle Result return type

Benefits:
- No CBOR parsing errors - placeholders are valid CBOR/COSE
- Maintains patching mechanism - placeholder bytes still searchable
- Architecturally correct - represents actual unsigned manifest
- All 500+ tests pass including archive tests
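
A rough sketch of that placeholder construction, assuming the coset crate (which the SDK already uses for COSE); the unsigned_placeholder function below is illustrative only and is not the actual sign_claim_placeholder() in sdk/src/store.rs:

```rust
use coset::{iana, CoseSign1Builder, HeaderBuilder, TaggedCborSerializable};

/// Illustrative only: build a valid, unsigned COSE_Sign1 whose tagged CBOR
/// encoding is exactly `target_size` bytes, by iterating on the length of a
/// zero-filled signature field.
fn unsigned_placeholder(target_size: usize) -> Option<Vec<u8>> {
    // Start below the target and converge; CBOR length headers can shift the
    // total by a byte or two, so a few iterations may be needed.
    let mut sig_len = target_size.saturating_sub(64);
    for _ in 0..8 {
        let protected = HeaderBuilder::new()
            .algorithm(iana::Algorithm::ES256) // placeholder algorithm id
            .build();
        let sign1 = CoseSign1Builder::new()
            .protected(protected)
            // payload left unset => detached (nil) payload
            .signature(vec![0u8; sig_len])
            .build();
        let bytes = sign1.to_tagged_vec().ok()?;
        if bytes.len() == target_size {
            // Same total size as the real signature, so the bytes can later
            // be patched in place.
            return Some(bytes);
        }
        // Adjust the zero padding by the observed difference and try again.
        sig_len = (sig_len + target_size).saturating_sub(bytes.len());
    }
    None
}
```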

codecov bot commented Jan 21, 2026

Codecov Report

❌ Patch coverage is 71.11111% with 13 lines in your changes missing coverage. Please review.
✅ Project coverage is 75.68%. Comparing base (40922a8) to head (2dd0439).

Files with missing lines    Patch %    Lines
sdk/src/store.rs            71.11%     13 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1763      +/-   ##
==========================================
- Coverage   75.72%   75.68%   -0.05%     
==========================================
  Files         168      168              
  Lines       38939    38973      +34     
==========================================
+ Hits        29486    29496      +10     
- Misses       9453     9477      +24     

☔ View full report in Codecov by Sentry.


codspeed-hq bot commented Jan 21, 2026

Merging this PR will degrade performance by 10.19%

❌ 1 regressed benchmark
✅ 15 untouched benchmarks
⏩ 2 skipped benchmarks [1]

⚠️ Please fix the performance issues or acknowledge them on CodSpeed.

Performance Changes

Benchmark          BASE      HEAD      Efficiency
sign 100kb tiff    1.7 ms    1.9 ms    -10.19%

Comparing gpeacock/valid_placeholder (2dd0439) with main (40922a8)

Open in CodSpeed

Footnotes

  1. 2 benchmarks were skipped, so the baseline results were used instead. If they were deleted from the codebase, archive them on CodSpeed to remove them from the performance reports.

@tmathern
Contributor

This is an interesting question. It comes back to how stringent the checks should be for work-in-progress data, no?

A default certificate may be a thing, but what happens if it gets revoked? Then you get invalid errors again; that's expected. But what would it mean for your work in progress (the archives)? Could you still use them? Would you still want to, or should they be thrown away? Deciding that may help answer the overall root question here.

Thoughts @gpeacock ?
