feat: M13 VESSEL — Content-on-Node #276
Conversation
…ta, has, delete (#272)

New module src/content.js implements content-on-node using git's native CAS (content-addressed storage). Content is stored as git blobs via hash-object and referenced from WARP node properties under the `_content.*` prefix.

Property convention:
- `_content.sha` — git blob SHA
- `_content.mime` — MIME type
- `_content.size` — byte count
- `_content.encoding` — content encoding

Integrity is verified on read (re-hash and compare). The public API is exported from src/index.js.
New CLI subcommands under `git mind content`:
- `set <node> --from <file>` — attach content from a file
- `show <node> [--raw]` — display attached content
- `meta <node>` — show content metadata
- `delete <node>` — remove attached content

All commands support `--json` output. MIME type is auto-detected from the file extension, with a `--mime` override. Chalk-formatted metadata display via the new formatContentMeta() in format.js.
Three new CLI contract schemas for content commands:
- content-set.schema.json
- content-show.schema.json
- content-meta.schema.json (conditional: sha/mime/size/encoding required when hasContent=true)

All use Draft 2020-12, following the established M9.T1 pattern.
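The conditional can be expressed with a Draft 2020-12 if/then. A minimal sketch of the shape described above (field names follow the `_content.*` convention; the exact envelope in the repo may differ):

```json
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "type": "object",
  "properties": {
    "schemaVersion": { "const": 1 },
    "command": { "const": "content-meta" },
    "nodeId": { "type": "string" },
    "hasContent": { "type": "boolean" },
    "sha": { "type": "string", "pattern": "^[0-9a-f]{40}$" },
    "mime": { "type": "string" },
    "size": { "type": "integer", "minimum": 0 },
    "encoding": { "type": "string" }
  },
  "required": ["schemaVersion", "command", "nodeId", "hasContent"],
  "if": { "properties": { "hasContent": { "const": true } } },
  "then": { "required": ["sha", "mime", "size", "encoding"] }
}
```

Note the top-level `required` includes `hasContent`; without it the `if` would pass vacuously when the field is absent and force the `then` requirements anyway.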
…cts (#275)

New test/content.test.js with three suites:
- content store core (13 unit tests): write, read, integrity, meta, has, delete, overwrite, buffer
- content CLI commands (10 integration tests): set, show, meta, delete with flags
- content CLI schema contracts (4 tests): validates --json output against schemas

Also fixes:
- hasContent() checks the value, not just key presence (null after delete)
- content-meta.schema.json simplified for ajv strict mode
- VALID_SAMPLES added for the 3 new schemas in contracts.test.js

564 tests passing across 29 files.
Content-on-node milestone: 4 new CLI commands, content store API, 3 JSON schema contracts, 27 new tests. 564 tests across 29 files.
📝 Walkthrough

Adds a Git-backed content-on-node subsystem: new src/content.js with content CRUD, CLI content subcommands (set/show/meta/delete) with flags and JSON outputs, JSON schema contracts, formatting helpers, tests, and a package version bump to 3.3.0.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User as CLI User
    participant CLI as bin/git-mind.js
    participant Cmd as contentSet
    participant Graph as Graph
    participant Git as Git
    participant Patch as NodePatch
    User->>CLI: git mind content set <node> --from <file> [--mime]
    CLI->>Cmd: invoke contentSet(cwd, nodeId, filePath, opts)
    Cmd->>Graph: validate node exists / load graph
    Graph-->>Cmd: node exists
    Cmd->>Git: git hash-object -w --stdin (write blob)
    Git-->>Cmd: sha
    Cmd->>Patch: patch _content.sha/_content.mime/_content.size on node
    Patch-->>Cmd: patch applied
    Cmd-->>CLI: return result (JSON or human)
    CLI-->>User: output
```
```mermaid
sequenceDiagram
    participant User as CLI User
    participant CLI as bin/git-mind.js
    participant Cmd as contentShow
    participant Graph as Graph
    participant Git as Git
    participant Verify as Verifier
    User->>CLI: git mind content show <node> [--raw|--json]
    CLI->>Cmd: invoke contentShow(cwd, nodeId, opts)
    Cmd->>Graph: getContentMeta(nodeId)
    Graph-->>Cmd: { sha, mime, size } or null
    alt has content
        Cmd->>Git: git cat-file blob <sha>
        Git-->>Cmd: blob bytes
        Cmd->>Verify: recompute hash -> compare with stored sha
        Verify-->>Cmd: OK / mismatch
        alt --raw
            Cmd-->>User: raw blob bytes
        else --json
            Cmd-->>User: JSON envelope with content + meta
        else
            Cmd-->>User: metadata header + content body
        end
    else no content
        Cmd-->>User: error or JSON { hasContent: false }
    end
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ✅ 5 passed
Actionable comments posted: 8
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@docs/contracts/cli/content-meta.schema.json`:
- Around line 7-17: Add a JSON Schema conditional: add an "if" that checks {
"properties": { "hasContent": { "const": true } } } and a "then" that requires
["sha","mime","size","encoding"] so that when hasContent is true those fields
are mandatory; update the schema around the existing properties (schemaVersion,
command, nodeId, hasContent, sha, mime, size, encoding) and ensure the "then"
uses the same property names; optionally adjust the "sha" pattern to accept
SHA-256 (64 hex chars) or document that only SHA-1 (40 chars) is allowed.
In `@src/cli/commands.js`:
- Around line 838-841: The bug is that writeContent is called without an
explicit encoding so binary files get recorded with the default 'utf-8'
encoding; detect binary MIME types before calling writeContent and pass
encoding: 'base64' for non-text MIME types (use the same MIME_MAP/extname logic
already present) so writeContent stores encoding correctly; update the call site
that currently does writeContent(cwd, graph, nodeId, buf, { mime }) to compute
an encoding (e.g., determine if mime startsWith('text/') or is in a whitelist of
text types, otherwise 'base64') and call writeContent(cwd, graph, nodeId, buf, {
mime, encoding }); ensure this uses the same extname(filePath).toLowerCase() and
opts.mime resolution logic.
- Around line 871-876: The human-readable show path calls
formatContentMeta(meta) with meta from readContent that lacks nodeId, causing
missing node identity; update this call to pass the nodeId the same way other
sites do (e.g., call formatContentMeta(meta, nodeId) or otherwise add nodeId
into the meta object before calling) so the metadata header includes node
identity consistent with contentSet/contentMeta usage.
- Around line 835-836: Hoist the dynamic imports by adding readFile to the
existing top-level import from 'node:fs/promises' and extname to the existing
top-level import from 'node:path', then remove the in-function lines "const {
readFile } = await import('node:fs/promises');" and "const { extname } = await
import('node:path');" so the code uses the statically imported readFile and
extname symbols (ensure any references to readFile and extname in the
surrounding function continue to work after removal).
In `@src/content.js`:
- Around line 168-175: The hasContent function contains a redundant check —
after computing sha with propsMap?.get(KEYS.sha) ?? null, sha can never be
undefined, so remove the unnecessary "sha !== undefined" and simplify the return
to check only that sha is not null (e.g., return sha !== null) or use a truthy
check if appropriate; update the return in hasContent to reflect the simplified
condition and keep references to propsMap and KEYS.sha intact.
- Around line 101-112: The code is vulnerable because meta.sha is interpolated
into execSync which allows shell injection; replace the execSync call with
child_process.spawnSync using an argument array to avoid shell interpretation
(e.g., call spawnSync('git', ['cat-file', 'blob', meta.sha], {cwd, encoding:
meta.encoding === 'base64' ? 'buffer' : 'utf-8'})), then check the result.status
or result.error and throw the same Error with the nodeId if the spawn fails;
update the block that currently uses execSync (the content variable assignment
where meta.sha is used) to use spawnSync and preserve the encoding behavior and
error message.
In `@test/content.test.js`:
- Around line 56-66: The tests assert SHA length with a hardcoded 40-char regex
(expect(result.sha).toMatch(/^[0-9a-f]{40}$/)) which breaks when the repo uses
SHA-256; update every occurrence of that assertion (the
expect(...result.sha...).toMatch calls in the test file, e.g., in the
writeContent test and the other sha checks) to use a broader regex such as
/^[0-9a-f]{40,64}$/ (or /^[0-9a-f]+$/) so both SHA-1 and SHA-256 hashes pass.
- Around line 77-86: The test currently sets _content.sha to a nonexistent value
so readContent hits the "blob not found" branch; instead create a real,
different blob and point doc:readme's _content.sha at that existing-but-wrong
SHA so readContent reaches the integrity-check branch. Concretely: after
writeContent(tempDir, graph, 'doc:readme', 'original', ...), create a second
content blob (e.g., writeContent(tempDir, graph, 'doc:other', 'tampered', ...))
and obtain its stored SHA, then use
graph.createPatch()/patch.setProperty('doc:readme', '_content.sha',
<that-other-sha>)/patch.commit() and assert readContent(...) rejects with the
integrity-mismatch error expected from the integrity-check logic in readContent.
- CRITICAL: replace execSync shell interpolation with execFileSync + SHA validation regex in readContent() to prevent command injection
- Remove dead encoding parameter from content store, format, schemas, and tests — content is always UTF-8
- Static imports for readFile/extname in content CLI commands
- Pass nodeId to formatContentMeta in content show non-raw path
- Add dependentRequired constraint to content-meta schema
- Compile schema validators once in beforeAll, remove $id workaround
- Add 4 error-path CLI tests (568 tests total)
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/cli/commands.js`:
- Around line 814-824: Update the MIME_MAP constant to include common web asset
types used in the graph: add entries for '.css' mapping to 'text/css' and '.svg'
mapping to 'image/svg+xml'. Locate the MIME_MAP object in src/cli/commands.js
(the MIME_MAP declaration) and append these two key/value pairs so the existing
fallback to 'application/octet-stream' and the --mime override remain unchanged.
In `@src/content.js`:
- Around line 73-78: In writeContent (the block that runs git hash-object -w
--stdin) replace execSync with execFileSync using an argv array (e.g., ['git',
'hash-object', '-w', '--stdin']) and pass the binary input via the input option
to avoid shell interpolation; do the same for the verification hash call
referenced around the second occurrence (the verification on line ~123) so both
places mirror readContent's execFileSync usage and omit the shell.
---
Duplicate comments:
In `@docs/contracts/cli/content-meta.schema.json`:
- Around line 18-22: The schema currently uses dependentRequired to tie sha,
mime and size together but doesn't force them when hasContent is true; update
the JSON Schema by adding an if/then conditional that checks if the property
hasContent is true (if: { "properties": { "hasContent": { "const": true } } })
and in the then clause require ["sha","mime","size"]; keep the existing
dependentRequired intact so the three fields still depend on each other in other
cases.
In `@src/cli/commands.js`:
- Around line 833-851: The PR allows attaching arbitrary files in contentSet but
readContent currently fetches blobs as 'utf-8' which corrupts binary
round-trips; update the code to reject binary MIME types when writing so only
text content is allowed: in contentSet (and any caller that infers MIME via
MIME_MAP/fallback) validate the resolved mime against a whitelist of text/* and
known safe types and throw a clear error for binary types (e.g., images,
application/octet-stream), referencing MIME_MAP, contentSet, and readContent so
the validation is added before writeContent is invoked; ensure the error message
explains that only text content is supported in this version.
In `@src/content.js`:
- Around line 175-177: The boolean check at the end of the function is redundant
because propsMap?.get(KEYS.sha) ?? null converts undefined to null, so `sha` can
never be undefined; update the return to only test for null (e.g., replace the
`sha !== null && sha !== undefined` expression with a single `sha !== null`
check) in the function that calls graph.getNodeProps(nodeId) and assigns `sha`
from `propsMap?.get(KEYS.sha) ?? null`.
In `@test/content.test.js`:
- Around line 77-86: The test currently sets _content.sha to a nonexistent SHA
so the "blob not found" branch runs instead of the integrity check; instead,
create or write a second valid blob and obtain its real SHA, then use
patch.setProperty('doc:readme', '_content.sha', <otherBlobSha>) so the SHA
exists but points to different data, and finally call await
expect(readContent(tempDir, graph, 'doc:readme')).rejects.toThrow(/hash
mismatch|integrity|sha/i) to assert the re-hash/comparison code path in
readContent is exercised; use the existing helpers writeContent,
graph.createPatch, patch.setProperty and readContent to implement this.
- Use execFileSync consistently for all git calls in content.js —
zero shell invocations remain (eliminates execSync entirely)
- Add if/then conditional to content-meta schema: sha/mime/size
required when hasContent is true
- Remove redundant sha !== undefined check in hasContent()
- Broaden SHA regex in tests to accept SHA-256 ({40,64})
- Split misleading integrity test: blob-not-found + genuine
integrity mismatch via non-UTF-8 blob
- Add .css and .svg to MIME_MAP
- Update CHANGELOG with full review fixes
@coderabbitai review

✅ Actions performed: Review triggered.
Actionable comments posted: 5
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@docs/contracts/cli/content-meta.schema.json`:
- Around line 15-19: The schema's then block redundantly re-declares the same
property constraints as the top-level "properties" (sha, mime, size) while only
needing to enforce the conditional "required" set; remove the duplicated
property definitions inside the "then" clause and replace them with a "required"
array (e.g., ["sha","mime","size"]) so the top-level "properties" and
"additionalProperties": false continue to validate types/patterns and the "then"
only expresses the conditional requirement.
In `@src/content.js`:
- Around line 28-34: The current SHA_RE and assertValidSha only accept 40-char
SHA-1 digests and will reject valid 64-char SHA-256 object IDs; update the
SHA_RE used by assertValidSha (symbol: SHA_RE) to accept 40 or 64 hex characters
(e.g. change the regex to allow {40,64}) and keep assertValidSha's validation
logic intact so valid 64-char SHAs from sha256 repos do not throw false errors.
- Around line 110-127: readContent is reading git blobs with encoding: 'utf-8',
corrupting binary data and causing verifySha to differ; update readContent to
detect binary vs text (using meta.mime or an _content.encoding flag persisted by
writeContent/KEYS) and call execFileSync('git', ['cat-file','blob', meta.sha],
{cwd, encoding: null}) for binary to get a Buffer, while keeping encoding:
'utf-8' for text; ensure the function's return type and callers (notably
contentShow in commands.js) accept either string or Buffer and that writeContent
persists _content.encoding/mime into KEYS so readContent can decide reliably.
In `@test/content.test.js`:
- Around line 38-50: Extract the repeated git-repo setup in the test file into a
single async helper (e.g., createTestRepo or setupTestRepo) that performs
mkdtemp(join(tmpdir(), 'gitmind-content-')), runs git init and the two git
config commands using execFileSync (to avoid shell), and returns the tempDir;
then update the beforeEach/beforeAll blocks that call mkdtemp/execSync/initGraph
(and the places creating graph via initGraph and the test node patch creation)
to call this helper and pass the returned tempDir into initGraph so code in
functions like initGraph, graph.createPatch, and patch.commit remains unchanged.
Ensure the helper is reused in all three locations currently duplicating the
three-line git setup (including lines near the shown block and at the other two
noted ranges).
- Around line 167-173: Rename or augment the ambiguous test to make its intent
explicit: either rename the test case "'handles Buffer input'" to "'handles
Buffer (utf-8 compatible) input'" or add a new companion test that constructs a
non-UTF8 Buffer (e.g., Buffer.from([0x80,0x81,0x82])) and asserts that
readContent/readContent integrity fails, documenting the limitation caused by
encoding: 'utf-8' in src/content.js; update references to the functions
writeContent and readContent in test/content.test.js and add a brief comment
pointing to src/content.js encoding behavior.
---
Duplicate comments:
In `@docs/contracts/cli/content-meta.schema.json`:
- Line 16: The SHA regex only allows 40 hex chars (SHA-1); update the pattern
used for the "sha" property in both places (the top-level properties "sha" and
the conditional "then" constraint referencing "sha") to accept 40–64 hex
characters by changing the pattern from ^[0-9a-f]{40}$ to ^[0-9a-f]{40,64}$ so
the schema matches the tests that expect SHA-256 support.
In `@src/cli/commands.js`:
- Around line 838-841: The call to writeContent is missing an encoding so binary
files get stored without an encoding and later fail on read; update the call
site where mime is computed to also determine and pass an encoding (e.g. const
encoding = opts.encoding ?? (Buffer.isBuffer(buf) ? 'binary' : 'utf8')) and call
writeContent(cwd, graph, nodeId, buf, { mime, encoding }); keep references to
mime, buf, opts, MIME_MAP, and writeContent so the encoding decision aligns with
readContent/read persistence changes.
- content list command for discovering content-bearing nodes
- binary content support (base64 encoding for non-text MIME)
- content meta --verify flag for bulk integrity checks
- execFileSync standardization across all modules
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@docs/contracts/cli/content-meta.schema.json`:
- Line 16: The "sha" JSON Schema property currently uses the pattern
"^[0-9a-f]{40}$" which only allows 40-hex-character SHA-1 values; update the
pattern for the "sha" property to allow 40–64 hex characters (e.g.
"^[0-9a-f]{40,64}$") so SHA-256 is accepted, and apply this change to both
occurrences of the "sha" property in the schema to match the tests in
test/content.test.js.
---
Duplicate comments:
In `@docs/contracts/cli/content-meta.schema.json`:
- Around line 13-19: The then block duplicates property type/pattern constraints
already declared at the schema top-level; remove the duplicated "sha", "mime",
and "size" entries from then.properties and keep only the "required":
["sha","mime","size"] under "then" so the conditional relies on the top-level
"properties" (with additionalProperties: false) instead of repeating
types/patterns; update the schema's "then" to only express required.
In `@src/content.js`:
- Around line 27-35: The current SHA_RE and assertValidSha only accept 40-char
SHA-1 hashes and will reject 64-char SHA-256 object IDs; update SHA_RE to accept
both lengths (e.g. /^[0-9a-f]{40,64}$/) and adjust the comment/JSDoc to reflect
"40- or 64-hex-char git object hash" so assertValidSha uses the broadened regex
and its error message remains accurate when repos use sha256 object format;
locate and change the constants/strings in SHA_RE and the assertValidSha
function.
In `@test/content.test.js`:
- Around line 38-50: Extract the repeated git setup into a helper async function
(e.g., setupTestRepo) that creates the temp dir with mkdtemp, runs the three git
commands using execFileSync (git init, git config user.email, git config
user.name) with the cwd set to the temp dir, and returns the tempDir; replace
the three execSync lines in each beforeEach with tempDir = await
setupTestRepo(); update imports to use execFileSync and remove execSync; keep
the existing call to graph = await initGraph(tempDir) so only repo setup is
moved into setupTestRepo.
- Add missing content-delete.schema.json contract (H-1)
- Fix positional arg parsing in content subcommands with extractPositionals (M-1)
- Add else clause to content-meta.schema.json forbidding fields when hasContent=false (M-2)
- Change YAML MIME to application/yaml (L-2)
- Fix misleading SHA changelog claim (L-3)
- Fold [Unreleased] into [3.3.0] (L-1)
- 2 new schema contract tests (571 total)
- SHA patterns accept SHA-256 (40,64 hex chars) in all content schemas and src/content.js SHA_RE — consistent with test assertions
- Extract setupGitRepo() helper in test/content.test.js (DRY, uses execFileSync)
- content-meta.schema.json: add else clause, update SHA patterns
@coderabbitai review

✅ Actions performed: Review triggered.
Actionable comments posted: 5
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@bin/git-mind.js`:
- Around line 155-171: Replace the duplicated inline positional-argument
extraction in the "view" command handler with a call to the existing helper
function extractPositionals: remove the loop that manually skips --flag value
pairs and instead call extractPositionals(args) to produce the positional array,
then use that returned array where the current inline logic populates
positionals; reference the extractPositionals helper and the "view" case handler
to locate and simplify the duplicate code.
In `@docs/contracts/cli/content-delete.schema.json`:
- Line 14: Add a spot test to exercise the removed: false / previousSha: null
path by inserting the suggested spec into the "optional fields" block of
test/contracts.test.js: create a sample object with schemaVersion: 1, command:
'content-delete', nodeId: 'doc:readme', removed: false, previousSha: null,
obtain the validator via validators.get('content-delete.schema.json') and assert
validate(sample) is true; this will ensure the schema entry for "previousSha"
(the pattern allowing string or null) is actually validated for the null case.
In `@test/content.test.js`:
- Around line 158-162: Add unit tests mirroring the existing writeContent test
to cover the remaining error paths: assert that readContent(tempDir, graph,
'doc:nonexistent') rejects with /No content attached/ for nodes without content,
and assert that getContentMeta(tempDir, graph, 'doc:nonexistent') and
deleteContent(tempDir, graph, 'doc:nonexistent') each reject with /Node not
found/; follow the same pattern as the current it('writeContent fails on
non-existent node', ...) by using await expect(...).rejects.toThrow(/.../) and
reuse tempDir and graph setup to keep tests isolated and fast.
- Around line 280-302: Update the four tests that currently use bare toThrow()
to assert the specific failure reason: capture the thrown error from runCli (or
use expect(() => runCli(...)).toThrowError(/regex/)) and match on the expected
stderr/error message for each case (e.g., the "file not found" message for the
'content set --from nonexistent file' test and the "node not found" or "no
content" messages for the 'content show'/'content delete' tests). Locate the
tests using the runCli helper (the examples with 'content set', 'content show',
and 'content delete' in test/content.test.js) and replace the generic toThrow()
with assertions that match the expected error string or regex so the tests fail
if a different error (like an arg-parsing bug) occurs.
In `@test/contracts.test.js`:
- Around line 267-275: Add negative and positive spot-tests to exercise the
content-meta schema's else branch: in test/contracts.test.js within the
"optional fields" describe block, add one case that builds a sample with
hasContent: false but includes sha (using 'a'.repeat(40)) and asserts
validators.get('content-meta.schema.json') returns false, and a second case that
builds hasContent: false with no sha/mime/size and asserts the validator returns
true; use the existing pattern of getting the validator via validators.get and
expect(...).toBe(false/true) so the schema's forbidden fields are actually
tested.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@bin/git-mind.js`:
- Around line 199-207: The inline loop in the 'view' case duplicates the
extractPositionals helper; remove the manual loop that builds viewPositionals
from viewArgs and BOOLEAN_FLAGS and instead call extractPositionals(viewArgs)
(assign its result to viewPositionals). Ensure you keep the existing variable
names (viewArgs, viewPositionals, BOOLEAN_FLAGS) and any surrounding logic
unchanged so behavior remains identical while avoiding duplication.
In `@test/content.test.js`:
- Around line 158-162: Add three unit tests next to the existing writeContent
error test in test/content.test.js to cover the missing error paths: one that
calls readContent(tempDir, graph, 'doc:nodeWithNoContent') and expects rejection
with /No content attached to node/, one that calls getContentMeta(tempDir,
graph, 'doc:nonexistent') and expects rejection with /Node not found/, and one
that calls deleteContent(tempDir, graph, 'doc:nonexistent') and expects
rejection with /Node not found/; ensure you use the same async
expect(...).rejects.toThrow(...) pattern as the writeContent test so these
failures are exercised at unit level.
- Around line 280-302: The four negative tests ('content set --from nonexistent
file throws', 'content show on node without content throws', 'content show on
non-existent node throws', 'content delete on non-existent node throws')
currently use bare expect(...).toThrow(); update each to assert the specific
failure mode from runCli by matching the expected error message or error class
(e.g., use toThrowError(/file not found/) or toThrowErrorMatchingInlineSnapshot
for the file-missing case, and appropriate regex/messages for the "no content"
and "node not found" cases) so the tests fail on unrelated errors; locate calls
to runCli in those test blocks and replace the generic toThrow with targeted
message/type matchers.
In `@test/contracts.test.js`:
- Around line 267-282: Add spot-tests covering the untested branches for the two
schemas: for 'content-meta.schema.json' add a test that validates
hasContent:false with no sha/mime/size is accepted and another test that
supplies hasContent:false plus sha (e.g. 'a'.repeat(40)) and asserts the
validator rejects it; for 'content-delete.schema.json' add a test that validates
removed:false with previousSha:null is accepted. Use the existing validators map
(validators.get('content-meta.schema.json') and
validators.get('content-delete.schema.json')) and the same sample
nodeId/command/schemaVersion shape as other tests to keep consistency.
- Use extractPositionals helper in view case (DRY)
- Add unit tests for readContent/getContentMeta/deleteContent error paths
- Assert specific failure reasons in CLI error-path tests
- Add spot tests for content-delete removed:false/null and content-meta else branch
@coderabbitai review

✅ Actions performed: Review triggered.
Summary
- `src/content.js` — write/read/meta/has/delete using git's native CAS (`hash-object -w` / `cat-file blob`); SHA integrity verification on every read ([CAS-001] Content store core — src/content.js #272)
- `git mind content set|show|meta|delete` with `--json`, `--raw`, `--mime` flags; MIME auto-detection from file extension ([CAS-002] CLI content set|show|meta|delete commands #273)
- `content-set.schema.json`, `content-show.schema.json`, `content-meta.schema.json` in `docs/contracts/cli/` ([CAS-003] CLI JSON schema contracts for content commands #274)

Problem Statement
ADR Compliance (Required)
Relevant ADR(s)
Compliance Declaration
Architecture Laws Checklist (Hard Gates)
Canonical Truth & Context
(`--at`, `--observer`, `--trust`) or deterministically defaulted.

Determinism & Provenance
Artifact Hygiene
Contracts & Compatibility
Extension/Effects Safety (if applicable)
Scope Control
Backward Compatibility
`_content.*` node properties — existing nodes unaffected

Test Plan (Required)
Unit
Integration
npx vitest run test/content.test.js  # includes CLI integration tests

Determinism

npx vitest run test/content.test.js -- -t "integrity"

Contract/Schema

npx vitest run test/contracts.test.js
npx vitest run test/content.test.js -- -t "schema"

Policy Gates

git commit  # pre-commit hook runs mechanical gates

Security / Trust Impact
`execFileSync` arg arrays (no shell), `assertValidSha()` regex validation before any git call; `readContent()` throws on missing blob or integrity mismatch

Performance Impact
Observability / Debuggability
Operational Notes
`_content.*` properties removable via `git mind content delete`; orphaned git blobs cleaned by `git gc`

Linked Issues / Milestones
Summary by CodeRabbit
New Features
Documentation
Tests
Chores