Conversation
📝 Walkthrough

Adds CycloneDX pedigree processing: a new event, task, and importer that extract pedigree "resolved" patches from BOMs, map them to project components/vulnerabilities, and mark matching findings as RESOLVED; integrates pedigree handling into the BOM upload flow and adds a Nix flake/dev environment and tests.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant U as User / BOM Upload
    participant BUPT as BomUploadProcessingTask
    participant VAN as VulnerabilityAnalysis (event)
    participant BPAE as BomPedigreeAnalysisEvent
    participant BPAT as BomPedigreeAnalysisTask
    participant CDPI as CycloneDXPedigreeImporter
    participant DB as Database
    U->>BUPT: Upload BOM
    BUPT->>BUPT: Extract pedigree map (bomRef → vulnIds)
    BUPT->>VAN: Trigger vulnerability analysis (chain pedigree map)
    VAN->>DB: Persist findings
    VAN-->>BUPT: Analysis complete
    BUPT->>BPAE: Dispatch BomPedigreeAnalysisEvent(with map)
    BPAE->>BPAT: Event received
    BPAT->>CDPI: applyPedigree(map)
    CDPI->>DB: Query components by bomRef/UUID
    CDPI->>DB: Match findings → update analysis state to RESOLVED
    CDPI->>DB: Add analysis comments/suppressions
    CDPI-->>BPAT: Done
```
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~35 minutes
🚥 Pre-merge checks (✅ 1 | ❌ 2)
❌ Failed checks: 1 warning, 1 inconclusive
✅ Passed checks: 1 passed
Actionable comments posted: 2
🧹 Nitpick comments (2)
src/main/java/org/dependencytrack/event/BomPedigreeAnalysisEvent.java (1)
38-43: Make the event payload immutable at construction time.

This event is dispatched asynchronously; storing mutable `Map`/`Set` references directly makes it vulnerable to accidental post-dispatch mutation.

♻️ Proposed hardening

```diff
+import java.util.Objects;
+import java.util.stream.Collectors;
@@
     public BomPedigreeAnalysisEvent(
             final UUID projectUuid,
             final Map<UUID, Set<String>> resolvedVulnIdsByComponentUuid) {
-        this.projectUuid = projectUuid;
-        this.resolvedVulnIdsByComponentUuid = resolvedVulnIdsByComponentUuid;
+        this.projectUuid = Objects.requireNonNull(projectUuid, "projectUuid must not be null");
+        this.resolvedVulnIdsByComponentUuid = Objects.requireNonNull(resolvedVulnIdsByComponentUuid, "resolvedVulnIdsByComponentUuid must not be null")
+                .entrySet().stream()
+                .collect(Collectors.toUnmodifiableMap(
+                        Map.Entry::getKey,
+                        entry -> Set.copyOf(entry.getValue())
+                ));
     }
```

Also applies to: 49-50
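As a standalone check of the behavior this hardening relies on, the sketch below (class and method names are invented for the demo; this is not part of the Dependency-Track codebase) shows that `Collectors.toUnmodifiableMap` combined with `Set.copyOf` yields a structure that rejects mutation at every level:

```java
import java.util.Map;
import java.util.Set;
import java.util.UUID;
import java.util.stream.Collectors;

// Illustrative stand-in for the event's defensive copy; not the real
// BomPedigreeAnalysisEvent class.
public final class ImmutablePayloadDemo {

    // Deep-copies the payload map: the outer map and every value set
    // become unmodifiable, so callers cannot mutate them after dispatch.
    public static Map<UUID, Set<String>> deepCopy(final Map<UUID, Set<String>> source) {
        return source.entrySet().stream()
                .collect(Collectors.toUnmodifiableMap(
                        Map.Entry::getKey,
                        entry -> Set.copyOf(entry.getValue())));
    }

    // Returns true when mutating a value set of the copy throws,
    // i.e. the copy is deeply immutable.
    public static boolean rejectsMutation(final Map<UUID, Set<String>> copy, final UUID key) {
        try {
            copy.get(key).add("CVE-0000-0000");
            return false;
        } catch (UnsupportedOperationException e) {
            return true;
        }
    }
}
```

Note that `Map.copyOf` alone would be shallow if the value sets stayed mutable, which is why the suggestion also copies each set.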
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `src/main/java/org/dependencytrack/event/BomPedigreeAnalysisEvent.java` around lines 38-43: the constructor for BomPedigreeAnalysisEvent stores mutable Map/Set references; make defensive immutable copies at construction so the asynchronous event payload cannot be mutated after dispatch. In the BomPedigreeAnalysisEvent(UUID projectUuid, Map<UUID, Set<String>> resolvedVulnIdsByComponentUuid) constructor (and the other overload around lines 49-50), replace the direct assignment with an immutable copy: map each key to an immutable Set copy (e.g., Set.copyOf(value) or Collections.unmodifiableSet(new HashSet<>(value))), wrap the map with Map.copyOf(...) or Collections.unmodifiableMap(...), and assign that to the resolvedVulnIdsByComponentUuid field so the stored payload is immutable.

src/test/java/org/dependencytrack/parser/cyclonedx/CycloneDXPedigreeImporterTest.java (1)
322-357: Add a regression test for multi-BOM-ref deduplication.

Current tests don't cover the case where multiple BOM refs deduplicate to one persisted component identity. That scenario is important for verifying that the UUID pedigree map does not drop resolved IDs during ref-to-component collapsing.
Also applies to: 361-390
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/test/java/org/dependencytrack/parser/cyclonedx/CycloneDXPedigreeImporterTest.java` around lines 322 - 357, Add a regression test to CycloneDXPedigreeImporterTest that exercises deduplication of multiple BOM refs collapsing to one persisted component so the UUID pedigree map doesn't drop resolved IDs: create a persisted Component (using qm.createComponent), persist two or more distinct BOM ref identities that should map to the same persisted component UUID (simulate by creating components or mapping inputs), attach a Vulnerability to that persisted component, then call pedigreeImporter.applyPedigree(qm, Map.of(persistedComponent.getUuid(), Set.of("CVE-2021-44228"))) but feed inputs that mimic multiple BOM refs resolving to that one UUID and assert the Analysis for the project is RESOLVED and suppressed; add this as a new `@Test` (e.g., shouldHandleMultiBomRefDeduplication) near the other applyPedigree tests so applyPedigree, CycloneDXPedigreeImporterTest, qm, persistedComponent, and Analysis assertions are exercised.
ℹ️ Review info
Configuration used: defaults
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`flake.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (8)
- `.envrc`
- `flake.nix`
- `src/main/java/org/dependencytrack/event/BomPedigreeAnalysisEvent.java`
- `src/main/java/org/dependencytrack/event/EventSubsystemInitializer.java`
- `src/main/java/org/dependencytrack/parser/cyclonedx/CycloneDXPedigreeImporter.java`
- `src/main/java/org/dependencytrack/tasks/BomPedigreeAnalysisTask.java`
- `src/main/java/org/dependencytrack/tasks/BomUploadProcessingTask.java`
- `src/test/java/org/dependencytrack/parser/cyclonedx/CycloneDXPedigreeImporterTest.java`
```java
if (!resolvedIds.isEmpty() && cdxComponent.getBomRef() != null) {
    result.put(cdxComponent.getBomRef(), resolvedIds);
}
```
Avoid overwriting resolved IDs for repeated bomRef entries.
Using `put` here replaces prior IDs for the same ref. Merge the sets instead to prevent silent loss when duplicate refs appear.
🔧 Proposed fix

```diff
 if (!resolvedIds.isEmpty() && cdxComponent.getBomRef() != null) {
-    result.put(cdxComponent.getBomRef(), resolvedIds);
+    result.computeIfAbsent(cdxComponent.getBomRef(), ignored -> new java.util.HashSet<>())
+            .addAll(resolvedIds);
 }
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```java
if (!resolvedIds.isEmpty() && cdxComponent.getBomRef() != null) {
    result.computeIfAbsent(cdxComponent.getBomRef(), ignored -> new java.util.HashSet<>())
            .addAll(resolvedIds);
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In
`@src/main/java/org/dependencytrack/parser/cyclonedx/CycloneDXPedigreeImporter.java`
around lines 168 - 170, The current code in CycloneDXPedigreeImporter uses
result.put(cdxComponent.getBomRef(), resolvedIds) which overwrites any existing
set for the same bomRef and can silently drop IDs; change this to merge with the
existing set instead (for example use
result.computeIfAbsent(cdxComponent.getBomRef(), k -> new
HashSet<>()).addAll(resolvedIds) or result.merge(..., (oldSet, newSet) -> {
oldSet.addAll(newSet); return oldSet; })) so duplicate bomRef entries accumulate
IDs rather than replace them.
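The overwrite-vs-merge difference the comment describes can be reproduced in isolation; the ref strings and CVE IDs below are made up for the demo and are not taken from any real BOM:

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Contrasts the flagged put() pattern with the suggested merge for
// duplicate bomRef keys. Ref names and vuln IDs are illustrative.
public final class BomRefMergeDemo {

    // Flagged pattern: the second put() replaces the first set entirely.
    public static Map<String, Set<String>> withPut(
            final String ref, final Set<String> first, final Set<String> second) {
        final Map<String, Set<String>> result = new HashMap<>();
        result.put(ref, first);
        result.put(ref, second); // silently drops everything in `first`
        return result;
    }

    // Suggested fix: accumulate IDs for repeated refs instead of replacing.
    public static Map<String, Set<String>> withMerge(
            final String ref, final Set<String> first, final Set<String> second) {
        final Map<String, Set<String>> result = new HashMap<>();
        result.computeIfAbsent(ref, ignored -> new HashSet<>()).addAll(first);
        result.computeIfAbsent(ref, ignored -> new HashSet<>()).addAll(second);
        return result;
    }
}
```

The `result.merge(ref, new HashSet<>(ids), (oldSet, newSet) -> { oldSet.addAll(newSet); return oldSet; })` form mentioned in the prompt is an equivalent alternative.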
e0c4bed to e07acd7
♻️ Duplicate comments (1)
src/main/java/org/dependencytrack/tasks/BomUploadProcessingTask.java (1)
866-885: ⚠️ Potential issue | 🟡 Minor

Pedigree map construction may still lose resolved IDs after component deduplication.

When multiple CycloneDX components with different `bomRef` values are deduplicated to a single persisted component, only one `bomRef` survives on the persisted component. The current implementation iterates over `processedComponents` and looks up by `component.getBomRef()`, so resolved IDs attached to non-retained `bomRef` values are lost.

Consider iterating over `resolvedVulnIdsByBomRef.entrySet()` and using `identitiesByBomRef` to resolve the component UUID, then merging IDs for the same UUID.

🔧 Proposed fix

```diff
 private static Map<UUID, Set<String>> buildPedigreeMap(
         final org.cyclonedx.model.Bom cdxBom,
-        final List<Component> processedComponents) {
+        final Map<String, ComponentIdentity> identitiesByBomRef) {
     final Map<String, Set<String>> resolvedVulnIdsByBomRef =
             CycloneDXPedigreeImporter.extractResolvedVulnIdsByBomRef(cdxBom);
     if (resolvedVulnIdsByBomRef.isEmpty()) {
         return Map.of();
     }
     final var result = new HashMap<UUID, Set<String>>();
-    for (final Component component : processedComponents) {
-        if (component.getBomRef() == null) {
+    for (final var entry : resolvedVulnIdsByBomRef.entrySet()) {
+        final ComponentIdentity identity = identitiesByBomRef.get(entry.getKey());
+        if (identity == null || identity.getUuid() == null || entry.getValue().isEmpty()) {
             continue;
         }
-        final Set<String> resolvedIds = resolvedVulnIdsByBomRef.get(component.getBomRef());
-        if (resolvedIds != null && !resolvedIds.isEmpty()) {
-            result.put(component.getUuid(), resolvedIds);
-        }
+        result.computeIfAbsent(identity.getUuid(), ignored -> new HashSet<>())
+                .addAll(entry.getValue());
     }
     return result;
 }
```

Also update the call site at line 312:

```diff
-final Map<UUID, Set<String>> pedigreeByComponentUuid = buildPedigreeMap(cdxBom, processedComponents);
+final Map<UUID, Set<String>> pedigreeByComponentUuid = buildPedigreeMap(cdxBom, identitiesByBomRef);
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/main/java/org/dependencytrack/tasks/BomUploadProcessingTask.java` around lines 866 - 885, The current buildPedigreeMap method loses resolved IDs when multiple CycloneDX bomRef values deduplicate to one persisted Component; change buildPedigreeMap to iterate over CycloneDXPedigreeImporter.extractResolvedVulnIdsByBomRef(cdxBom).entrySet(), for each entry look up the persisted Component UUID via the identitiesByBomRef mapping (the dedup map created during processing) rather than using component.getBomRef(), and merge the resolved ID sets into a single Set<String> per UUID (use result.computeIfAbsent(uuid, ...) and addAll) so all resolved IDs from different bomRefs that map to the same persisted component are preserved; also update the caller that currently supplies processedComponents to pass or make available the identitiesByBomRef map so buildPedigreeMap can resolve bomRef→UUID.
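The inversion the fix proposes — iterating the bomRef-keyed map and collapsing onto component UUIDs — can be sketched standalone. Here a plain `Map<String, UUID>` stands in for the real `identitiesByBomRef` / `ComponentIdentity` lookup; all names and sample IDs are illustrative:

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;
import java.util.UUID;

// Sketch of the suggested buildPedigreeMap rewrite: resolved IDs survive
// even when several bomRefs deduplicate to one persisted component UUID.
public final class PedigreeMapDemo {

    public static Map<UUID, Set<String>> buildPedigreeMap(
            final Map<String, Set<String>> resolvedVulnIdsByBomRef,
            final Map<String, UUID> componentUuidByBomRef) {
        final Map<UUID, Set<String>> result = new HashMap<>();
        for (final Map.Entry<String, Set<String>> entry : resolvedVulnIdsByBomRef.entrySet()) {
            final UUID uuid = componentUuidByBomRef.get(entry.getKey());
            if (uuid == null || entry.getValue().isEmpty()) {
                continue; // ref was never persisted, or nothing was resolved for it
            }
            // Merge rather than put, so refs sharing a UUID accumulate IDs.
            result.computeIfAbsent(uuid, ignored -> new HashSet<>())
                    .addAll(entry.getValue());
        }
        return result;
    }
}
```

Two refs mapping to the same UUID now yield one entry containing the union of both ID sets, which is exactly the deduplication scenario the review comment is concerned about.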
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@src/main/java/org/dependencytrack/tasks/BomUploadProcessingTask.java`:
- Around line 866-885: The current buildPedigreeMap method loses resolved IDs
when multiple CycloneDX bomRef values deduplicate to one persisted Component;
change buildPedigreeMap to iterate over
CycloneDXPedigreeImporter.extractResolvedVulnIdsByBomRef(cdxBom).entrySet(), for
each entry look up the persisted Component UUID via the identitiesByBomRef
mapping (the dedup map created during processing) rather than using
component.getBomRef(), and merge the resolved ID sets into a single Set<String>
per UUID (use result.computeIfAbsent(uuid, ...) and addAll) so all resolved IDs
from different bomRefs that map to the same persisted component are preserved;
also update the caller that currently supplies processedComponents to pass or
make available the identitiesByBomRef map so buildPedigreeMap can resolve
bomRef→UUID.
🚧 Files skipped from review as they are similar to previous changes (2)
- .envrc
- src/main/java/org/dependencytrack/event/BomPedigreeAnalysisEvent.java