Implement image optimization modes across runtime targets #143
Conversation
Caution: Review failed. Pull request was closed or merged during review.

📝 Walkthrough

The PR introduces image optimization capabilities with a new …

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    actor User
    participant CLI
    participant Core
    participant Codec
    participant Output
    User->>CLI: optimize input.jpg --mode lossy --target-quality ssim:0.98
    CLI->>CLI: parse_args → optimize_from_clap
    CLI->>Core: validate TransformOptions (optimize, targetQuality)
    Core->>Core: normalize() enforces compatibility<br/>format supports optimization
    Core->>Codec: transform_raster(options)
    alt Lossless Mode
        Codec->>Codec: try_passthrough_lossless_optimization
        alt Passthrough eligible
            Codec->>Output: return early (no decode/encode)
        else Requires processing
            Codec->>Codec: decode image
            Codec->>Codec: encode_output(None) → lossless
        end
    else Lossy Mode
        Codec->>Codec: decode image
        Codec->>Codec: encode_output(Lossy) → baseline encode
        Codec->>Codec: measure quality (SSIM/PSNR)
        loop binary search toward target_quality
            Codec->>Codec: adjust quality parameter
            Codec->>Codec: re-encode and measure
        end
        Codec->>Codec: inject metadata (EXIF/ICC policy)
    else Auto Mode
        Codec->>Codec: encode_output(Auto) → decide<br/>lossless vs lossy based on format
        Codec->>Codec: compare sizes, apply heuristics
    end
    Codec->>Output: EncodedOutput (bytes, metadata)
    Output->>User: optimized image file
```
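The lossy-mode loop in the diagram can be sketched as a binary search over the encoder quality knob. This is an illustrative sketch only; `search_quality`, `encode_at`, and `measure_ssim` are assumed stand-ins for the real codec entry points, not the PR's actual implementation:

```rust
// Illustrative sketch of "binary search toward target_quality".
// `encode_at` and `measure_ssim` are hypothetical stand-ins for the
// real encode and metric functions.
fn search_quality<E, M>(target_ssim: f64, encode_at: E, measure_ssim: M) -> Option<u8>
where
    E: Fn(u8) -> Vec<u8>,
    M: Fn(&[u8]) -> f64,
{
    let (mut lo, mut hi) = (1u8, 100u8);
    let mut best = None;
    while lo <= hi {
        let mid = lo + (hi - lo) / 2;
        let bytes = encode_at(mid);
        if measure_ssim(&bytes) >= target_ssim {
            // Target met: remember this quality, then try a smaller one.
            best = Some(mid);
            if mid == 1 { break; }
            hi = mid - 1;
        } else {
            // Target missed: a higher quality is needed.
            if mid == 100 { break; }
            lo = mid + 1;
        }
    }
    best
}

fn main() {
    // Fake monotone metric: higher quality gives a higher "SSIM".
    let ssim_of = |bytes: &[u8]| bytes.len() as f64 / 100.0;
    let encode = |q: u8| vec![0u8; q as usize];
    // Smallest quality whose fake SSIM (= q/100) reaches 0.98 is 98.
    assert_eq!(search_quality(0.98, encode, ssim_of), Some(98));
    println!("ok");
}
```

Returning `None` when no quality reaches the target is what lets the caller decide between a hard error and a best-effort fallback, the trade-off debated later in this review.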
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes

Possibly related PRs
🚥 Pre-merge checks: ✅ Passed checks (5 passed)
Actionable comments posted: 5
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
src/codecs/raster.rs (1)
259-265: ⚠️ Potential issue | 🟡 Minor

Emit WebP metadata-drop warnings only for metadata that existed.
Line 263 and Line 264 currently push both EXIF and ICC warnings whenever retained metadata is non-empty. This can report ICC dropped even when ICC was never present/requested.
🧭 Proposed fix
```diff
 if normalized.options.format == MediaType::Webp && encoded.used_lossy_webp
-    && retained_metadata.as_ref().is_some_and(|m| !m.is_empty()) {
-    warnings.push(TransformWarning::MetadataDropped(MetadataKind::Exif));
-    warnings.push(TransformWarning::MetadataDropped(MetadataKind::Icc));
+{
+    if let Some(metadata) = retained_metadata.as_ref() {
+        if metadata.exif_metadata.is_some() {
+            warnings.push(TransformWarning::MetadataDropped(MetadataKind::Exif));
+        }
+        if metadata.icc_profile.is_some() {
+            warnings.push(TransformWarning::MetadataDropped(MetadataKind::Icc));
+        }
+    }
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/codecs/raster.rs` around lines 259 - 265: the current block always pushes both EXIF and ICC metadata-drop warnings when retained_metadata is non-empty; update the logic to only push TransformWarning::MetadataDropped(MetadataKind::Exif) if retained_metadata actually contains EXIF, and only push TransformWarning::MetadataDropped(MetadataKind::Icc) if retained_metadata actually contains ICC. Locate the condition using normalized.options.format, encoded.used_lossy_webp, retained_metadata, and warnings and change the two unconditional pushes to per-kind checks against retained_metadata (e.g., test for presence of EXIF and ICC in the retained_metadata before pushing the corresponding warnings).

web/app.js (1)
337-347: ⚠️ Potential issue | 🟡 Minor

Reset the previous size comparison before each new run.
After one successful transform, a later failure keeps the old before/after text on screen because only the success path updates `#size-comparison`. That makes the new metric look like it belongs to the failed run.

♻️ Proposed fix
```diff
 async function runTransform() {
   clearMessages();
   if (!state.inputBytes || !state.inputArtifact) {
     showError("Choose an image before running the transform.");
     return;
   }
   const outputFormat = elements.format.value;
   if (state.inputArtifact.mediaType === "svg" && !state.capabilities.svg) {
     showError("This browser build does not include SVG processing.");
     return;
   }
   if (outputFormat === "svg" && state.inputArtifact.mediaType !== "svg") {
     showError("SVG output is only available when the input is already SVG.");
     return;
   }
   const options = collectOptions();
+  elements.sizeComparison.textContent = "No size comparison yet.";
   setBusy(true);
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@web/app.js` around lines 337 - 347, The old size comparison remains visible after a failed transform because only the success path updates the `#size-comparison`; to fix it, clear or reset the size comparison at the start of a new run and on error by calling renderSizeComparison(0,0) (or explicitly clearing the element) — add that call before showError in the catch block and/or at the beginning of the transform flow so renderSizeComparison is always reset when a new run begins (refer to the renderSizeComparison function and the catch block where parseWasmError, showError, and setStatus are used).
🧹 Nitpick comments (5)
src/adapters/server/cache.rs (1)
539-556: Add cache-key regression tests for `optimize` and `targetQuality`.
These new fields now affect cache identity (Line 539 and Line 553). Targeted tests here would guard against future regressions where either field is accidentally dropped from canonicalization.
✅ Suggested test additions
```diff
+    #[test]
+    fn cache_key_differs_by_optimize_mode() {
+        let base = TransformOptions::default();
+        let auto = TransformOptions {
+            optimize: crate::OptimizeMode::Auto,
+            ..TransformOptions::default()
+        };
+        assert_ne!(
+            compute_cache_key("img.png", &base, None, None),
+            compute_cache_key("img.png", &auto, None, None)
+        );
+    }
+
+    #[test]
+    fn cache_key_differs_by_target_quality() {
+        let a = TransformOptions {
+            optimize: crate::OptimizeMode::Lossy,
+            target_quality: Some(crate::TargetQuality {
+                metric: crate::QualityMetric::Ssim,
+                value: 0.98,
+            }),
+            ..TransformOptions::default()
+        };
+        let b = TransformOptions {
+            optimize: crate::OptimizeMode::Lossy,
+            target_quality: Some(crate::TargetQuality {
+                metric: crate::QualityMetric::Ssim,
+                value: 0.99,
+            }),
+            ..TransformOptions::default()
+        };
+        assert_ne!(
+            compute_cache_key("img.png", &a, None, None),
+            compute_cache_key("img.png", &b, None, None)
+        );
+    }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/adapters/server/cache.rs` around lines 539 - 556: add unit tests that assert the cache-key canonicalization includes the new fields introduced around the canonical construction (the canonical String built with push_param) by exercising options.optimize and options.target_quality (and options.targetQuality variant) so future changes won't drop them; specifically, create tests that call the cache-key generation path (the function that builds the canonical string using the canonical variable and push_param) with (a) optimize set to a non-None OptimizeMode and (b) target_quality set to Some(value), then assert the produced canonical/cache key contains the corresponding "optimize" and "targetQuality" parameter substrings and that differing values produce different keys.

tests/server_transform_basic.rs (1)
37-59: Add a JSON-body regression for `targetQuality`.
This request only proves the new `optimize` field survives JSON parsing. `targetQuality` is a separate camelCase field and parser path, so one end-to-end case with something like `"targetQuality":"ssim:0.98"` would close the remaining server-side gap here.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/server_transform_basic.rs` around lines 37 - 59: extend the existing test serve_once_accepts_optimize_options_in_json_body to include a camelCase targetQuality field in the JSON body (e.g. "targetQuality":"ssim:0.98") so the server's parser path for targetQuality is exercised; call the same send_transform_request with that JSON payload (keeping ServerConfig and auth the same), then after split_response/sniff_artifact verify the request still succeeds (HTTP 200) and that the transformed artifact and content_type are as expected—this ensures targetQuality is correctly parsed and applied in the server handling code.

src/adapters/wasm.rs (1)
743-761: Add negative-case tests for new WASM options.
Please add explicit invalid-input tests (e.g., bad `optimize`, malformed `targetQuality`) to lock down failure behavior for the new parsing paths.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/adapters/wasm.rs` around lines 743 - 761: add negative-case unit tests for parse_wasm_options to assert it rejects invalid inputs: create tests that call parse_wasm_options with WasmTransformOptions where optimize is an invalid string (e.g., "fast" or "unknown") and where target_quality is malformed (e.g., "ssim:abc" or "invalid_format"), then assert the call returns Err (use expect_err or assert!(result.is_err())) and optionally check the error variant/message; reference the existing parse_wasm_options function, WasmTransformOptions struct, and OptimizeMode enum to locate where to add tests and what failure behavior to assert.

src/adapters/cli/mod.rs (1)
981-988: Consider collapsing duplicate convert/optimize execution arms.
Both match arms call the same executor with identical handling.
♻️ Proposed refactor
```diff
-        Ok(Command::Convert(command)) => match convert::execute_convert(command, stdin, stdout) {
-            Ok(()) => EXIT_SUCCESS,
-            Err(error) => write_error(stderr, error),
-        },
-        Ok(Command::Optimize(command)) => match convert::execute_convert(command, stdin, stdout) {
+        Ok(Command::Convert(command) | Command::Optimize(command)) => {
+            match convert::execute_convert(command, stdin, stdout) {
                 Ok(()) => EXIT_SUCCESS,
                 Err(error) => write_error(stderr, error),
-        },
+            }
+        }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/adapters/cli/mod.rs` around lines 981 - 988: the two match arms for Command::Convert and Command::Optimize duplicate the same call/handling; collapse them by matching both patterns in a single arm (e.g. pattern OR: Command::Convert(cmd) | Command::Optimize(cmd)) and then call convert::execute_convert(cmd, stdin, stdout) with the same Ok => EXIT_SUCCESS and Err(error) => write_error(stderr, error) handling, keeping the existing symbols convert::execute_convert, write_error and EXIT_SUCCESS.

src/core.rs (1)
719-797: Add focused tests for the new optimize/target-quality validation matrix.
This is a critical option-validation path; adding table-driven tests here would harden behavior across mode/format/metric boundary cases.
Also applies to: 1421-1443
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/core.rs` around lines 719 - 797, Add table-driven unit tests covering the optimize/target_quality validation matrix: create test cases that vary OptimizeMode (None, Auto, Lossy, Lossless), MediaType outputs (jpeg, webp, avif, png, svg, etc.), whether target_quality and quality are Some/None, and optimize combinations to assert the expected TransformError or success from the validation path that uses validate_target_quality, format.supports_optimization(), format.supports_lossy_optimization(), and format.is_lossy(); include specific cases for (a) target_quality present with optimize None/Lossless (expect InvalidOptions), (b) target_quality present with non-lossy formats (expect InvalidOptions), (c) quality present with non-lossy formats or optimize=lossless (expect InvalidOptions), and (d) preserve_exif vs strip_metadata interactions and SVG format check—implement these as a table-driven loop so each tuple maps to a clear assertion against the validate/validation entry point used in core (the code around validate_target_quality / optimize checks).
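A table-driven shape for that validation matrix might look like the following sketch. `OptimizeMode`, the format names, and the `validate` function here are simplified stand-ins for illustration, not the crate's actual validator:

```rust
// Simplified stand-in validator: a quality target requires a lossy-capable
// format and a lossy-capable optimize mode. All names are illustrative.
#[derive(Clone, Copy, PartialEq, Debug)]
enum OptimizeMode { None, Auto, Lossless, Lossy }

fn is_lossy_capable(format: &str) -> bool {
    matches!(format, "jpeg" | "webp" | "avif")
}

fn validate(mode: OptimizeMode, format: &str, target_quality: Option<f64>) -> Result<(), String> {
    if target_quality.is_some() {
        if matches!(mode, OptimizeMode::None | OptimizeMode::Lossless) {
            return Err("InvalidOptions: targetQuality requires lossy/auto optimize".into());
        }
        if !is_lossy_capable(format) {
            return Err(format!("InvalidOptions: {format} cannot target a lossy quality"));
        }
    }
    Ok(())
}

fn main() {
    // One row per case: (mode, format, target_quality, expect_ok).
    let cases = [
        (OptimizeMode::Lossy, "jpeg", Some(0.98), true),
        (OptimizeMode::Auto, "webp", Some(0.95), true),
        (OptimizeMode::None, "jpeg", Some(0.98), false),
        (OptimizeMode::Lossless, "jpeg", Some(0.98), false),
        (OptimizeMode::Lossy, "png", Some(0.98), false),
        (OptimizeMode::Lossless, "png", None, true),
    ];
    for (mode, format, tq, expect_ok) in cases {
        assert_eq!(validate(mode, format, tq).is_ok(), expect_ok, "case {mode:?}/{format}");
    }
    println!("all cases pass");
}
```

The value of the table form is that each boundary case from the prompt above (target with None/Lossless, target with a non-lossy format) becomes one self-describing row.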
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@doc/openapi.yaml`:
- Around line 733-743: The TargetQuality query param schema (name:
targetQuality) allows non-numeric values like "ssim:abc"; update its pattern to
enforce numeric values (integers or decimals) after the metric prefix so runtime
parsing won't reject clients—e.g., replace the current pattern
'^(ssim|psnr):.+$' with a stricter regex such as '^(ssim|psnr):\d+(\.\d+)?$'
(apply the same change to the second occurrence of the TargetQuality pattern as
noted).
In `@README.md`:
- Around line 100-102: The README introduces the new CLI command "truss
optimize" but the Commands table later in the document doesn't list it; update
the Commands table to include an entry for "optimize" with a short description
(e.g., "Optimize images in-place or to output file"), usage example matching the
one shown ("truss optimize photo.jpg -o photo-optimized.jpg --mode auto"), and
any relevant flags (like -o/--mode) so the table and the earlier example are
consistent; locate the Commands table header and add a row for the optimize
command accordingly.
In `@src/adapters/cli/mod.rs`:
- Around line 523-524: ClapOptimizeArgs currently accepts any MediaType for the
format field; update the CLI to reject non-optimizable types by changing the
argument's parser from parse_media_type to a new validator (e.g.,
parse_optimizable_media_type) or by wrapping the field in an
OptimizableMediaType newtype and using its FromStr/validator; the validator
should parse into MediaType then check against the set of supported/optimizable
formats (using the same core logic or lookup used by the optimizer) and return
an error for unsupported types so optimize --format fails fast and matches the
help text.
- Around line 165-166: The CLI help and convert::optimize_from_clap expect
OptimizeMode::Auto but the shared option mapping in mod.rs uses
TransformOptions::default().optimize (which is OptimizeMode::None), causing
inconsistent defaults; update the shared mapping to explicitly use
OptimizeMode::Auto instead of TransformOptions::default().optimize (or change
TransformOptions::default() so its .optimize is OptimizeMode::Auto) so both code
paths consistently default to OptimizeMode::Auto; reference the symbols
optimize_from_clap, OptimizeMode::Auto, TransformOptions::default().optimize and
the shared option mapping in mod.rs to locate and fix the code.
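The `metric:value` strings exercised in these comments (`ssim:0.98`, the malformed `ssim:abc`) imply a small parser. Below is a hedged sketch with assumed names (`QualityMetric`, `TargetQuality`, `parse_target_quality`), including the per-metric range checks the OpenAPI comment asks for; the crate's real parser may differ:

```rust
// Hypothetical parser for "metric:value" targetQuality strings.
#[derive(Debug, PartialEq)]
enum QualityMetric { Ssim, Psnr }

#[derive(Debug, PartialEq)]
struct TargetQuality { metric: QualityMetric, value: f64 }

fn parse_target_quality(s: &str) -> Result<TargetQuality, String> {
    let (metric, value) = s
        .split_once(':')
        .ok_or_else(|| format!("expected metric:value, got {s:?}"))?;
    let metric = match metric {
        "ssim" => QualityMetric::Ssim,
        "psnr" => QualityMetric::Psnr,
        other => return Err(format!("unknown metric {other:?}")),
    };
    let value: f64 = value.parse().map_err(|_| format!("non-numeric value in {s:?}"))?;
    // Per-metric range checks, mirroring the suggested OpenAPI constraints:
    // ssim in (0, 1], psnr > 0.
    let in_range = match metric {
        QualityMetric::Ssim => value > 0.0 && value <= 1.0,
        QualityMetric::Psnr => value > 0.0,
    };
    if !in_range {
        return Err(format!("value out of range for {metric:?}: {value}"));
    }
    Ok(TargetQuality { metric, value })
}

fn main() {
    assert!(parse_target_quality("ssim:0.98").is_ok());
    assert!(parse_target_quality("ssim:abc").is_err()); // malformed case flagged above
    assert!(parse_target_quality("ssim:1.5").is_err()); // out of range for SSIM
    assert!(parse_target_quality("psnr:42").is_ok());
    println!("ok");
}
```

Rejecting out-of-range values at parse time keeps the schema pattern, the server validator, and the WASM/CLI parsers consistent, which is the thrust of several comments here.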
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: f94d6482-ce63-4d3b-80a7-7858eb37c821
📒 Files selected for processing (20)
- README.md
- doc/openapi.yaml
- docs/configuration.md
- integration/cli/spec/convert_spec.sh
- src/adapters/cli/convert.rs
- src/adapters/cli/mod.rs
- src/adapters/cli/sign.rs
- src/adapters/server/auth.rs
- src/adapters/server/cache.rs
- src/adapters/server/config.rs
- src/adapters/server/handler.rs
- src/adapters/server/mod.rs
- src/adapters/server/signing.rs
- src/adapters/wasm.rs
- src/codecs/raster.rs
- src/core.rs
- src/lib.rs
- tests/server_transform_basic.rs
- web/app.js
- web/index.html
Actionable comments posted: 7
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/core.rs (1)
753-797: ⚠️ Potential issue | 🟠 Major

Lossy optimize path still lacks ICC-preserving default metadata behavior.
Line 753-Line 797 validate optimize modes, but Line 805-Line 818 still derive the metadata policy solely from `strip_metadata`/`preserve_exif`. With defaults, `optimize=lossy` strips all metadata (including ICC), which conflicts with the PR objective to avoid stripping ICC unless explicitly requested.
Please add an optimize-aware metadata policy path (e.g., “strip non-essential, keep ICC”) and use it as the default for lossy optimize mode.
Also applies to: 805-818
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/core.rs` around lines 753 - 797, The metadata-policy derivation must become optimize-aware: in the block that currently derives the policy from strip_metadata and preserve_exif (the metadata policy derivation following the optimize checks), add a branch for when optimize == OptimizeMode::Lossy to set a policy like “strip non-essential, keep ICC” (e.g., set metadata_policy = PreserveICCStripOther). Implement this so that when optimize is Lossy and the caller did not explicitly request strip_metadata or preserve_exif, ICC profiles are preserved but EXIF/other non-essential metadata are stripped; keep existing behavior when preserve_exif is true or strip_metadata is explicitly set. Reference symbols to change: optimize / OptimizeMode::Lossy, strip_metadata, preserve_exif, and the metadata policy variable or function in the derivation block (currently at the spot after the optimize validation checks).
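One possible shape for the optimize-aware default this comment requests, sketched with assumed names (`MetadataPolicy`, `derive_metadata_policy`); the crate's real types and precedence rules may differ:

```rust
// Sketch of an optimize-aware metadata policy derivation. All names here
// are assumptions for illustration, not the crate's actual API.
#[derive(Debug, PartialEq)]
enum MetadataPolicy { KeepAll, StripAll, KeepIccStripOther }

fn derive_metadata_policy(
    lossy_optimize: bool,
    strip_metadata: bool,
    preserve_exif: bool,
) -> MetadataPolicy {
    if strip_metadata {
        // Explicit strip request always wins.
        MetadataPolicy::StripAll
    } else if preserve_exif {
        // Explicit preservation request keeps everything.
        MetadataPolicy::KeepAll
    } else if lossy_optimize {
        // Default for lossy optimize: drop non-essential metadata but keep
        // ICC so colors stay correct, as the review requests.
        MetadataPolicy::KeepIccStripOther
    } else {
        MetadataPolicy::KeepAll
    }
}

fn main() {
    assert_eq!(derive_metadata_policy(true, false, false), MetadataPolicy::KeepIccStripOther);
    assert_eq!(derive_metadata_policy(true, true, false), MetadataPolicy::StripAll);
    assert_eq!(derive_metadata_policy(true, false, true), MetadataPolicy::KeepAll);
    assert_eq!(derive_metadata_policy(false, false, false), MetadataPolicy::KeepAll);
    println!("ok");
}
```

The key property is that the ICC-keeping branch only fires when neither flag was set explicitly, so existing caller behavior is untouched.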
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@doc/openapi.yaml`:
- Around line 733-743: Update the TargetQuality (targetQuality) schema to
enforce per-metric ranges by replacing the single pattern with a oneOf that
defines two explicit options: one schema for ssim with a pattern that allows
decimal values >0 and <=1 (e.g. ssim pattern that rejects 0 and accepts 1 or
1.0), and another schema for psnr with a pattern that accepts numeric values >0;
also update the description to document these ranges (ssim in (0,1], psnr > 0)
so generated clients match server validation and consumers see the constraints.
In `@src/adapters/server/cache.rs`:
- Around line 553-556: When building the cache canonical key, ensure the
implicit default targetQuality is included when it would affect output: if
options.target_quality is Some push it as now, else if options.optimize is Auto
or Lossy and options.quality is None compute default_target_quality(media_type)
and push that into canonical (use the same push_param call) so
omitted-and-default and explicit-default produce the same hash; otherwise leave
it omitted. Reference symbols: options.target_quality, options.optimize,
options.quality, default_target_quality(media_type), push_param(&mut canonical,
"targetQuality", ...), canonical.
In `@src/codecs/raster.rs`:
- Around line 1537-1540: The WebP branch in the media encoder (MediaType::Webp
returning EncodedOutput with encode_webp_lossy_bytes) currently ignores retained
metadata flags (e.g., retained_metadata / strip_metadata / preserve_exif) and
may emit a lossy WebP that drops EXIF/ICC; update this branch to check the
retained_metadata flags and either (a) route to a metadata-preserving WebP
encoder path (or use the existing metadata-preserving encode path) when metadata
must be retained, or (b) return an error/reject the combination when lossy WebP
cannot preserve metadata; ensure you reference and respect retained_metadata,
strip_metadata and preserve_exif when deciding between encode_webp_lossy_bytes
and the metadata-preserving alternative (or error).
- Around line 1214-1241: The optimization helpers (e.g., encode_auto_output,
encode_lossy_optimized_output, encode_png_optimized and any loops they call)
perform multiple full encodes but do not re-check the request deadline during
their work; modify the call sites so the deadline/context is passed into these
helpers (or provide a closure/check function) and add periodic deadline checks
inside their optimization loops and candidate encoding/decoding steps, returning
a TransformError::DeadlineExceeded (or appropriate error variant) immediately
when the deadline is exceeded; also update transform_raster/encode_output call
sites to forward the deadline/context so the helpers can enforce it.
In `@tests/common/mod.rs`:
- Around line 73-74: The read_fixture_request function currently polls for
readiness because the accepted TcpStream stays non-blocking and lacks a
socket-level timeout; update read_fixture_request to call
stream.set_read_timeout(Some(Duration::from_secs(5))) (matching the pattern used
in src/adapters/server/mod.rs) before entering the deadline loop so reads will
block with a 5s timeout instead of spinning on WouldBlock; ensure you handle the
Result from set_read_timeout (propagate or unwrap in tests) and keep the
existing Instant deadline logic as a fallback.
In `@web/app.js`:
- Around line 370-371: When building the options object that includes
elements.optimizeMode.value and targetQuality (where targetQualityEnabled() and
emptyToNull are used), validate and normalize the optimize mode against the
chosen output codec/format and other flags before serializing: call the same
logic used in refreshOptimizeState() (or centralize it) to reject modes
unsupported by the selected codec (e.g., disallow "lossy" for PNG/BMP/TIFF/SVG,
disallow WebP "lossy" unless webpLossy is true, and disallow "lossless" when a
resize/crop/rotate transformation path is active), and fall back to a safe
default or null; update every place that serializes optimize:
elements.optimizeMode.value (including the blocks around the shown diff and the
other loci mentioned) to use this filtered value so invalid combinations never
get sent to the core.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 2abc1816-62a6-4843-817b-8236f362ae0c
📒 Files selected for processing (10)
- README.md
- doc/openapi.yaml
- src/adapters/cli/mod.rs
- src/adapters/server/cache.rs
- src/adapters/wasm.rs
- src/codecs/raster.rs
- src/core.rs
- tests/common/mod.rs
- tests/server_transform_basic.rs
- web/app.js
🚧 Files skipped from review as they are similar to previous changes (1)
- tests/server_transform_basic.rs
Actionable comments posted: 1
♻️ Duplicate comments (3)
src/codecs/raster.rs (3)
1548-1551: ⚠️ Potential issue | 🟠 Major

Lossy WebP path still ignores explicit metadata-retention requests.
Line 1549 always routes to `encode_webp_lossy_bytes(...)` and does not consult `retained_metadata`. This still returns success while dropping requested EXIF/ICC retention.

Suggested guard
```diff
 MediaType::Webp => Ok(EncodedOutput {
-    bytes: encode_webp_lossy_bytes(image, quality)?,
+    bytes: {
+        if retained_metadata.is_some_and(|m| {
+            m.exif_metadata.is_some() || m.icc_profile.is_some()
+        }) {
+            return Err(TransformError::CapabilityMissing(
+                "lossy WebP cannot preserve requested EXIF/ICC metadata".to_string(),
+            ));
+        }
+        encode_webp_lossy_bytes(image, quality)?
+    },
     used_lossy_webp: true,
 }),
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/codecs/raster.rs` around lines 1548 - 1551, The WebP branch always calls encode_webp_lossy_bytes and sets used_lossy_webp=true, which ignores the retained_metadata flag; update the MediaType::Webp arm to check the retained_metadata (the same flag used elsewhere) and if metadata (EXIF/ICC) must be preserved, route to the metadata-preserving path (e.g., use encode_webp_lossless_bytes or the existing lossless/metadata-preserving encoder) instead of encode_webp_lossy_bytes, and only set used_lossy_webp=true when actually using the lossy encoder; touch MediaType::Webp, EncodedOutput construction, encode_webp_lossy_bytes and the retained_metadata check to implement this guard.
1348-1358: ⚠️ Potential issue | 🟠 Major

`targetQuality` still succeeds even when the target is missed.
When no candidate reaches the requested metric, Lines 1351-1357 fall back to `max_quality` and return success. This makes `targetQuality` best-effort instead of enforceable.

Suggested fix
```diff
 if let Some(best) = best {
     Ok(best)
 } else {
-    encode_lossy_with_quality(
-        image,
-        media_type,
-        max_quality.max(1),
-        retained_metadata,
-        true,
-    )
+    Err(TransformError::InvalidOptions(format!(
+        "target quality {target} could not be met within quality cap {max_quality}"
+    )))
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/codecs/raster.rs` around lines 1348 - 1358, The current branch returns a lossy encode fallback when `best` is None, making `targetQuality` only best-effort; instead, change the else branch so that when no candidate meets the requested `targetQuality` you return a failure/Err (with a clear error variant/message indicating "target quality not met" and include the requested target and achieved `max_quality`), rather than calling `encode_lossy_with_quality`. Update the code around the `best` handling (the block that currently calls `encode_lossy_with_quality(image, media_type, max_quality.max(1), retained_metadata, true)`) to return an error result; keep the successful `Ok(best)` path unchanged.
1205-1223: ⚠️ Potential issue | 🟠 Major

Optimization loops still run without in-loop deadline checks.
The encode optimization paths do multiple full encode/decode iterations (for example Line 1323 onward and Line 1523 onward), but the deadline is only checked after `encode_output` returns (Line 255). Requests can exceed the deadline after doing all the expensive work.

Also applies to: 1321-1336, 1505-1528
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/codecs/raster.rs`:
- Around line 192-196: The early return from
try_passthrough_lossless_optimization(...) can bypass deadline enforcement
because deadline and start are set after that call; move the deadline
initialization (let deadline = normalized.options.deadline) and start timestamp
(let start = deadline.map(|_| Instant::now())) to before calling
try_passthrough_lossless_optimization, or otherwise ensure the passthrough path
checks/enforces the same deadline using normalized.options.deadline and
Instant::now() so the deadline logic always runs even when
try_passthrough_lossless_optimization returns early.
Close #139
Summary by CodeRabbit
New Features
`optimize` CLI command supporting auto, lossless, and lossy optimization modes with target quality control.

Documentation