Merged
38 changes: 24 additions & 14 deletions src/blob.rs
@@ -1,5 +1,6 @@
//! # Blob directory management.

use std::cmp::max;
use std::io::{Cursor, Seek};
use std::iter::FusedIterator;
use std::mem;
@@ -255,7 +256,7 @@ impl<'a> BlobObject<'a> {

/// Recode image to avatar size.
pub async fn recode_to_avatar_size(&mut self, context: &Context) -> Result<()> {
let (img_wh, max_bytes) =
let (max_wh, max_bytes) =
match MediaQuality::from_i32(context.get_config_int(Config::MediaQuality).await?)
.unwrap_or_default()
{
@@ -272,7 +273,7 @@ impl<'a> BlobObject<'a> {
let is_avatar = true;
self.check_or_recode_to_size(
context, None, // The name of an avatar doesn't matter
viewtype, img_wh, max_bytes, is_avatar,
viewtype, max_wh, max_bytes, is_avatar,
)?;

Ok(())
@@ -293,7 +294,7 @@ impl<'a> BlobObject<'a> {
name: Option<String>,
viewtype: &mut Viewtype,
) -> Result<String> {
let (img_wh, max_bytes) =
let (max_wh, max_bytes) =
match MediaQuality::from_i32(context.get_config_int(Config::MediaQuality).await?)
.unwrap_or_default()
{
@@ -304,13 +305,15 @@ impl<'a> BlobObject<'a> {
MediaQuality::Worse => (constants::WORSE_IMAGE_SIZE, constants::WORSE_IMAGE_BYTES),
};
let is_avatar = false;
self.check_or_recode_to_size(context, name, viewtype, img_wh, max_bytes, is_avatar)
self.check_or_recode_to_size(context, name, viewtype, max_wh, max_bytes, is_avatar)
}

/// Checks or recodes the image so that it fits into limits on width/height and byte size.
/// Checks or recodes the image so that it fits into limits on width/height and/or byte size.
///
/// If `!is_avatar`, then if `max_bytes` is exceeded, reduces the image to `img_wh` and proceeds
/// with the result without rechecking.
/// If `!is_avatar`, then if `max_bytes` is exceeded, reduces the image to `max_wh` and proceeds
/// with the result (even if `max_bytes` is still exceeded).
///
/// If `is_avatar`, the resolution will be reduced in a loop until the image fits `max_bytes`.
Comment on lines +313 to +316
Contributor

This seems to be an outdated explanation.

Suggested change
/// If `!is_avatar`, then if `max_bytes` is exceeded, reduces the image to `max_wh` and proceeds
/// with the result (even if `max_bytes` is still exceeded).
///
/// If `is_avatar`, the resolution will be reduced in a loop until the image fits `max_bytes`.
/// If `max_bytes` is exceeded,
/// the image will be re-encoded to `max_wh` or the original resolution, whichever is smaller.
/// If it then is still larger than `max_bytes` it will be reduced in resolution,
/// until it fits within `max_bytes`.
///
/// If `is_avatar`, then images with a higher resolution than `max_wh`,
/// and images which include exif-data, will also be re-encoded.

Collaborator Author

/// If it then is still larger than `max_bytes` it will be reduced in resolution, until it fits within `max_bytes`.

This is only true for avatars, though. If not is_avatar, then we'll break out of the loop after the first iteration.

If is_avatar, then images with a higher resolution than max_wh, and images which include exif-data, will also be re-encoded.

All images will be re-encoded if there is exif, not only avatars:

            if do_scale || exif.is_some() {
                [...]
                if encoded.is_empty() {
                    if mem::take(&mut add_white_bg) {
                        self::add_white_bg(&mut img);
                    }
                    encode_img(&img, ofmt, &mut encoded)?;
                }
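
To make the loop behavior concrete, here is a minimal, hypothetical sketch (made-up names, not the actual check_or_recode_to_size code) of the part we are discussing: avatars keep shrinking in a loop, while non-avatars take at most one pass.

// Sketch only: avatars shrink by the same 2/3 factor as in the diff until the
// encoded size fits max_bytes or the resolution drops below 20 px; non-avatars
// are encoded once and accepted as-is, mirroring the early break described above.
fn recode_sketch(
    mut target_wh: u32,
    max_bytes: usize,
    is_avatar: bool,
    encode_at: impl Fn(u32) -> Vec<u8>,
) -> anyhow::Result<Vec<u8>> {
    loop {
        let encoded = encode_at(target_wh);
        if encoded.len() <= max_bytes || !is_avatar {
            return Ok(encoded);
        }
        if target_wh < 20 {
            anyhow::bail!("Failed to scale image to below {max_bytes}B.");
        }
        target_wh = target_wh * 2 / 3;
    }
}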

Contributor
@72374 72374 Jan 25, 2026

This is only true for avatars, though. If not is_avatar, then we'll break out of the loop after the first iteration.

Indeed, I overlooked the && is_avatar after if encoded_img_exceeds_bytes.


I guess I'd better wait with further suggestions until I can get Delta Chat to build with a local core (I tried that for multiple hours, without success) and verify the changes I want to suggest; my incorrect suggestions were probably rather distracting. ^^'
(I also forgot I could have tried a few things with the unmodified version.)

But anyway, thank you (you and the others working on Delta Chat and/or Chatmail) for making and improving Delta Chat and Chatmail.
It works well for me and my family members so far (aside from my issues with the limited image quality),
and the setup with multiple relays for a more peer-to-peer-like network is finally something that is both independent and reliable enough that I can comfortably recommend a messenger to friends and family.

Collaborator Author
@Hocuri Hocuri Jan 26, 2026

i tried that for multiple hours, without success

What's the problem? And, are you trying to build DC Desktop, DC Android, or the REPL?

BTW, running the Rust tests locally should be just ~~cargo test~~ cargo nextest run, and that way you can already somewhat verify changes by writing a Rust test, as I did in the PR here.

But anyway, thank you (you and the others working on Delta Chat and/or Chatmail) for making and improving Delta Chat and Chatmail.

It's a pleasure 😊

Collaborator

BTW, running the Rust tests locally should be just cargo test

It is actually cargo nextest run; with cargo test, some tests fail randomly because they run in the same process.

Contributor

What's the problem? And, are you trying to build DC Desktop, DC Android, or the REPL?

DC Desktop.

After setting it up according to the documentation in README.md and UPDATE_CORE.md like this:

git clone --recurse-submodules https://github.com/chatmail/core
git clone --recurse-submodules https://github.com/deltachat/deltachat-desktop
cd deltachat-desktop/
sudo npm i -g pnpm
pnpm install
pnpm -w build:electron
python ./bin/link_core/build_and_link_local_core.py ../core
pnpm -w dev:electron --allow-unsafe-core-replacement

When trying to start it, the following error happens instead (paths shortened at …):

Error [ERR_MODULE_NOT_FOUND]: Cannot find module '/home/…/Delta_Chat/core/deltachat-rpc-server/npm-package/node_modules/@deltachat/jsonrpc-client/dist/deltachat.js' imported from /home/…/Delta_Chat/core/deltachat-rpc-server/npm-package/index.js

The "dist"-folder does not exist.

Collaborator

Could you open an issue in https://github.com/deltachat/deltachat-desktop?
It looks like a documentation bug; some build step for the core is missing.

Contributor
@72374 72374 Jan 26, 2026

I was able to build and start Delta Chat Desktop after doing git checkout v1.58.2 && pnpm install && git checkout main && pnpm install. 1.58.2 is the last version before the upgrade to React 19.

A dependency has been unavailable since then. There is an issue report about forking it: deltachat/deltachat-desktop#5935

I opened an issue report there: deltachat/deltachat-desktop#5971

///
/// This modifies the blob object in-place.
///
@@ -323,7 +326,7 @@ impl<'a> BlobObject<'a> {
context: &Context,
name: Option<String>,
viewtype: &mut Viewtype,
mut img_wh: u32,
max_wh: u32,
max_bytes: usize,
is_avatar: bool,
) -> Result<String> {
@@ -385,7 +388,14 @@ impl<'a> BlobObject<'a> {
_ => img,
};

let exceeds_wh = img.width() > img_wh || img.height() > img_wh;
// max_wh is the maximum image width and height, i.e. the resolution limit.
// target_wh is the target resolution for resizing the image.
let exceeds_wh = img.width() > max_wh || img.height() > max_wh;
let mut target_wh = if exceeds_wh {
max_wh
} else {
max(img.width(), img.height())
};
let exceeds_max_bytes = nr_bytes > max_bytes as u64;

let jpeg_quality = 75;
@@ -438,9 +448,9 @@ impl<'a> BlobObject<'a> {
// usually has less pixels by cropping, UI that needs to wait anyways,
// and also benefits from slightly better (5%) encoding of Triangle-filtered images.
let new_img = if is_avatar {
img.resize(img_wh, img_wh, image::imageops::FilterType::Triangle)
img.resize(target_wh, target_wh, image::imageops::FilterType::Triangle)
} else {
img.thumbnail(img_wh, img_wh)
img.thumbnail(target_wh, target_wh)
};

if encoded_img_exceeds_bytes(
@@ -451,19 +461,19 @@ impl<'a> BlobObject<'a> {
&mut encoded,
)? && is_avatar
{
if img_wh < 20 {
if target_wh < 20 {
return Err(format_err!(
"Failed to scale image to below {max_bytes}B.",
));
}

img_wh = img_wh * 2 / 3;
target_wh = target_wh * 2 / 3;
Collaborator
@iequidoo iequidoo Jan 24, 2026

I'd also remove this line from here and add

if !encoded.is_empty() {
    target_wh = target_wh * 2 / 3;
}

to the beginning of the loop so that no extra iteration for avatars is done. It's a small optimization, but it may make the code clearer; otherwise one may wonder why we recode avatars to the same size again.

Collaborator Author

Not sure, this may easily lead to another issue where we unnecessarily reduce the target resolution.

In any case, I find the code block above hard to understand:

            let do_scale = exceeds_max_bytes
                || is_avatar
                    && (exceeds_wh
                        || exif.is_some() && {
                            if mem::take(&mut add_white_bg) {
                                self::add_white_bg(&mut img);
                            }
                            encoded_img_exceeds_bytes(
                                context,
                                &img,
                                ofmt.clone(),
                                max_bytes,
                                &mut encoded,
                            )?
                        });

So... if the avatar does not exceed max_bytes or the width/height limit, but it has some Exif data, then we directly do an iteration of adding a white background and recoding, and check the resulting byte size. If the image is too big now (note that it wasn't too big before), then do_scale is true and we reduce the resolution.

So, I guess that the code block is there to make really sure that the avatar file can't be too big in the end.

Which is good, of course; it's just hard to understand, and I am wary of adding code that depends on its behavior.
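
To spell out the short-circuiting, here is a minimal standalone sketch of the same decision, with simplified names; check_encoded_size is a made-up stand-in for encoded_img_exceeds_bytes plus the white-background side effect, and this is not the actual core code:

// Sketch only: mirrors the structure of the do_scale expression quoted above.
fn should_scale(
    exceeds_max_bytes: bool,
    is_avatar: bool,
    exceeds_wh: bool,
    has_exif: bool,
    mut check_encoded_size: impl FnMut() -> bool,
) -> bool {
    exceeds_max_bytes
        || is_avatar
            && (exceeds_wh
                // Because of the `&&`, the (expensive) encoding check only runs
                // for avatars that carry Exif but are otherwise within limits,
                // and the result is true only if that check reports the encoded
                // image still exceeds max_bytes.
                || has_exif && check_encoded_size())
}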

Contributor
@72374 72374 Jan 24, 2026

Shouldn't the || exif.is_some() && { there be || exif.is_some() || {, so that it checks whether any of the three is true?
The way I understand it, avatar images that are not larger than max_bytes and max_wh, but include Exif data, would only be re-encoded if they also do not fit within the size limit after being encoded with a white background.
Usually the purpose of specifically re-encoding images with Exif data is to remove metadata (though it can also be removed from the image file without re-encoding the image).

Collaborator
@iequidoo iequidoo Jan 24, 2026

[...] || exif.is_some() || { [...]

If it's an avatar without Exif, it should only be recoded if it exceeds any limits. The white background is also not needed when not recoding an avatar; it is a workaround for transparent avatars recoded to JPEG, because otherwise the image crate adds a black one.
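
For illustration, a minimal sketch of such a workaround, flattening a transparent image onto white before JPEG encoding; the function name and structure are illustrative, not the actual add_white_bg implementation:

use image::{DynamicImage, Rgba, RgbaImage};

// Sketch only: JPEG has no alpha channel, so without this step transparent
// areas would be composited onto black. Blending onto a white canvas first
// keeps transparent avatar backgrounds white.
fn flatten_onto_white(img: &DynamicImage) -> DynamicImage {
    let rgba = img.to_rgba8();
    let (width, height) = rgba.dimensions();
    let mut canvas = RgbaImage::from_pixel(width, height, Rgba([255, 255, 255, 255]));
    image::imageops::overlay(&mut canvas, &rgba, 0, 0); // alpha-blends the source over white
    DynamicImage::ImageRgba8(canvas)
}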

Contributor

That would still be the same after the change. I opened a PR with a short explanation for the change there: #7772

Collaborator

Which is good, of course; it's just hard to understand, and I am wary of adding code that depends on its behavior.

Finally, I think it's fine to move the target_wh reduction above; if it's under !encoded.is_empty(), nothing will depend on the complex logic above: it is plainly "if we've already tried to recode and are still inside the loop, we need to try a smaller size". Anyway, this is minor.

Contributor
@72374 72374 Jan 25, 2026

Seems correct.
A comment could make it more understandable why that is done; then it would be fine to have this optimisation, I think.

Something like this, before the beginning of the loop, as it only needs to be checked once:

if do_scale {
    if is_avatar && !encoded.is_empty() {
        // If this happens, the image was already encoded at the original resolution,
        // for the file-size-check for an avatar with an added white background,
        // and did not fit within the file-size-limit after that.
        // Skip encoding it again at the original resolution.
        target_wh = target_wh * 2 / 3;
    }
    loop { …

Which is similar to what you suggested in the previous PR, I think.


Actually, not having this can result in quality reduction, because when the file-size check happens, it does not only check the file size, it also changes the image that will be used, which, depending on the encoder, can result in encoding the image again with JPEG quality 75 (or not, if the encoder detects that it is the same format and quality and skips re-encoding).

} else {
info!(
context,
"Final scaled-down image size: {}B ({}px).",
encoded.len(),
img_wh
target_wh
);
break;
}
53 changes: 53 additions & 0 deletions src/blob/blob_tests.rs
@@ -798,3 +798,56 @@ async fn test_create_and_deduplicate_from_bytes() -> Result<()> {

Ok(())
}

/// Tests that an image that already fits into the width limit,
/// but not the bytes limit,
/// is compressed without changing the resolution.
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn test_recode_without_downscaling() -> Result<()> {
let t = &TestContext::new().await;

let image = include_bytes!("../../test-data/image/screenshot120x120.jpg");
const { assert!(120 < constants::WORSE_AVATAR_SIZE) };

for is_avatar in [true, false] {
let mut blob =
BlobObject::create_and_deduplicate_from_bytes(t, image, "image.jpg").unwrap();
let image_path = blob.to_abs_path();
check_image_size(&image_path, 120, 120);

assert!(
fs::metadata(&image_path).await.unwrap().len() > constants::WORSE_AVATAR_BYTES as u64
);

// Repeat the check, because a second call to `check_or_recode_to_size()`
// is not supposed to change anything:
let mut imgs = vec![];
for _ in 0..2 {
let mut viewtype = Viewtype::Image;
let new_name = blob.check_or_recode_to_size(
t,
Some("image.jpg".to_string()),
&mut viewtype,
constants::WORSE_AVATAR_SIZE,
constants::WORSE_AVATAR_BYTES,
is_avatar,
)?;
let image_path = blob.to_abs_path();
assert_eq!(new_name, "image.jpg"); // The name shall not have changed
assert_eq!(viewtype, Viewtype::Image); // The viewtype shall not have changed
let img = check_image_size(&image_path, 120, 120); // The resolution shall not have changed
imgs.push(img);

let new_image_bytes = fs::metadata(&image_path).await.unwrap().len();
assert!(
new_image_bytes < constants::WORSE_AVATAR_BYTES as u64,
"The new image size, {new_image_bytes}, should be lower than {}, is_avatar={is_avatar}",
constants::WORSE_AVATAR_BYTES
);
}

assert_eq!(imgs[0], imgs[1]);
}

Ok(())
}
Binary file added test-data/image/screenshot120x120.jpg