
Conversation

@faaany (Contributor) commented Apr 14, 2025

What does this PR do?

Since no difference can be seen between the images generated with single and batched inputs, I adjusted the expected maximum difference from 5e-4 to 5e-3 so the test passes on both XPU and CUDA.

pls help review it. Thanks! @hlky
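For context, here is a minimal sketch of the kind of change described above, assuming the pipeline test overrides `test_inference_batch_single_identical` from diffusers' `PipelineTesterMixin`. The class name and import path below are placeholders for illustration, not the actual diff:

```python
import unittest

# PipelineTesterMixin lives in diffusers' test utilities
# (tests/pipelines/test_pipelines_common.py); the import path and class
# name below are placeholders.
from test_pipelines_common import PipelineTesterMixin


class MyPipelineFastTests(PipelineTesterMixin, unittest.TestCase):
    # ... pipeline-specific setup omitted ...

    def test_inference_batch_single_identical(self):
        # Relax the tolerance from 5e-4 to 5e-3 so the comparison of
        # single-input vs. batched outputs passes on both XPU and CUDA.
        super().test_inference_batch_single_identical(expected_max_diff=5e-3)
```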

@hlky (Contributor) left a comment


Thanks @faaany. Same comment as in #11308 (review), but I will add that having an expected_max_diff per backend is certainly not a priority, and the tolerances mentioned in that comment are not the best example of when it should be one. Essentially, we wouldn't want to increase expected_max_diff to the point where changes affecting precision become undetectable. However, few such changes have been required, and only for specific tests (test_inference_batch_single_identical in this case), so a real regression would likely be picked up by another test, and we generally confirm outputs are visually the same when making any changes to a pipeline/model anyway.
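If per-backend tolerances were ever wanted, one hypothetical way to express them is a small helper that picks expected_max_diff by accelerator type. Everything below (the helper name and the example values) is an illustration, not an existing diffusers API:

```python
import torch

# Hypothetical helper: pick a tolerance per accelerator backend.
# The function name and values are illustrative only.
def expected_max_diff_for_backend(
    default: float = 5e-4, xpu: float = 5e-3, cuda: float = 5e-4
) -> float:
    # torch.xpu is only present in recent PyTorch builds, hence the hasattr guard.
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return xpu
    if torch.cuda.is_available():
        return cuda
    return default

# Usage inside a test override:
# super().test_inference_batch_single_identical(
#     expected_max_diff=expected_max_diff_for_backend()
# )
```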

@hlky hlky merged commit c7f2d23 into huggingface:main Apr 14, 2025
8 checks passed