Fix prepare latent image ids and vae sample generators for flux #9981
Conversation
thanks!
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Wow, changing a randomly sampled latent (as done on every CI test run so far) to a properly seeded sampled latent breaks the test 😭
@a-r-r-o-w that's expected, no?
Oh sorry, I misunderstood/forgot what was going on. We did always seed the test before, but it was seeded globally. Now we pass a different seed/generator into the pipeline, so yes, this is expected.
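The distinction above (global seeding vs. a generator passed into the pipeline) can be sketched with a pure-Python stand-in for torch's RNGs; the names here are illustrative, not the diffusers API:

```python
import random

# Global seeding: every consumer of the module-level RNG is affected,
# analogous to torch.manual_seed(0) seeding the global torch RNG.
random.seed(0)
global_sample = [random.random() for _ in range(3)]

# Explicit generator: an isolated RNG instance handed to the call,
# analogous to pipe(..., generator=torch.Generator().manual_seed(0)).
gen = random.Random(0)
passed_sample = [gen.random() for _ in range(3)]

# Both approaches are reproducible. In this toy stand-in the two streams
# happen to coincide, but in a real pipeline the global RNG and a passed
# generator are consumed at different points, so the sampled latents (and
# hence the test's expected slice) change.
print(global_sample == passed_sample)
```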
  if latents is not None:
-     latent_image_ids = self._prepare_latent_image_ids(batch_size, height, width, device, dtype)
+     latent_image_ids = self._prepare_latent_image_ids(batch_size, height // 2, width // 2, device, dtype)
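The `// 2` in the fix above can be motivated with a small shape sketch (hypothetical helper name, not copied from diffusers): Flux packs latents into 2x2 patches, so the position-id grid must use the packed, halved spatial dimensions.

```python
def prepare_latent_image_ids_shape(height: int, width: int) -> tuple:
    # One position id per packed 2x2 patch, hence the halved dimensions.
    return height // 2, width // 2

# User-supplied latents with latent-space size 32x32 pack to a 16x16 id
# grid, matching the shape the pipeline produces when it samples latents
# itself. Without the fix, the user-latents branch built a 32x32 grid
# and the shapes disagreed downstream.
print(prepare_latent_image_ids_shape(32, 32))  # (16, 16)
```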
I can recollect two PRs here:
Gently pinging @DN6
Oh, do you mean I am reverting a previous change that was necessary for something? I have been trying to pass latents for testing purposes, but it always fails at this point unless the height and width are correctly adjusted. It makes sense to me when comparing with other pipelines and doing a rough calculation with the tensor shapes, but let me know if this is wrong.
No this is fine. We missed scaling the height and width when passing in latents. We do it here when latents aren't passed. Change is good 👍🏽
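The point made above (scaling is already done when latents aren't passed, and was missing when they are) can be sketched as a minimal control-flow stand-in; the function and shapes here are hypothetical, not the diffusers source:

```python
def prepare_latents_sketch(batch_size, height, width, latents=None):
    # Both branches now derive the id grid from the packed dimensions.
    packed_h, packed_w = height // 2, width // 2
    ids_shape = (packed_h * packed_w, 3)  # one 3-component id per patch

    if latents is None:
        # Pipeline samples its own latents; this branch already used the
        # scaled dims before the fix.
        latents = [[0.0] * packed_w for _ in range(packed_h)]
    # After the fix, user-passed latents go through the same scaled dims
    # instead of building an oversized (height * width) id grid.
    return latents, ids_shape

_, ids_shape = prepare_latents_sketch(1, 32, 32)
print(ids_shape)  # (256, 3): 16 * 16 packed patches
```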
latent_image_ids = self._prepare_latent_image_ids(batch_size, height // 2, width // 2, device, dtype)
Oh okay. Then I'll update the slices and merge once CI is green
I think we only need to update the slice for
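The "update the slice" step refers to the common expected-slice test pattern: a few hard-coded output values are compared against the pipeline's output, so changing how latents are sampled forces those hard-coded numbers to be regenerated. A generic sketch (not the exact diffusers test):

```python
def check_slice(output, expected_slice, tol=1e-4):
    # Compare the first few output values against the recorded slice.
    flat = output[: len(expected_slice)]
    return all(abs(a - b) < tol for a, b in zip(flat, expected_slice))

# With the new per-call generator the produced values change, so the
# expected slice recorded in the test must be updated to match.
old_expected = [0.5, 0.5, 0.5]          # stale values from the old RNG path
new_output = [0.4932, 0.5107, 0.4988]   # illustrative numbers only
print(check_slice(new_output, old_expected))  # False until slice is updated
```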
…ingface#9981)
* fix
* update expected slice
No description provided.