Description
Is your feature request related to a problem? Please describe.
SimpleTuner and Kohya's trainers allow T5 attention-masked training; however, this is not currently supported natively in diffusers.
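For context, the mask in question is the padding mask the T5 tokenizer already produces. A minimal sketch of where it comes from (the model id is illustrative; attention-masked training also carries this mask into the diffusion transformer's attention, which is the unsupported part):

```python
import torch
from transformers import T5EncoderModel, T5TokenizerFast

# Illustrative model id; Flux uses a T5-XXL text encoder.
tokenizer = T5TokenizerFast.from_pretrained("google/t5-v1_1-xxl")
text_encoder = T5EncoderModel.from_pretrained("google/t5-v1_1-xxl")

inputs = tokenizer(
    "a photo of a cat",
    padding="max_length",
    max_length=512,
    truncation=True,
    return_tensors="pt",
)
with torch.no_grad():
    prompt_embeds = text_encoder(
        input_ids=inputs.input_ids,
        # 1 for real tokens, 0 for padding. Masked training reuses this mask
        # downstream in the transformer's joint attention as well.
        attention_mask=inputs.attention_mask,
    ).last_hidden_state
```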
Describe the solution you'd like.
This is already implemented and used in SimpleTuner and Kohya: https://github.com/bghira/SimpleTuner/blob/main/helpers/models/flux/transformer.py
Describe alternatives you've considered.
The recent implementation in #10122 doesn't really solve the use case of using existing fine-tunes that were trained with attention masking in diffusers.
Additional context.
@yiyixuxu @bghira @AmericanPresidentJimmyCarter
@bghira's suggestion: "i'd suggested they add encoder_attention_mask and image_attention_mask and if image_attention_mask is None that they could then 1-fill those positions and just cat them together"
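A minimal sketch of what that suggestion could look like (the function name `build_joint_attention_mask`, the shapes, and the bias expansion are assumptions for illustration, not an existing diffusers API):

```python
from typing import Optional

import torch

def build_joint_attention_mask(
    encoder_attention_mask: torch.Tensor,  # (batch, text_seq_len), 1 = attend
    image_seq_len: int,
    image_attention_mask: Optional[torch.Tensor] = None,  # (batch, image_seq_len)
) -> torch.Tensor:
    batch_size = encoder_attention_mask.shape[0]
    if image_attention_mask is None:
        # 1-fill the image positions so they are always attended to.
        image_attention_mask = torch.ones(
            batch_size,
            image_seq_len,
            dtype=encoder_attention_mask.dtype,
            device=encoder_attention_mask.device,
        )
    # Text tokens first, then image tokens, matching Flux's joint sequence order.
    return torch.cat([encoder_attention_mask, image_attention_mask], dim=1)

# Usage: expand the joint mask into an additive bias for scaled_dot_product_attention.
text_mask = torch.ones(1, 512, dtype=torch.long)
text_mask[:, 20:] = 0  # pretend everything after token 20 is padding
joint_mask = build_joint_attention_mask(text_mask, image_seq_len=4096)
attn_bias = torch.zeros(joint_mask.shape, dtype=torch.float32)
attn_bias = attn_bias.masked_fill(joint_mask == 0, float("-inf"))
attn_bias = attn_bias[:, None, None, :]  # broadcastable over heads and queries
```

With a default of `None` for both masks, pipelines could thread the tokenizer's attention mask through to the transformer without changing behavior for callers that pass no mask.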