Thank you for your excellent work. I noticed that your project defines:

```python
overall_guidance_attn_keys = [
    ("down", 1, 0, 0),
    ("down", 2, 0, 0),
    ("down", 2, 1, 0),
    ("up", 1, 0, 0),
    ("up", 1, 1, 0),
    ("up", 2, 2, 0),
]
```

Could you explain why these particular layers were chosen for the attention computation?