
[Diffusion][Feat] support LoKr #21214

Open
RuixiangMa wants to merge 2 commits into sgl-project:main from RuixiangMa:fixLoKr

Conversation

@RuixiangMa
Contributor

@RuixiangMa RuixiangMa commented Mar 23, 2026

Motivation

Add LoKr (Low-Rank Kronecker Product) adapter support for diffusion models in SGLang. Before this change, loading a LoKr-format adapter silently matched no layers:

**[03-23 20:38:54] Rank 0: LoRA adapter(s) Tongyi-MAI/z-image-turbo-flow-dpo applied to 0 layers (targets: all, strengths: 1.00)**

With this change, the same adapter is detected and applied:

**[03-23 23:30:49] Rank 0: LoRA adapter(s) Tongyi-MAI/z-image-turbo-flow-dpo applied to 180 layers (targets: all, strengths: 1.00)**
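For context, a LoKr adapter stores its update as a Kronecker product of two small factors rather than a single low-rank pair. A minimal sketch of the delta reconstruction (NumPy standing in for torch, with illustrative names that are not SGLang's actual API):

```python
import numpy as np

def lokr_delta(w1, w2=None, w2_a=None, w2_b=None, scale=1.0):
    """Rebuild a LoKr weight delta: delta_W = scale * kron(w1, w2).

    LoKr checkpoints may store the second factor directly (w2) or as a
    low-rank pair (w2_a @ w2_b). Names here are illustrative only.
    """
    if w2 is None:
        w2 = w2_a @ w2_b  # reconstruct the low-rank second factor
    return scale * np.kron(w1, w2)

# A (2x2) factor kron a (3x4) factor yields a (6x8) delta, which must
# match the shape of the base layer weight it is merged into.
delta = lokr_delta(np.ones((2, 2)), w2=np.ones((3, 4)))
assert delta.shape == (6, 8)
```

Because the delta is only well-defined as a full-shape product, it is natural to merge it into the base weight rather than keep it factored at inference time.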

Modifications

Accuracy Tests

Prompt 1:
A professional realistic photograph of a woman standing by a window, golden hour lighting, cinematic shadows, highly detailed, 8k resolution

Prompt 2:
woman, Asian ethnicity, white dress, looking away, long hair, outdoor setting, building facade, plants, serene expression, elegance, side profile, standing, daylight, soft focus, pastel colors, fashion, youthful, casual elegance, architectural elements, natural light, tassel detail on dress
  

Metric | NO LORA | LORA
Prompt 1 | (image: A_professional_realistic_photograph_of_a_woman_standing) | (image: A_professional_realistic_photograph_of_a_woman_standing_by_a_window_golden_)
Prompt 2 | (image: woman_Asian_ethnicity_white_dress_looking_away_long_hair_outdoor_) | (image: woman_Asian_ethnicity_white_dress_looking_away_long_hair_outdoor_setting_building_facade_plan_20260323-144623_fc61a82c)

Benchmarking and Profiling

Checklist

Review Process

  1. Ping Merge Oncalls to start the PR flow. See the PR Merge Process.
  2. Get approvals from CODEOWNERS and other reviewers.
  3. Trigger CI tests with comments or contact authorized users to do so.
    • /tag-run-ci-label, /rerun-failed-ci, /tag-and-rerun-ci
  4. After green CI and required approvals, ask Merge Oncalls to merge.

Signed-off-by: Lancer <maruixiang6688@gmail.com>
@gemini-code-assist
Contributor

Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request integrates support for LoKr (Low-Rank Kronecker Product) adapters into the SGLang framework, specifically for diffusion models. The primary goal is to enable the use of this efficient fine-tuning method, which differs from standard LoRA by performing permanent merges of weights. The changes include robust detection and conversion of LoKr state dictionaries, a new mechanism for applying these merged weights, and safeguards to prevent incompatible mixing of LoKr and traditional LoRA on model layers.

Highlights

  • LoKr Adapter Support: Introduced support for Low-Rank Kronecker Product (LoKr) adapters, allowing their application to diffusion models within SGLang.
  • Permanent Merges for LoKr: Implemented a mechanism where LoKr weight merges are permanent, directly modifying the base layer weights. Attempting to unmerge LoKr weights will now raise an error, requiring model reloading to restore original weights.
  • LoKr Format Detection and Conversion: Added logic to automatically detect the LoKr format from state dictionaries and convert it into a merged weight format suitable for direct application.
  • Prevention of Mixed LoRA Types: Enforced a rule preventing the simultaneous application of LoKr and standard LoRA adapters to the same layer to avoid conflicts and ensure predictable behavior.
  • New Weight Application Method: Added a dedicated apply_merged_weight method to directly apply pre-computed weight deltas, specifically designed for formats like LoKr that use Kronecker product decomposition.
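As a hedged sketch of what "format detection" can mean here (the PR's actual key names may differ): LyCORIS-style LoKr checkpoints name their factors with `lokr_w1` / `lokr_w2` suffixes (or `lokr_w2_a` / `lokr_w2_b` when the second factor is low-rank), whereas standard LoRA checkpoints use `lora_A` / `lora_B`:

```python
def looks_like_lokr(state_dict):
    """Heuristic LoKr detection by checkpoint key suffix.

    LyCORIS-style LoKr checkpoints store 'lokr_w1' and 'lokr_w2' (or
    'lokr_w2_a'/'lokr_w2_b') per target module; plain LoRA checkpoints
    use 'lora_A'/'lora_B' instead. Illustrative only; SGLang's actual
    detection logic may differ.
    """
    return any(".lokr_w1" in key or ".lokr_w2" in key for key in state_dict)

print(looks_like_lokr({"blocks.0.attn.lokr_w1": None}))        # True
print(looks_like_lokr({"blocks.0.attn.lora_A.weight": None}))  # False
```

A check like this is also what makes it possible to reject mixing LoKr and standard LoRA adapters on the same layer, as described above.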



Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces support for LoKr adapters in the diffusion model pipeline. The changes are well-structured, adding detection, conversion, and application logic for LoKr weights, including support for distributed environments. The implementation is robust, with checks for mixing adapter types and handling permanent merges. I have one suggestion to improve code clarity and follow PyTorch idioms more closely.
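To make the "permanent merge" behavior concrete, here is a minimal sketch (hypothetical names, NumPy standing in for torch tensors) of a layer that accepts a precomputed delta once and refuses to unmerge:

```python
import numpy as np

class MergedWeightLayer:
    """Sketch of the permanent-merge contract described in this PR.

    apply_merged_weight adds a precomputed delta (e.g. a LoKr Kronecker
    product) into the base weight in place; because the original values
    are overwritten, unmerge raises and the model must be reloaded to
    restore them. Names are hypothetical, not SGLang's actual API.
    """

    def __init__(self, weight):
        self.weight = weight
        self._merged = False

    def apply_merged_weight(self, delta):
        if self._merged:
            raise RuntimeError("a LoKr delta was already merged into this layer")
        if delta.shape != self.weight.shape:
            raise ValueError("delta must match the base weight shape")
        self.weight += delta  # in place: base weights are permanently changed
        self._merged = True

    def unmerge(self):
        raise NotImplementedError(
            "LoKr merges are permanent; reload the model to restore weights"
        )
```

The in-place update (rather than keeping a separate adapter branch) avoids any per-step overhead at inference time, at the cost of reversibility.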

Signed-off-by: Lancer <maruixiang6688@gmail.com>
@RuixiangMa
Contributor Author

/tag-and-rerun-ci

@RuixiangMa
Contributor Author

@mickqian PTAL


Labels

diffusion SGLang Diffusion lora

Projects

None yet

Development

Successfully merging this pull request may close these issues.

1 participant