
Conversation

zqiu24 (Contributor) commented Jan 21, 2026

Hi there,

This PR adds OFT support for Embedding layers (feat: Add OFT support for Embedding layers):

  • Implement an OFT Embedding layer that applies the orthogonal rotation to the embedding output
  • Add merge/unmerge support for the embedding weights
  • Override _available_adapters and reset_oft_parameters for embedding-specific handling
  • Tested with a training script; performance matches LoRA
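
To illustrate the idea in the bullets above, here is a minimal self-contained sketch of an output-rotated OFT embedding with merge/unmerge. This is not the actual PEFT implementation: the class and parameter names (`OFTEmbedding`, `oft_A`) are hypothetical, and a plain Cayley parameterization of a single orthogonal matrix stands in for PEFT's block-diagonal OFT structure.

```python
import torch
import torch.nn as nn

class OFTEmbedding(nn.Module):
    """Sketch: wrap a frozen nn.Embedding and rotate its output by an
    orthogonal matrix R, parameterized via the Cayley transform
        R = (I + Q)^{-1} (I - Q),  Q = A - A^T  (Q skew-symmetric).
    """
    def __init__(self, base: nn.Embedding):
        super().__init__()
        self.base = base
        d = base.embedding_dim
        # A = 0 gives Q = 0, hence R = I: the adapter starts as identity.
        self.oft_A = nn.Parameter(torch.zeros(d, d))
        self.merged = False

    def _rotation(self) -> torch.Tensor:
        Q = self.oft_A - self.oft_A.T                 # skew-symmetric
        I = torch.eye(Q.shape[0], device=Q.device, dtype=Q.dtype)
        return torch.linalg.solve(I + Q, I - Q)       # orthogonal R

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.base(x)
        if self.merged:
            return out                                # rotation baked into weights
        return out @ self._rotation()

    def merge(self) -> None:
        # Rotating each output row E[x] @ R equals indexing into W @ R,
        # so merging just right-multiplies the weight by R.
        if not self.merged:
            self.base.weight.data = self.base.weight.data @ self._rotation()
            self.merged = True

    def unmerge(self) -> None:
        # R is orthogonal, so R^{-1} = R^T recovers the original weights.
        if self.merged:
            self.base.weight.data = self.base.weight.data @ self._rotation().T
            self.merged = False
```

Because the rotation is orthogonal, merge followed by unmerge restores the base weights up to floating-point error, which is what makes the merge/unmerge round-trip in the PR safe.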

Hope for a smooth integration. Thank you so much. :D

Best,

