
Conversation


@arun477 arun477 commented Jan 27, 2024

    def reset_parameters(self):
        nn.Embedding.reset_parameters(self)
        if hasattr(self, 'lora_A'):
            # original comment: initialize A the same way as the default for nn.Linear and B to zero
            # but the code below does the opposite; lora_A should get the normal init and lora_B should be zeros
            nn.init.zeros_(self.lora_A)
            nn.init.normal_(self.lora_B)

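For comparison, here is a minimal sketch of the swapped initialization this issue suggests: `lora_A` drawn from a normal distribution and `lora_B` zeroed, so the low-rank update contributes nothing at the start of fine-tuning. The shapes, rank, and stand-alone tensors below are hypothetical stand-ins for the layer's actual `lora_A`/`lora_B` parameters, used only for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical shapes for illustration only (r = LoRA rank).
num_embeddings, embedding_dim, r = 1000, 64, 8

# Stand-in tensors for the layer's lora_A / lora_B parameters.
lora_A = nn.Parameter(torch.empty((r, num_embeddings)))
lora_B = nn.Parameter(torch.empty((embedding_dim, r)))

# Initialization as suggested in the comment above:
# A gets a normal init, B starts at zero, so the update B @ A is zero initially.
nn.init.normal_(lora_A)
nn.init.zeros_(lora_B)

# The low-rank update contributes nothing before training begins.
assert torch.all(lora_B @ lora_A == 0)
```

With either ordering the product `B @ A` starts at zero; the point raised here is that the code's behavior contradicts its own comment about matching the `nn.Linear` convention for A.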
@arun477 arun477 changed the title Initialization for LoRa weights A and B initialized in the wrong way. Initialization for LoRA weights A and B initialized in the wrong way. Jan 28, 2024