[refactor] Update Ln-norm logic for upcoming PyTorch update #206
```diff
@@ -122,7 +122,9 @@ def configure_qkv_out(self, q_name: str, k_name: str, v_name: str, out_name: str
         out.in_features = hp_hidden_dim

         assert isinstance(out, nn.Linear)
-        hp_hidden_dim.register_importance(lambda: out._parameters["weight"].detach().norm(dim=0))
+        hp_hidden_dim.register_importance(
+            lambda: torch.linalg.norm(out._parameters["weight"].detach(), dim=0)
+        )

     def modify(
         self, *, n_heads_ratio: tuple[float, ...] | None = None, n_heads_divisor: int = 1
```
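For context on why this is a drop-in change: both the old and new calls compute the column-wise L2 norm of the output projection's weight matrix, which the hook registers as a per-channel importance score, and PyTorch deprecates the legacy `Tensor.norm` entry point in favor of `torch.linalg.norm`. Below is a minimal standalone sketch verifying the equivalence; the `nn.Linear` instance here is illustrative, not the actual module from this PR.

```python
import torch
import torch.nn as nn

# Illustrative stand-in for the `out` projection; shapes chosen arbitrarily.
out = nn.Linear(in_features=128, out_features=64)
weight = out.weight.detach()  # shape (64, 128)

old_scores = weight.norm(dim=0)                # legacy Tensor.norm call
new_scores = torch.linalg.norm(weight, dim=0)  # torch.linalg replacement

# Both reduce over dim 0 with the default 2-norm, yielding one score
# per input feature (column), i.e. a tensor of shape (128,).
assert torch.allclose(old_scores, new_scores)
```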