Commit 8bcd073

committed
Update on "Add LoRA linear definition"

Add the LoRA linear definition. Pull the linears out of attention and allow a custom linear (e.g. a LoRA linear) to be passed in; if none is given, construct a plain linear (current behaviour). Differential Revision: [D73953776](https://our.internmc.facebook.com/intern/diff/D73953776/) [ghstack-poisoned]
2 parents: f40ca33 + c1d8f79

File tree

2 files changed: +1 −1 lines

examples/models/llama/attention.py

Lines changed: 1 addition & 0 deletions

@@ -6,6 +6,7 @@
 import torch.nn as nn
 import torch.nn.functional as F
 from executorch.examples.models.llama.model_args import ModelArgs
+from executorch.examples.models.llama.lora import LoRALinear
 from executorch.examples.models.llama.norm import RMSNorm
 from executorch.examples.models.llama.rope import Rope

examples/models/llama/llama_transformer.py

Lines changed: 0 additions & 1 deletion

@@ -18,7 +18,6 @@
     ForwardOptions,
 )

-from executorch.examples.models.llama.lora import LoRALinear
 from executorch.examples.models.llama.model_args import ModelArgs
 from executorch.examples.models.llama.norm import RMSNorm
 from executorch.examples.models.llama.rope import Rope
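The commit message describes the pattern these diffs support: attention no longer hard-codes its projection linears, and a custom linear such as a LoRA linear can be passed in, with a plain `nn.Linear` constructed when none is given. Below is a minimal, hypothetical sketch of that pattern — the class names `LoRALinear` and `Attention` match the imports above, but the constructor signatures and the `wq` parameter are illustrative assumptions, not the actual executorch definitions:

```python
from typing import Optional

import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Hypothetical LoRA linear: y = base(x) + scale * B(A(x)).

    A projects down to a small rank, B projects back up; B is
    zero-initialized so the adapter is a no-op before training.
    """

    def __init__(self, in_dim: int, out_dim: int, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = nn.Linear(in_dim, out_dim, bias=False)
        self.lora_a = nn.Linear(in_dim, rank, bias=False)
        self.lora_b = nn.Linear(rank, out_dim, bias=False)
        self.scale = alpha / rank
        nn.init.zeros_(self.lora_b.weight)  # adapter starts as identity update

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * self.lora_b(self.lora_a(x))


class Attention(nn.Module):
    """Sketch of an attention module that accepts an optional custom q-projection."""

    def __init__(self, dim: int, wq: Optional[nn.Module] = None):
        super().__init__()
        # If no custom linear is passed in, construct one (current behaviour).
        self.wq = wq if wq is not None else nn.Linear(dim, dim, bias=False)
```

Injecting the adapter then looks like `Attention(dim, wq=LoRALinear(dim, dim))`, while existing call sites that pass nothing keep the original behaviour.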
