Add a LoRA linear definition. Pull the linear projections out of attention and allow a custom linear (e.g., a LoRA linear) to be passed in. If none is provided, construct a standard linear (the current behaviour).
ghstack-source-id: 298411928
@exported-using-ghexport
Differential Revision: [D73953776](https://our.internmc.facebook.com/intern/diff/D73953776/)
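To make the description concrete, below is a minimal sketch of the pattern this change introduces, assuming a simplified single-head attention. The names `LoRALinear`, `Attention`, and `wq`/`wk`/`wv`/`wo` are illustrative assumptions, not the exact classes and parameters touched by this diff.

```python
from typing import Optional

import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """A base linear plus a trainable low-rank update: W x + (alpha / r) * B A x."""

    def __init__(self, in_dim: int, out_dim: int, rank: int, alpha: float = 1.0):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim, bias=False)
        self.lora_a = nn.Linear(in_dim, rank, bias=False)
        self.lora_b = nn.Linear(rank, out_dim, bias=False)
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.linear(x) + self.scale * self.lora_b(self.lora_a(x))


class Attention(nn.Module):
    """Attention whose q/k/v/o projections may be supplied by the caller."""

    def __init__(
        self,
        dim: int,
        wq: Optional[nn.Module] = None,
        wk: Optional[nn.Module] = None,
        wv: Optional[nn.Module] = None,
        wo: Optional[nn.Module] = None,
    ):
        super().__init__()
        # If no custom linear is passed in, construct a plain nn.Linear
        # (the current behaviour, which this change preserves).
        self.wq = wq if wq is not None else nn.Linear(dim, dim, bias=False)
        self.wk = wk if wk is not None else nn.Linear(dim, dim, bias=False)
        self.wv = wv if wv is not None else nn.Linear(dim, dim, bias=False)
        self.wo = wo if wo is not None else nn.Linear(dim, dim, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        q, k, v = self.wq(x), self.wk(x), self.wv(x)
        # Single-head scaled dot-product attention, kept minimal for the sketch.
        out = torch.nn.functional.scaled_dot_product_attention(q, k, v)
        return self.wo(out)


# Usage: LoRA on the q/v projections, default linears elsewhere.
dim, rank = 512, 8
attn = Attention(dim, wq=LoRALinear(dim, dim, rank), wv=LoRALinear(dim, dim, rank))
out = attn(torch.randn(2, 16, dim))
```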
help="checkpoint directory. Use with a sharded checkpoint, not for the standard llama2 model. Note, checkpoint_dir takes precedence over checkpoint if both are set.",
240
240
)
241
241
242
+
parser.add_argument(
243
+
"--adapter_checkpoint",
244
+
required=False,
245
+
help="Path to the adapter.pt file from torchtune. Used if the model has trained LoRA adapters. Must provide adapter_config.json",
246
+
)
247
+
248
+
parser.add_argument(
249
+
"--adapter_config",
250
+
required=False,
251
+
help="Path to the adapter_config.json file. Used if the model has trained LoRA adapters. Must provide adapter_checkpoint.",