Support llama3 training #9149
Conversation
Dr. CI: See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/9149. Note: links to docs will display an error until the docs builds have completed. ✅ No failures as of commit 579ec45 with merge base ea2029d. (This comment was automatically generated by Dr. CI and updates every 15 minutes.)
This pull request was exported from Phabricator. Differential Revision: D70977130
    if not isinstance(logits, list):
        labels = labels.reshape(-1)
        logits = logits.reshape(-1, logits.size(-1))
    return self.loss(logits, labels)
should we return the prediction too?
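The snippet under review flattens the batch and sequence dimensions before applying cross-entropy. A minimal sketch of the reviewer's suggestion, returning the predictions alongside the loss, might look like the following (the `LossWithPredictions` wrapper name and the shapes are illustrative assumptions, not code from this PR):

```python
import torch
import torch.nn as nn


class LossWithPredictions(nn.Module):
    # Hypothetical wrapper: flattens (batch, seq_len, vocab) logits to
    # (batch * seq_len, vocab) for cross-entropy, and also returns the
    # flattened logits so callers can inspect the model's predictions.
    def __init__(self):
        super().__init__()
        self.loss = nn.CrossEntropyLoss()

    def forward(self, logits, labels):
        if not isinstance(logits, list):
            labels = labels.reshape(-1)
            logits = logits.reshape(-1, logits.size(-1))
        return self.loss(logits, labels), logits


batch, seq_len, vocab = 2, 4, 8
logits = torch.randn(batch, seq_len, vocab)
labels = torch.randint(0, vocab, (batch, seq_len))
loss, preds = LossWithPredictions()(logits, labels)
# loss is a scalar; preds has shape (batch * seq_len, vocab)
```

Returning the logits as well would let a training loop compute accuracy or log sample generations without a second forward pass.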
Summary:
* Add support to fine-tune llama
* Add a config example to fine-tune llama 3.2 1B

Differential Revision: D70977130
Summary:
* Add support to fine-tune llama
* Add a config example to fine-tune llama 3.2 1B

Reviewed By: lucylq

Differential Revision: D70977130
Differential Revision: D70977130
Pull Request resolved: #9149