skip compiling opt step instead of erroring if opt_in_bwd=True #2827
felipemello1 wants to merge 6 commits into meta-pytorch:main
Conversation
This reverts commit 901723f.
✅ No failures as of commit 72d0b6f with merge base 2344509.
joecummings left a comment:
Why? This doesn't happen that far into the process, and it's a simple solution to just change compile optimizer step to false.
If anything, you could provide a better error message like: "If you would still like to compile the model and loss, you can specify everything in the config like blah blah blah."
felipemello1 replied: I was running Llama 70B. It takes several minutes to start (not sure why), and then it errored. So for me, it felt far into the process. If you don't like skipping and prefer it to just error, I could do this logic in the init instead.
Context
What is the purpose of this PR? Is it to
Changelog: as in the title
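The change under discussion can be sketched as follows. This is a hypothetical, minimal version of the guard (the function name `maybe_compile_optimizer_step` and its signature are illustrative, not torchtune's actual recipe code): when the optimizer step runs inside the backward pass (`optimizer_in_bwd=True`), there is no separate `optimizer.step()` call to compile, so instead of raising an error the recipe logs a warning and skips compilation.

```python
import logging

log = logging.getLogger(__name__)


def maybe_compile_optimizer_step(optimizer, *, compile_optimizer_step: bool,
                                 optimizer_in_bwd: bool):
    """Return the optimizer step callable, compiled only when it makes sense.

    Hypothetical sketch of the behavior this PR proposes: skip (with a
    warning) rather than error when both flags are set.
    """
    if not compile_optimizer_step:
        return optimizer.step
    if optimizer_in_bwd:
        # Previously this configuration raised an error; the PR changes it
        # to a warning plus a skip of the compile step.
        log.warning(
            "compile_optimizer_step=True has no effect when "
            "optimizer_in_bwd=True; skipping optimizer step compilation."
        )
        return optimizer.step
    # Only reached when compilation is actually requested and applicable.
    import torch
    return torch.compile(optimizer.step)
```

The alternative raised in the thread would move this check into `__init__` and fail fast before model setup, trading convenience for an earlier, explicit error.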