Running on LUMI HPC? #2466
Unanswered
SilverSulfide asked this question in Q&A

Has anyone tried running axolotl fine-tuning on an AMD HPC that uses Slurm, in particular LUMI? If so, could you share your experience and any tips or suggestions? Context: I am currently pre-training a 30B model with GPT-NeoX on LUMI; it took a lot of work to configure that framework and it is still somewhat faulty, so a head start on any potential axolotl issues would be appreciated. Thanks!

Replies: 1 comment 3 replies

We have a contributor who added their workflow to the docs for AMD HPC: https://axolotl-ai-cloud.github.io/axolotl/docs/amd_hpc.html
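
For a rough starting point before reading those docs, here is a minimal sketch of what submitting an axolotl run to LUMI's Slurm scheduler might look like. The project account, partition, time limit, environment setup, and config path below are placeholder assumptions, not values taken from this thread or the linked guide; adapt them to your own allocation and the workflow described in the AMD HPC docs.

```python
"""Hypothetical sketch: generate and submit a Slurm batch script for an axolotl
fine-tuning run on LUMI. All site-specific values are placeholders."""
import subprocess
from pathlib import Path

ACCOUNT = "project_465000000"       # placeholder LUMI project account
CONFIG = "configs/my-finetune.yaml" # placeholder axolotl config file

SBATCH_SCRIPT = f"""#!/bin/bash
#SBATCH --job-name=axolotl-finetune
#SBATCH --account={ACCOUNT}
#SBATCH --partition=standard-g      # LUMI-G GPU partition
#SBATCH --nodes=1
#SBATCH --gpus-per-node=8           # 8 GCDs per LUMI-G node (4x MI250X)
#SBATCH --time=12:00:00

# Environment setup is site-specific (modules, container, or venv);
# see the AMD HPC docs linked above for the contributor's actual setup.
source /path/to/your/venv/bin/activate

# Standard axolotl entry point for a training run.
accelerate launch -m axolotl.cli.train {CONFIG}
"""

def main() -> None:
    # Write the batch script to disk and hand it to Slurm.
    script = Path("submit_axolotl.sh")
    script.write_text(SBATCH_SCRIPT)
    subprocess.run(["sbatch", str(script)], check=True)

if __name__ == "__main__":
    main()
```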