Using task.exitStatus with retries to delete large files in the failed process's work dir #5461
Unanswered
colindaven asked this question in Q&A
Replies: 2 comments
- For cleaning the work directory on the fly, please read about the nf-boost plugin here.
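  For reference, a minimal sketch of how the nf-boost plugin is typically enabled in nextflow.config. The option name (boost.cleanup) is an assumption based on the plugin's documentation and should be verified against the current nf-boost README:

  ```groovy
  // Sketch only: enabling nf-boost's on-the-fly work-dir cleanup in nextflow.config.
  // The 'boost.cleanup' option name is assumed from the plugin docs and should be
  // checked against the nf-boost README before relying on it.
  plugins {
      id 'nf-boost'
  }

  boost {
      cleanup = true   // delete a task's work dir once its outputs are no longer needed downstream
  }
  ```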
- I have created an issue on nf-boost to track this: bentsherman/nf-boost#5. I have not considered how best to handle cleanup of failed tasks; it depends on how much information users might want to keep around for debugging.
- Dear all,
If I use the new task.exitStatus, is it possible to delete large files (e.g. rm *.fastq) in the work directory of a failed process? For example, following a 130 out-of-memory error, can I easily delete the half-finished files in the work dir of the failed job using a closure?
I want to do this during the Nextflow run, since workflows with many failed processes can easily add up to 5-10 TB (so I guess afterScript is not applicable: https://www.nextflow.io/docs/latest/reference/process.html#afterscript ).
Might this idea, or a variation of it, work?
afterScript = { task.exitStatus == 130 ? "rm -rf ${task.workDir}/*" : "" }
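For context, this is roughly how that closure could sit alongside a retry strategy in nextflow.config. It is only a sketch of the idea above: the process selector name is a placeholder, and whether task.exitStatus is actually available to a dynamic afterScript at that point is exactly the open question here.

```groovy
// Sketch of the idea above, placed in nextflow.config (process scope).
// 'BIG_FASTQ_STEP' is a placeholder process name; whether task.exitStatus
// resolves correctly inside a dynamic afterScript like this is unverified.
process {
    withName: 'BIG_FASTQ_STEP' {
        errorStrategy = 'retry'
        maxRetries    = 2
        afterScript   = { task.exitStatus == 130 ? "rm -rf ${task.workDir}/*" : "" }
    }
}
```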
Thanks