Too many files open during parallelized workloads #637
Austin-s-h started this conversation in General
Replies: 1 comment
-
Thanks for reporting this. This is definitely an edge case, but I can imagine it being super difficult to debug. 64 cores/threads is not an unreasonably high number these days, and perhaps we should add this as a warning message to …
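As a minimal sketch of what such a pre-flight check might look like (hypothetical: `checkOpenFileLimit` and the per-thread file estimate are illustrative, not part of ArchR's API; it assumes a Unix-like shell where `ulimit -n` reports the soft limit):

```r
# Warn before launching parallel workers if the requested thread count
# could plausibly exceed the user's open-file ("nofile") limit.
# `filesPerThread` is a rough, hypothetical estimate of the temporary
# files each worker may hold open.
checkOpenFileLimit <- function(threads, filesPerThread = 100) {
  nofile <- suppressWarnings(
    as.integer(system("ulimit -n", intern = TRUE))
  )
  # Skip the check if the limit cannot be determined (e.g. "unlimited").
  if (length(nofile) == 0 || is.na(nofile)) {
    return(invisible(NULL))
  }
  if (threads * filesPerThread > nofile) {
    warning(sprintf(
      paste0(
        "Running with %d threads may exceed your open-file limit ",
        "(ulimit -n = %d). Consider raising 'nofile' in ",
        "/etc/security/limits.conf or reducing the thread count."
      ),
      threads, nofile
    ))
  }
  invisible(NULL)
}

# On the reporter's system (nofile = 4096), 62 threads would trigger the warning:
checkOpenFileLimit(threads = 62)
```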
-
Hi all, I'm enjoying exploring ArchR so far! One thing I ran into early on is that highly parallelized tools that operate on a file structure can easily hit user or OS open-file limits. For example, my user 'nofile' limit on an Ubuntu 18.04 system was only 4096.
When running ArchR on 62 cores, I opened more temporary files than this limit allowed (I'm not sure how many of the open files belonged to unrelated background tasks).
To resolve the issue, I updated the `/etc/security/limits.conf` file to set an open-file limit of 64,000 for all users on my system. After a restart, this resolved the problem. I understand this is specific to my setup, but others may run into it and not understand why parallel tasks fail at high core counts yet complete with fewer cores (which is how I discovered the problem). It may be beneficial for ArchR to emit a warning message to help with debugging parallelization issues. Thank you!
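For reference, the change described above corresponds to entries like the following in `/etc/security/limits.conf` (the 64,000 value is simply what worked on this system; you can check your current soft limit with `ulimit -n`, and new limits take effect only after logging back in or restarting):

```
# /etc/security/limits.conf
# <domain>  <type>  <item>   <value>
*           soft    nofile   64000
*           hard    nofile   64000
```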