Labels: stdlib (Standard Library Python modules in the Lib/ directory), topic-multiprocessing, type-bug (An unexpected behavior, bug, or error)
Description
Bug report
Bug description:
"Too many open files" can occur with multiprocessing.Queue, even if the protocol used to work with the queue is correct.
The file descriptor leak can be demonstrated with two small programs:
File "ko.py"
import os
import multiprocessing
pid = os.getpid()
cmd = f'lsof -p {pid} | grep -i pipe | wc -l'
cmdfull = f'lsof -p {pid} | grep -i pipe'
queues = {}
for i in range(100):
queues[i] = multiprocessing.Queue()
os.system(cmd)
queues[i].close()
queues[i].join_thread()
os.system(cmd)
os.system(cmdfull)File "ok.py"
File "ok.py":

import os
import multiprocessing

pid = os.getpid()
cmd = f'lsof -p {pid} | grep -i pipe | wc -l'   # count of open pipe descriptors
cmdfull = f'lsof -p {pid} | grep -i pipe'       # full listing of open pipe descriptors

queues = {}
for i in range(100):
    queues[i] = multiprocessing.Queue()
    queues[i].put('Hello')  # the only difference from ko.py
    os.system(cmd)
    queues[i].close()
    queues[i].join_thread()

os.system(cmd)
os.system(cmdfull)

The only difference between the two programs is the put('Hello') call in "ok.py": when each queue receives at least one item before close(), the pipe descriptors are released as expected, whereas in "ko.py" the number of open pipes reported by lsof keeps growing. An empty queue that remains empty is not abnormal usage (e.g., an error queue when no errors occur).
The workaround is to create a class extending multiprocessing.Queue, adding a close method that inserts an extra message into the queue before performing the actual close.
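A minimal sketch of such a wrapper, assuming it is built on multiprocessing.queues.Queue (the class name FlushingQueue, the None sentinel, and the make_queue helper are illustrative, not part of the original report):

import multiprocessing
import multiprocessing.queues


class FlushingQueue(multiprocessing.queues.Queue):
    """Queue that feeds itself one throwaway item before closing, so the
    feeder thread runs and its pipe end gets cleaned up."""

    def close(self):
        # Putting an item starts the feeder thread if it never ran;
        # consumers must be prepared to ignore this sentinel value.
        self.put(None)
        super().close()


def make_queue():
    # multiprocessing.Queue() is a factory function, not a class, so the
    # subclass is built on multiprocessing.queues.Queue and needs an
    # explicit context.
    return FlushingQueue(ctx=multiprocessing.get_context())


if __name__ == "__main__":
    q = make_queue()
    q.close()
    q.join_thread()

The extra sentinel means readers of such a queue must be able to tolerate (and skip) one additional item, so this workaround only fits queues whose consumers can be adapted accordingly.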
The issue has been reproduced on Linux and macOS, but it does not appear to be specific to any particular operating system.
CPython versions tested on:
3.12
Operating systems tested on:
Linux