I am using subprocess to call another program and save its return value to a variable. This process is repeated in a loop, and after a few thousand iterations the program crashes with the following error:
Traceback (most recent call last):
File "./extract_pcgls.py", line 96, in <module>
SelfE.append( CalSelfEnergy(i) )
File "./extract_pcgls.py", line 59, in CalSelfEnergy
p = subprocess.Popen(cmd, stdout=subprocess.PIPE, shell=True)
File "/usr/lib/python3.2/subprocess.py", line 745, in __init__
restore_signals, start_new_session)
File "/usr/lib/python3.2/subprocess.py", line 1166, in _execute_child
errpipe_read, errpipe_write = _create_pipe()
OSError: [Errno 24] Too many open files
Any ideas on how to solve this problem are much appreciated!
Code, as provided in a comment:
cmd = "enerCHARMM.pl -parram=x,xtop=topology_modified.rtf,xpar=lipid27_modified.par,nobuildall -out vdwaals {0}".format(cmtup[1])
p = subprocess.Popen(cmd, stdout=subprocess.PIPE, shell=True)
out, err = p.communicate()
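A note on the snippet above: each Popen call creates pipe file descriptors, and if any of them outlive an iteration they accumulate until the per-process limit is hit. Below is a minimal sketch of the same call pattern that lets Popen's context manager close the pipes before the next iteration; it assumes Python 3.2+ (the version in the traceback, where Popen supports the with statement), and the helper name is only illustrative:
import shlex
import subprocess

def cal_self_energy(cmd):
    # Illustrative helper mirroring the loop body from the question.
    # Passing an argument list (no shell=True) avoids spawning an extra
    # shell process for every iteration.
    with subprocess.Popen(shlex.split(cmd), stdout=subprocess.PIPE) as p:
        out, err = p.communicate()
    # Leaving the with-block closes the pipe descriptors and waits for the
    # child, so descriptors do not accumulate across loop iterations.
    return out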
Best Answer
On Mac OS X (El Capitan), check the current configuration:
#ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
file size (blocks, -f) unlimited
max locked memory (kbytes, -l) unlimited
max memory size (kbytes, -m) unlimited
open files (-n) 256
pipe size (512 bytes, -p) 1
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 709
virtual memory (kbytes, -v) unlimited
Set the open files value to 10K:
#ulimit -Sn 10000
Verify the result:
#ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
file size (blocks, -f) unlimited
max locked memory (kbytes, -l) unlimited
max memory size (kbytes, -m) unlimited
open files (-n) 10000
pipe size (512 bytes, -p) 1
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 709
virtual memory (kbytes, -v) unlimited
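The same limit can also be raised from inside the Python script, the equivalent of ulimit -Sn, which avoids having to change the shell configuration. A minimal sketch using the standard resource module (Unix only); note that only the soft limit can be raised, and never above the hard limit:
import resource

# Current soft and hard limits on open file descriptors for this process.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)

# Raise the soft limit toward 10000, but never above the hard limit
# (an unprivileged process cannot exceed it).
target = 10000
if hard != resource.RLIM_INFINITY:
    target = min(target, hard)
resource.setrlimit(resource.RLIMIT_NOFILE, (target, hard))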
Regarding python - Python Subprocess: Too Many Open Files, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/16526783/