Problem description
I have several matplotlib functions rolled into some django-celery tasks.

Every time the tasks are called, more RAM is dedicated to Python. Before long, Python is taking up all of the RAM.
Question: How can I release this memory?
Update 2 - second solution:
I asked a similar question specifically about the memory locked up when matplotlib errors, but I got a good answer to this question: .clf(), .close(), and gc.collect() aren't needed if you use multiprocessing to run the plotting function in a separate process, whose memory is automatically freed once the process ends.
Update - solution:
These StackOverflow posts suggested that I can release the memory used by matplotlib objects with the following commands:
.clf(): Matplotlib runs out of memory when plotting in a loop
.close(): Python matplotlib: memory not being released when specifying figure size
import gc
gc.collect()
Here is the example I used to test the solution:
import matplotlib
matplotlib.use('Agg')  # non-interactive backend; nothing is kept alive for a GUI
import matplotlib.pyplot as plt
import numpy as np
import gc

a = np.arange(1000000)
b = np.random.randn(1000000)

fig = plt.figure(num=1, dpi=100, facecolor='w', edgecolor='w')
fig.set_size_inches(10, 7)
ax = fig.add_subplot(111)
ax.plot(a, b)

# Release the memory once the figure is no longer needed.
fig.clf()      # clear the figure's contents
plt.close()    # close the figure and drop pyplot's reference to it
del a, b       # drop the large arrays
gc.collect()   # force a garbage-collection pass
Recommended answer
Did you try running your task function several times (in a for loop) to be sure that it isn't your function that leaks, regardless of celery? Also make sure django.settings.DEBUG is set to False (the connection object holds all queries in memory when DEBUG=True).
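That for-loop leak check can be done outside celery entirely with the standard-library tracemalloc module. This is a sketch under assumptions: my_task is a hypothetical stand-in for the real plotting task, and the growth threshold is arbitrary.

```python
import gc
import tracemalloc


def my_task():
    # Hypothetical stand-in for the real plotting task: allocates and
    # releases a large temporary, so it should not leak.
    data = [float(i) for i in range(100_000)]
    return sum(data)


tracemalloc.start()
my_task()  # warm-up call so one-time allocations (caches, imports) don't count
gc.collect()
baseline, _ = tracemalloc.get_traced_memory()

for _ in range(50):
    my_task()

gc.collect()
current, _ = tracemalloc.get_traced_memory()
growth = current - baseline
print(f"memory growth after 50 calls: {growth} bytes")
```

If growth keeps climbing with the number of iterations, the leak is in the function itself, not in celery or django.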