Problem description
I have several matplotlib functions rolled into some django-celery tasks.
Every time the tasks are called, more RAM is dedicated to Python. Before too long, Python is taking up all of the RAM.
Question: how can I release this memory?
Update 2 - Second solution:
I asked a similar question specifically about the memory locked up when matplotlib errors, and I got a good answer: .clf(), .close(), and gc.collect() aren't needed if you use multiprocess to run the plotting function in a separate process, whose memory will automatically be freed once the process ends.
Update - Solution:
These stackoverflow posts suggested that I can release the memory used by matplotlib objects with the following commands:
.clf(): Matplotlib runs out of memory when plotting in a loop
.close(): Python matplotlib: memory not being released when specifying figure size
import gc
gc.collect()
Here is the example I used to test the solution:
import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot as plt
from pylab import figure, savefig
import numpy as np
import gc
a = np.arange(1000000)
b = np.random.randn(1000000)
fig = plt.figure(num=1, dpi=100, facecolor='w', edgecolor='w')
fig.set_size_inches(10,7)
ax = fig.add_subplot(111)
ax.plot(a, b)
fig.clf()
plt.close()
del a, b
gc.collect()
Recommended answer
Did you try running your task function several times (in a for loop) to be sure that your function isn't leaking, regardless of celery? Make sure that django.settings.DEBUG is set to False (the connection object holds all queries in memory when DEBUG=True).
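One way to run such a leak check outside celery can be sketched as follows (the tracemalloc comparison and the stand-in task function are my own illustration, not part of the original answer; in a real test you would call your plotting task instead):

```python
import gc
import tracemalloc

def task():
    # Stand-in for the real plotting task: allocate a buffer and drop it.
    data = [0] * 100_000
    return len(data)

tracemalloc.start()

# Warm up, then snapshot the baseline allocation state.
for _ in range(5):
    task()
    gc.collect()
first = tracemalloc.take_snapshot()

# Run many more iterations; a leaking task would show steady growth here.
for _ in range(50):
    task()
    gc.collect()
second = tracemalloc.take_snapshot()

# If task() does not leak, the net allocated size should stay roughly flat.
growth = sum(stat.size_diff for stat in second.compare_to(first, 'lineno'))
print(f"net growth: {growth} bytes")
```

If the net growth keeps climbing as you raise the iteration count, the leak is in the task function itself rather than in celery.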