Problem description
Greets,
I'm having trouble getting my memory freed by Python; how can I force it to release the memory?
I've tried del and gc.collect() with no success.
Here is a code sample, parsing an XML file under Linux with Python 2.4
(same problem on Windows with 2.5, tried with the first example):
# Python interpreter memory usage: 1.1 Mb private, 1.4 Mb shared
# Using http://www.pixelbeat.org/scripts/ps_mem.py to get memory information
import cElementTree as ElementTree  # meminfo: 2.3 Mb private, 1.6 Mb shared
import gc  # no memory change
et = ElementTree.parse('primary.xml')  # meminfo: 34.6 Mb private, 1.6 Mb shared
del et  # no memory change
gc.collect()  # no memory change
So how can I free the 32.3 Mb taken by ElementTree?
The same problem occurs with a simple file.readlines():
# Python interpreter memory usage: 1.1 Mb private, 1.4 Mb shared
import gc  # no memory change
f = open('primary.xml')  # no memory change
data = f.readlines()  # meminfo: 12 Mb private, 1.4 Mb shared
del data  # meminfo: 11.5 Mb private, 1.4 Mb shared
gc.collect()  # no memory change
But it works fine with file.read():
# Python interpreter memory usage: 1.1 Mb private, 1.4 Mb shared
import gc  # no memory change
f = open('primary.xml')  # no memory change
data = f.read()  # meminfo: 7.3 Mb private, 1.4 Mb shared
del data  # meminfo: 1.1 Mb private, 1.4 Mb shared
gc.collect()  # no memory change
So as far as I can see, Python maintains a memory pool for lists.
In my first example, if I reparse the XML file, the memory doesn't grow very much (0.1 Mb, precisely), so I think I'm right about the memory pool.
But is there a way to force Python to release this memory?!
Regards,
FP
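For readers following along: the meminfo figures in the question come from the external ps_mem.py script. A minimal, Linux-only way to watch resident memory from inside the process itself (not part of the original post; `rss_kb` is a made-up helper that reads the same /proc data ps_mem.py uses) could look like this:

```python
import os


def rss_kb():
    """Return this process's resident set size in kB (Linux only).

    Reads VmRSS from /proc/self/status; ps_mem.py, used in the
    thread, inspects the same /proc data in more detail.
    """
    with open('/proc/self/status') as f:
        for line in f:
            if line.startswith('VmRSS:'):
                return int(line.split()[1])  # value is reported in kB


before = rss_kb()
data = ['x' * 80 for _ in range(200000)]  # lots of small string objects
after = rss_kb()
print('resident size grew by roughly %d kB' % (after - before))
del data  # on pre-2.5 interpreters this may not shrink VmRSS at all
```

Allocating and then deleting the list reproduces the readlines() symptom: the growth is easy to see, the shrink after `del` is not guaranteed.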
Recommended answer
AFAIK, no. But why does this matter, as long as the memory consumption doesn't grow constantly? The operating system's virtual memory management usually takes care that only actually used memory sits in physical RAM.
Ciao,
Marc 'BlackJack' Rintsch
Because I'm an adept of "small is beautiful". Of course the OS will swap out the unused memory if needed.
If I daemonize this application, it will keep a constant 40 Mb in use that is not yet free for other applications. If another application needs this memory, the OS will have to swap and lose time for the other application... And I'm not sure the system will swap this unused memory out first; it could also swap out another application first... AFAIK.
And these 40 Mb are only for a 7 Mb XML file; what about parsing a big one, say 50 Mb?
I would have preferred to have the choice of manually freeing this unused memory, or of manually setting the size of the memory pool.
Regards,
FP
This is from the 2.5 series release notes
(http://www.python.org/download/relea...5.1/NEWS.txt):
"[...]
- Patch #1123430: Python's small-object allocator now returns an arena to the system ``free()`` when all memory within an arena becomes unused again. Prior to Python 2.5, arenas (256KB chunks of memory) were never freed. Some applications will see a drop in virtual memory size now, especially long-running applications that, from time to time, temporarily use a large number of small objects. Note that when Python returns an arena to the platform C's ``free()``, there's no guarantee that the platform C library will in turn return that memory to the operating system. The effect of the patch is to stop making that impossible, and in tests it appears to be effective at least on Microsoft C and gcc-based systems. Thanks to Evan Jones for hard work and patience.
[...]"
So with 2.4 under Linux (as you tested) you will indeed not always get the used memory back when lots of small objects are collected.
The difference you see between f.read() and f.readlines() is therefore (I think) that the former reads the whole file in as one large string object (i.e. not a small object), while the latter returns a list of lines, where each line is a Python object.
I wonder how 2.5 would work out on Linux in this situation for you.
Paul
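Paul's large-object/small-object distinction can be made concrete: CPython routes allocation requests above a small fixed threshold (512 bytes in current versions; smaller in the 2.x era) straight to the C allocator instead of pymalloc's arenas, which is why the one big read() string came back to the OS while readlines()'s many small line objects stayed stuck in arenas on 2.4. A rough illustration of the two allocation shapes, using sys.getsizeof:

```python
import sys

# f.read() shape: one large string object, whose buffer is a single
# big allocation handled directly by the C allocator.
big = 'x' * (7 * 1024 * 1024)
print('one object of %d bytes' % sys.getsizeof(big))

# f.readlines() shape: many small string objects, each small enough
# to be served from pymalloc's 256 KB arenas (never returned to the
# system before Python 2.5).
lines = [('line %d' % i) for i in range(100000)]
print('%d objects of ~%d bytes each' % (len(lines), sys.getsizeof(lines[0])))
```

Each line object is far below the small-object threshold, so the whole list is carved out of arenas; the single big string is not.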