permutation中的内存错误

This article covers how to avoid a memory error when using itertools.permutations.

Problem Description

Firstly, I would like to mention that I have 3 GB of RAM.

I am working on an algorithm whose running time is exponential in the number of nodes, so my code contains:

perm = list(itertools.permutations(list(graph.Nodes)))  # graph.Nodes is a tuple of the integers 1, 2, ..., n

which generates all the permutations of the vertices in a list, so that I can then work on one permutation at a time.

However, when I run the program with 40 vertices, it gives a memory error.

Is there a simpler way to implement this so that I can generate all the permutations of the vertices without hitting this error?
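To see why the list() call cannot possibly fit in 3 GB, a quick back-of-the-envelope check (a standalone sketch, not the asker's code):

```python
import math

# For n vertices there are n! permutations; list() tries to hold all of
# them at once. For n = 40:
n = 40
count = math.factorial(n)
print(count)  # ~8.2e47 permutations

# Even at one byte per permutation that is ~8e47 bytes -- no machine,
# let alone 3 GB of RAM, can store the list.
assert count > 10 ** 47
```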

Recommended Answer

Try using the iterator returned by permutations instead of materializing it into a list:

perm_iterator = itertools.permutations(list(graph.Nodes))

for item in perm_iterator:
    do_the_stuff(item)

By doing this, Python keeps only the permutation currently in use in memory, not all of the permutations, which is far better in terms of memory usage.
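A minimal runnable sketch of this pattern (the node tuple and do_the_stuff are stand-ins for the asker's graph code):

```python
import itertools

nodes = (1, 2, 3)  # stand-in for graph.Nodes

def do_the_stuff(perm):
    # placeholder for the real per-permutation work
    return sum(perm)

# The iterator yields permutations one at a time; only the current
# tuple lives in memory, never the full n! collection.
count = 0
for perm in itertools.permutations(nodes):
    do_the_stuff(perm)
    count += 1

print(count)  # 6 == 3! permutations of three nodes
```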

On the other hand, even once the memory problem is solved, the time needed to process all the permutations still grows factorially with the number of vertices, so visiting all 40! of them remains infeasible in practice.
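If the algorithm can stop early (for example, once a good-enough permutation is found), the same iterator can be consumed partially; itertools.islice gives a bounded, lazy slice without ever building the full list. A sketch using a 40-node tuple like the one in the question:

```python
import itertools

nodes = tuple(range(1, 41))  # 40 vertices, as in the question

# Take just the first few permutations lazily; memory use stays tiny
# because no list of all 40! permutations is ever constructed.
first_three = list(itertools.islice(itertools.permutations(nodes), 3))
print(len(first_three))  # 3
```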

