Problem Description
I have a dictionary pickled on disk with a size of ~780 MB (on disk). However, when I load that dictionary into memory, its size swells unexpectedly to around 6 GB. Is there any way to keep the in-memory size close to the actual file size? (It would be fine if it took around 1 GB in memory, but 6 GB is strange behavior.) Is there a problem with the pickle module, or should I save the dictionary in some other format?
Here is how I am loading the file:
import pickle
with open('py_dict.pickle', 'rb') as file:
py_dict = pickle.load(file)
Any ideas, help, would be greatly appreciated.
Recommended Answer
If you're using pickle just for storing large values in a dictionary, or a very large number of keys, you should consider using shelve instead.
import shelve
s = shelve.open('shelve.bin')
s['a'] = 'value'
s.close()
This loads each key/value only as needed, keeping the rest on disk.
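A minimal sketch of this approach (the filename big_dict.bin is a hypothetical example): each key/value pair is written to disk as it is set, so the whole mapping never has to reside in memory at once, and reopening the shelf later pulls in only the entries you actually access.

```python
import shelve

# Write many entries; shelve persists each key/value to disk as it is set,
# so memory usage stays roughly constant regardless of the number of keys.
with shelve.open('big_dict.bin') as s:
    for i in range(1000):
        s[str(i)] = 'value_%d' % i

# Reopening the shelf and reading one key loads only that entry from disk,
# not the entire dictionary.
with shelve.open('big_dict.bin') as s:
    print(s['42'])  # prints value_42
```

Note that shelve keys must be strings, and by default mutations of a value object are not written back unless you reassign the key (or open the shelf with writeback=True, which trades memory for convenience).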