MemoryError hook in Python?

Problem description

Is there a way to globally trap MemoryError exceptions so that a library can clear out caches instead of letting a MemoryError be seen by user code?

I'm developing a memory caching library in Python that stores very large objects, to the point where it's common for users to want to use all available RAM to simplify their scripts and/or speed them up. I'd like to have a hook where the Python interpreter asks a callback function to release some RAM, as a way of avoiding a MemoryError being raised in user code.

OS: Solaris and/or Linux

Python: CPython 2.6.*

Edit: I'm looking for a mechanism that wouldn't be handled by an except block. If a MemoryError would occur anywhere in any code for any reason, I'd like the Python interpreter to first try a callback to release some RAM, so that the MemoryError exception is never generated. I don't control the code that would generate the errors, and I'd like my cache to be able to aggressively use as much RAM as it wants, automatically freeing RAM as the user code needs it.
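For contrast, the explicit per-call workaround the question wants to avoid looks something like the following sketch (the `cache` object and its `clear()` method are hypothetical stand-ins for the library's cache). It only protects code that is wrapped this way, and the MemoryError has already been raised by the time it reacts:

```python
def call_with_cache_relief(fn, cache, *args, **kwargs):
    """Call fn; on MemoryError, clear the cache and retry once.

    This is the per-call pattern the question is trying to avoid:
    every piece of user code would need to be wrapped explicitly.
    """
    try:
        return fn(*args, **kwargs)
    except MemoryError:
        cache.clear()  # drop cached objects to free RAM
        return fn(*args, **kwargs)  # retry once; may still fail
```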

Answer

This is not a good way to handle memory management. By the time you see a MemoryError, you're already in a critical state where the kernel is probably close to killing processes to free up memory, and on many systems you'll never see it at all, because the OS will swap or simply OOM-kill your process rather than fail the allocation.

The only recoverable case in which you're likely to see a MemoryError is after attempting a very large allocation that doesn't fit in the available address space, which is only common on 32-bit systems.

If you want a cache that frees memory as other allocations need it, it should interface not with errors but with the allocator itself. That way, when you need to release memory for an allocation, you know how much contiguous memory is needed; otherwise you're guessing blindly. It also means you can track memory allocations as they happen, so you can keep memory usage at a specific level rather than letting it grow unfettered and then trying to recover when it gets too high.
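One way to make the cache cooperate with an allocation budget instead of reacting to MemoryError is to account for the approximate size of each stored object and evict least-recently-used entries before a fixed byte limit is exceeded. A minimal sketch, using `sys.getsizeof` as a rough size estimate (it does not follow references, so real accounting would need more care):

```python
import sys
from collections import OrderedDict

class BoundedCache(object):
    """LRU cache that evicts entries to stay under a byte budget."""

    def __init__(self, max_bytes):
        self.max_bytes = max_bytes
        self._used = 0
        self._items = OrderedDict()  # key -> (value, size)

    def put(self, key, value):
        size = sys.getsizeof(value)  # rough estimate only
        if key in self._items:
            self._used -= self._items.pop(key)[1]
        # Evict least-recently-used entries until the new value fits.
        while self._items and self._used + size > self.max_bytes:
            _, (_, old_size) = self._items.popitem(last=False)
            self._used -= old_size
        self._items[key] = (value, size)
        self._used += size

    def get(self, key):
        value, size = self._items.pop(key)
        self._items[key] = (value, size)  # mark as most recently used
        return value
```

Because eviction happens at insertion time, memory usage stays below the budget by construction, rather than being recovered after an allocation has already failed.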

That said, I'd strongly suggest that for most applications this sort of caching behavior is overcomplicated; you're usually better off just using a fixed amount of memory for the cache.
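The fixed-budget approach is what the standard library's `functools.lru_cache` provides, bounded by entry count rather than bytes (it was added in Python 3.2, so the CPython 2.6 target in the question would need a third-party equivalent):

```python
from functools import lru_cache

@lru_cache(maxsize=2)  # keep at most 2 results cached
def expensive(n):
    expensive.calls += 1  # count actual computations
    return n * n

expensive.calls = 0
expensive(2); expensive(3)   # two misses, cache now full
expensive(2)                 # hit: no recomputation, 2 becomes MRU
expensive(4)                 # miss: evicts 3, the LRU entry
expensive(3)                 # miss again: 3 was evicted
```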

