This article looks at the question of whether relying on __del__() for cleanup in Python is unreliable; the question and the recommended answer below may serve as a useful reference.

Problem description

I was reading about different ways to clean up objects in Python, and I stumbled upon these questions (1, 2), which basically say that cleaning up using __del__() is unreliable and that code like the following should be avoided:

def __init__(self):
    rc.open()     # acquire the resource when the instance is created

def __del__(self):
    rc.close()    # rely on the finalizer to release it

The problem is, I'm using exactly this code, and I can't reproduce any of the issues cited in the questions above. As far as I know, I can't go for the alternative with the with statement, since I provide a Python module for a piece of closed-source software (testIDEA, anyone?). This software creates instances of particular classes and disposes of them, and these instances have to be ready to provide services in between. The only alternative to __del__() that I see is to manually call open() and close() as needed, which I assume will be quite bug-prone.
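For illustration, a minimal sketch of that manual alternative; the _RC stand-in, the Service class and the usage pattern are hypothetical, assuming a resource handle like the rc object in the snippet above:

# Hypothetical stand-in for the external resource handle used above.
class _RC:
    def open(self):
        print("resource opened")

    def close(self):
        print("resource closed")

rc = _RC()

class Service:
    def __init__(self):
        rc.open()          # acquire on construction, as in the original snippet

    def close(self):
        rc.close()         # the caller must remember to call this explicitly

service = Service()
try:
    pass                   # ... use the service ...
finally:
    service.close()        # every call site must repeat this, hence bug-prone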

I understand that when I close the interpreter, there's no guarantee that my objects will be destroyed correctly (and that doesn't bother me much; heck, even the Python authors decided it was OK). Apart from that, am I playing with fire by using __del__() for cleanup?

Recommended answer

You are observing the typical issue with finalizers in garbage-collected languages. Java has it, C# has it, and they all provide a scope-based cleanup mechanism, like Python's with keyword, to deal with it.

The main issue is that the garbage collector is responsible for cleaning up and destroying objects. In C++, an object is destroyed when it goes out of scope, so you can use RAII and have well-defined semantics. In Python, an object that goes out of scope lives on for as long as the GC likes. This differs between Python implementations: CPython, with its refcounting-based GC, is rather benign (so you rarely see issues), while PyPy, IronPython and Jython might keep an object alive for a very long time.

For example:

def bad_code(filename):
    # The file object is never closed explicitly; cleanup is left to the GC.
    return open(filename, 'r').read()

for i in xrange(10000):            # xrange: this example assumes Python 2
    bad_code('some_file.txt')

bad_code leaks a file handle. In CPython it doesn't matter: the refcount drops to zero and the file object is deleted right away. In PyPy or IronPython you might get IOErrors or similar issues, as you exhaust all available file descriptors (up to ulimit on Unix, or 509 handles on Windows).
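For contrast, a minimal illustrative rewrite of bad_code that closes the file deterministically on every implementation:

def good_code(filename):
    # The with statement closes the file as soon as the block exits,
    # regardless of which Python implementation runs the code.
    with open(filename, 'r') as f:
        return f.read()

for i in xrange(10000):            # again assuming Python 2, as above
    good_code('some_file.txt')     # no handles accumulate between iterations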

Scope-based cleanup with a context manager and with is preferable if you need to guarantee cleanup: you know exactly when your objects will be finalized. But sometimes you cannot enforce this kind of scoped cleanup easily. That's when you might use __del__, atexit or similar constructs to make a best effort at cleaning up. It is not reliable, but it is better than nothing.
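For illustration, the open/close pair from the question could be wrapped in a context manager; this is a minimal sketch reusing the hypothetical rc handle from the earlier example:

class ManagedService:
    def __enter__(self):
        rc.open()              # acquire when the with block is entered
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        rc.close()             # release even if the block raises

with ManagedService() as service:
    pass                       # ... use the service; rc is closed when the block exits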

You can either burden your users with explicit cleanup or enforce explicit scopes, or you can take the gamble with __del__ and see some oddities now and then (especially at interpreter shutdown).
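As a best-effort fallback of the kind mentioned above, the standard atexit module can register the cleanup call; again a sketch assuming the hypothetical rc handle:

import atexit

rc.open()
atexit.register(rc.close)      # runs at normal interpreter shutdown, but is not
                               # guaranteed if the process is killed abruptly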

That concludes this look at whether relying on __del__() for cleanup in Python is unreliable; hopefully the recommended answer above is helpful.
