Is .data still useful in PyTorch?

Question

I'm new to PyTorch. I have read a lot of PyTorch code that makes heavy use of a tensor's .data member, but searching for .data in the official documentation and on Google turns up very little. I guess .data contains the data in the tensor, but when do we need it, and when not?

Answer

.data was an attribute of Variable (an object that wrapped a Tensor while tracking its history, e.g. for automatic updates), not of Tensor itself. In practice, .data gave access to the Variable's underlying Tensor.

However, since PyTorch version 0.4.0, Variable and Tensor have been merged (into an updated Tensor structure), so .data disappeared along with the old Variable object (Variable is still there for backward compatibility, but it is deprecated).
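
As a minimal sketch of what the merge means in practice (the shapes and values here are just illustrative), the old Variable wrapper and a plain tensor created with requires_grad=True are now the same thing:

    import torch
    from torch.autograd import Variable  # kept only for backward compatibility

    # Pre-0.4.0 style: wrap a Tensor in a Variable to get history tracking
    x_old = Variable(torch.ones(3), requires_grad=True)

    # 0.4.0+ style: the Tensor itself supports autograd, no wrapper needed
    x_new = torch.ones(3, requires_grad=True)

    # Variable() now simply returns a Tensor, so the two are the same type
    print(type(x_old))  # <class 'torch.Tensor'>
    print(type(x_new))  # <class 'torch.Tensor'>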

Paragraph from the release notes for version 0.4.0 (I recommend reading the whole section about the Variable/Tensor updates):

What about .data?

.data was the primary way to get the underlying Tensor from a Variable. After this merge, calling y = x.data still has similar semantics. So y will be a Tensor that shares the same data with x, is unrelated with the computation history of x, and has requires_grad=False.
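
A minimal sketch of these semantics (with arbitrary values): y shares x's storage but is cut off from autograd:

    import torch

    x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
    y = x.data

    print(y.requires_grad)               # False: y is detached from autograd
    print(y.data_ptr() == x.data_ptr())  # True: y shares x's underlying storage

    y[0] = 10.0  # writes through to x, but autograd never sees this change
    print(x)     # tensor([10., 2., 3.], requires_grad=True)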

However, .data can be unsafe in some cases. Any changes on x.data wouldn't be tracked by autograd, and the computed gradients would be incorrect if x is needed in a backward pass. A safer alternative is to use x.detach(), which also returns a Tensor that shares data with requires_grad=False, but will have its in-place changes reported by autograd if x is needed in backward.
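
That warning can be reproduced with a small sketch; sigmoid is used here because its backward pass reuses the forward output, so corrupting out corrupts the gradient:

    import torch

    # Unsafe: a change through .data is invisible to autograd,
    # so backward() silently produces wrong gradients
    a = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
    out = a.sigmoid()
    out.data.zero_()   # overwrites out in place; autograd is not informed
    out.sum().backward()
    print(a.grad)      # tensor([0., 0., 0.]) -- wrong, sigmoid's grad uses out

    # Safer: the same change through .detach() is caught at backward time
    b = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
    out = b.sigmoid()
    out.detach().zero_()  # shares storage, but bumps out's version counter
    try:
        out.sum().backward()
    except RuntimeError as err:
        print(err)  # "... modified by an inplace operation"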
