The preferred way to copy a tensor in PyTorch


Problem description

There seem to be several ways to create a copy of a tensor in PyTorch, including

y = tensor.new_tensor(x)          # a

y = x.clone().detach()            # b

y = torch.empty_like(x).copy_(x)  # c

y = torch.tensor(x)               # d

b is explicitly preferred over a and d. Why is it preferred? Performance? I'd argue it is less readable.

Are there any reasons for/against using c?

Solution

TL;DR


Use .clone().detach() (or, preferably, .detach().clone()), as it is slightly faster and explicit in what it does.
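As a quick sanity check, here is a minimal sketch of what .detach().clone() gives you: an independent copy that shares no memory with the source and does not track gradients.

```python
import torch

# Verify that .detach().clone() yields an independent copy
# that is cut off from the autograd graph.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x.detach().clone()

y[0] = 100.0                          # modifying the copy...
print(x[0].item())                    # ...leaves the original untouched: 1.0
print(y.requires_grad)                # False: the copy does not track gradients
print(y.data_ptr() == x.data_ptr())   # False: the copy owns separate storage
```

Note that .detach() alone returns a view sharing the same storage; it is the subsequent .clone() that allocates new memory, which is why the combination gives a fully independent, gradient-free copy.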


Using perfplot, I plotted the timing of various methods to copy a PyTorch tensor.

y = tensor.new_tensor(x) # method a

y = x.clone().detach() # method b

y = torch.empty_like(x).copy_(x) # method c

y = torch.tensor(x) # method d

y = x.detach().clone() # method e

The x-axis is the dimension of the tensor created, and the y-axis shows the time. The graph is on a linear scale. As you can clearly see, tensor() and new_tensor() take more time compared to the other three methods.

Note: In multiple runs, I noticed that any of b, c, and e can have the lowest time. The same is true for a and d. But methods b, c, and e consistently have lower timings than a and d.
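As a side note, recent PyTorch versions actively discourage method d: copy-constructing with torch.tensor(x) emits a UserWarning recommending clone().detach() instead. A small sketch (the exact warning text may vary by version, so only the warning category is checked here):

```python
import warnings
import torch

x = torch.randn(4)

# torch.tensor(sourceTensor) emits a UserWarning in recent PyTorch
# versions, recommending sourceTensor.clone().detach() instead.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    y = torch.tensor(x)  # method d

print(torch.equal(x, y))                                          # the copy has the same values
print(any(issubclass(w.category, UserWarning) for w in caught))   # a UserWarning was raised
```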

import torch
import perfplot

perfplot.show(
    setup=lambda n: torch.randn(n),
    kernels=[
        lambda a: a.new_tensor(a),
        lambda a: a.clone().detach(),
        lambda a: torch.empty_like(a).copy_(a),
        lambda a: torch.tensor(a),
        lambda a: a.detach().clone(),
    ],
    labels=["new_tensor()", "clone().detach()", "empty_like().copy()", "tensor()", "detach().clone()"],
    n_range=[2 ** k for k in range(15)],
    xlabel="len(a)",
    logx=False,
    logy=False,
    title='Timing comparison for copying a pytorch tensor',
)

