I am looking for a way to reshape a tensor in TensorFlow. I have a tensor containing sequences of rows, and I want to reshape it so that all the rows of a given sequence end up on a single row of the reshaped tensor.

The difficulty is that the sequences have different lengths. In the example below, I know that a sequence has at most 3 rows: the first sequence has 2 rows, the second 3 rows, and the third 1 row.

#Data Tensor
[
[1,1,1],
[2,2,2],
[4,4,4],
[5,5,5],
[6,6,6],
[7,7,7]]

#To be reshaped into
[
[1,1,1,2,2,2,0,0,0],
[4,4,4,5,5,5,6,6,6],
[7,7,7,0,0,0,0,0,0]]

#Argument could be of the form: rows to pad
[1 0 2]

#Or its complementary: sequence length
[2 3 1]


Does anyone know how to do this?

One way would be to insert zero rows at the right places in the initial tensor and then use a simple tf.reshape. But I don't know how to insert those zero rows.
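(For reference, a minimal sketch of that first idea, assuming the lengths are available as plain Python ints: each existing row is scattered with tf.scatter_nd into a zero-initialized (num_sequences, max_len, width) buffer, so the missing rows stay zero, and the buffer is then flattened with a plain reshape. The helper name pad_rows is just illustrative.)

import tensorflow as tf

def pad_rows(data, lengths, max_len):
    # (sequence index, offset within its sequence) for every row of `data`
    indices = [[i, j] for i, n in enumerate(lengths) for j in range(n)]
    shape = tf.stack([len(lengths), max_len, tf.shape(data)[1]])
    padded = tf.scatter_nd(tf.constant(indices), data, shape)  # unfilled rows stay zero
    return tf.reshape(padded, [len(lengths), -1])

data = tf.constant([[x, x, x] for x in [1, 2, 4, 5, 6, 7]])
with tf.Session() as sess:
    print(sess.run(pad_rows(data, [2, 3, 1], 3)))
# [[1 1 1 2 2 2 0 0 0]
#  [4 4 4 5 5 5 6 6 6]
#  [7 7 7 0 0 0 0 0 0]]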

Another way would be to do it directly while reshaping, but I don't know how to do that either.

Best answer

This should do it, and it is easy to extend (e.g. to different kinds of padding). Let me know if it works as expected!

import tensorflow as tf

def split_and_pad_tensor(tensor, lengths):
    """
    Input: a rank-2 tensor of shape (A, B) and a collection of lengths that
    sum to A (otherwise tf.split crashes).
    The tensor is split into len(lengths) tensors of the given lengths, and
    each split is then zero-padded at the bottom until all of them have
    max(lengths) rows. The output is a rank-2 tensor of shape
    (len(lengths), B*max(lengths)).
    """
    num_splits, max_length = len(lengths), max(lengths)
    splitted = tf.split(tensor, lengths, 0)
    # pad's second argument is [[rows before, rows after], [cols before, cols after]]
    padded = tf.stack([tf.pad(s, [[0, max_length-l], [0, 0]]) for l, s in zip(lengths, splitted)])
    # flatten the last two axes:
    return tf.reshape(padded, [num_splits, tf.shape(tensor)[1]*max_length])

# make some data and test with different valid inputs:
DATA = tf.constant([[x, x, x] for x in [1, 2, 4, 5, 6, 7]])
with tf.Session() as sess:
    for lengths in ([4, 2], [2, 3, 1], [2, 2, 1, 1]):
        print(sess.run(split_and_pad_tensor(DATA, lengths)))


Output:

[[1 1 1 2 2 2 4 4 4 5 5 5]
 [6 6 6 7 7 7 0 0 0 0 0 0]]
[[1 1 1 2 2 2 0 0 0]
 [4 4 4 5 5 5 6 6 6]
 [7 7 7 0 0 0 0 0 0]]
[[1 1 1 2 2 2]
 [4 4 4 5 5 5]
 [6 6 6 0 0 0]
 [7 7 7 0 0 0]]




Pure-TF version with placeholders:

The following code has the same functionality as above, but the inputs are placeholders, and a tf.map_fn + tf.gather combination is used to make it fully dynamic:

import tensorflow as tf

class SplitAndPadGraph(object):
    def __init__(self):
        # minimal assumptions on the placeholders' shapes
        data_ph = tf.placeholder(tf.float32, shape=[None, None])
        lengths_ph = tf.placeholder(tf.int32, shape=[None])
        # extract information about input shapes
        data_len = tf.shape(data_ph)[0]
        out_dim0 = tf.shape(lengths_ph)[0]
        out_dim1 = tf.reduce_max(lengths_ph)
        out_dim2 = tf.shape(data_ph)[-1]
        # create a [[x,y,z], ...] tensor, where x=start_idx, y=length, z=pad_size
        start_idxs = tf.concat([[0], tf.cumsum(lengths_ph)], 0)[:-1]
        pads = tf.fill([out_dim0], out_dim1)-lengths_ph
        reconstruction_metadata = tf.stack([start_idxs, lengths_ph, pads], axis=1)
        # pass the xyz tensor to map_fn to create a tensor with the proper indexes.
        # then gather the indexes from data_ph and reshape
        reconstruction_data = tf.map_fn(lambda x: tf.concat([tf.range(x[0],x[0]+x[1]),
                                                             tf.fill([x[2]], data_len)],
                                                            0), reconstruction_metadata)
        output = tf.gather(tf.concat([data_ph, tf.zeros((1,out_dim2))], 0),
                           tf.reshape(reconstruction_data, [out_dim0*out_dim1]))
        output = tf.reshape(output, [out_dim0, out_dim1*out_dim2])
        # graph interface to access input and output nodes from outside
        self.data_ph = data_ph
        self.lengths_ph = lengths_ph
        self.output = output

DATA = [[x,x,x] for x in [1,2,4,5,6,7]]
g = SplitAndPadGraph()
with tf.Session() as sess:
    for lengths in [[4,2], [2,3,1], [2,2,1,1]]:
        print "lengths =", lengths
        print sess.run(g.output, feed_dict={g.data_ph:DATA, g.lengths_ph:lengths})
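
To see what the graph builds, take lengths = [2, 3, 1] with the 6-row DATA above (so data_len = 6 and out_dim1 = 3): start_idxs is [0, 2, 5] and pads is [1, 0, 2], so reconstruction_metadata is [[0, 2, 1], [2, 3, 0], [5, 1, 2]]. tf.map_fn turns these rows into the gather indices [[0, 1, 6], [2, 3, 4], [5, 6, 6]], where index 6 points at the all-zero row concatenated below data_ph; the final gather and reshape then produce the padded output.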


Cheers!
Andres

Regarding python - Tensorflow: reshaping a tensor and zero-padding the end of some rows, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/49409488/
