How to create a variable-length input LSTM in Keras


Problem description

I am trying to do some vanilla pattern recognition with an LSTM using Keras to predict the next element in a sequence.

My data looks like this:

where the label of a training sequence is the last element in the list: X_train['Sequence'][n][-1].

Because my Sequence column can have a variable number of elements in the sequence, I believe an RNN to be the best model to use. Below is my attempt to build an LSTM in Keras:

# Build the model
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense, Activation

# A few arbitrary constants...
max_features = 20000
out_size = 128

# The max length should be the length of the longest sequence (minus one to account for the label)
max_length = X_train['Sequence'].apply(len).max() - 1

# Normal LSTM model construction with sigmoid activation
model = Sequential()
model.add(Embedding(max_features, out_size, input_length=max_length, dropout=0.2))
model.add(LSTM(128, dropout_W=0.2, dropout_U=0.2))
model.add(Dense(1))
model.add(Activation('sigmoid'))

# try using different optimizers and different optimizer configs
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

And here's how I attempt to train my model:

# Train the model
import numpy as np

for seq in X_train['Sequence']:
    print("Length of training is {0}".format(len(seq[:-1])))
    print("Training set is {0}".format(seq[:-1]))
    model.fit(np.array([seq[:-1]]), [seq[-1]])

My output looks like this:

Length of training is 13
Training set is [1, 3, 13, 87, 1053, 28576, 2141733, 508147108, 402135275365, 1073376057490373, 9700385489355970183, 298434346895322960005291, 31479360095907908092817694945]

However, I get the following error:

Exception: Error when checking model input: expected embedding_input_1 to have shape (None, 347) but got array with shape (1, 13)

I believe my training step is correctly set up, so my model construction must be wrong. Note that 347 is max_length.
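For context, the shape mismatch happens because the Embedding layer was built with input_length=max_length, so the model expects every input batch to already be padded to that fixed width. A minimal numpy sketch of left-padding (the same idea as Keras's pad_sequences helper; the max_length of 5 here is just for illustration, not the 347 from the post) shows the shape the model is asking for:

```python
import numpy as np

def pad_left(seqs, max_length):
    """Left-pad variable-length integer sequences with zeros to a fixed width."""
    out = np.zeros((len(seqs), max_length), dtype=np.int64)
    for i, seq in enumerate(seqs):
        out[i, max_length - len(seq):] = seq
    return out

batch = pad_left([[1, 3, 13], [1, 2]], max_length=5)
print(batch.shape)  # (2, 5) -- matches the (None, max_length) shape the model declares
print(batch)
```

With padding like this, a length-13 sequence would become a length-347 row and the fit call would pass the input check, but the question explicitly prefers to avoid padding, which is what the answer below addresses.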

How can I correctly build a variable-length input LSTM in Keras? I'd prefer not to pad the data. Not sure if it's relevant, but I'm using the Theano backend.

Recommended answer

I am not clear about the embedding procedure, but here is a way to implement a variable-length input LSTM regardless: simply do not specify the timespan dimension when building the LSTM.

import numpy as np
import keras.backend as K
from keras.layers import LSTM, Input

I = Input(shape=(None, 200))  # unknown timespan, fixed feature size
lstm = LSTM(20)
f = K.function(inputs=[I], outputs=[lstm(I)])

data1 = np.random.random(size=(1, 100, 200))  # batch_size = 1, timespan = 100
print(f([data1])[0].shape)
# (1, 20)

data2 = np.random.random(size=(1, 314, 200))  # batch_size = 1, timespan = 314
print(f([data2])[0].shape)
# (1, 20)
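The reason this works: a recurrent layer applies the same step function once per timestep and returns only the final hidden state, so the output size depends on the number of units, not on the sequence length. A toy numpy recurrence makes this concrete (a plain tanh RNN rather than a full LSTM, with made-up weight matrices, just to illustrate the shape behavior):

```python
import numpy as np

rng = np.random.default_rng(0)
feature_size, units = 200, 20
W = rng.normal(size=(feature_size, units))  # input-to-hidden weights
U = rng.normal(size=(units, units))         # hidden-to-hidden weights

def run_rnn(x):
    """Apply one tanh recurrence step per timestep; x has shape (T, feature_size)."""
    h = np.zeros(units)
    for x_t in x:                # one step per timestep, however many there are
        h = np.tanh(x_t @ W + h @ U)
    return h                     # final state: shape (units,) regardless of T

print(run_rnn(rng.normal(size=(100, feature_size))).shape)  # (20,)
print(run_rnn(rng.normal(size=(314, feature_size))).shape)  # (20,)
```

This is why the K.function above can accept a timespan of 100 or 314 and still emit a (1, 20) result, although with shape (None, 200) every sequence within a single batch must still share one length, which in practice means batch_size = 1 for truly unpadded data.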
