Problem description
I have followed the steps in https://machinelearningmastery.com/return-sequences-and-return-states-for-lstms-in-keras/, but when it comes to the Bidirectional LSTM, I tried this:
lstm, state_h, state_c = Bidirectional(LSTM(128, return_sequences=True, return_state=True))(input)
but it doesn't work. Is there some way to get both the final hidden state and the full sequence from an LSTM layer when using the Bidirectional wrapper?
Recommended answer
Calling Bidirectional(LSTM(128, return_sequences=True, return_state=True))(input) returns 5 tensors:
- The entire sequence of hidden states; by default this is the concatenation of the forward and backward states.
- The last hidden state h of the forward LSTM
- The last cell state c of the forward LSTM
- The last hidden state h of the backward LSTM
- The last cell state c of the backward LSTM
The line you've posted raises an error because it tries to unpack the returned value into just three variables (lstm, state_h, state_c).
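The failure itself is plain Python tuple unpacking, independent of Keras; a minimal stand-in tuple with five placeholder strings reproduces it:

```python
# Unpacking 5 values into 3 names fails immediately:
returned = ("seq", "fh", "fc", "bh", "bc")  # stand-in for the 5 returned tensors
try:
    lstm, state_h, state_c = returned
except ValueError as err:
    print(err)  # too many values to unpack (expected 3)
```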
To correct it, simply unpack the returned value into 5 variables. If you want to merge the states, you can concatenate the forward and backward states with a Concatenate layer.
from tensorflow.keras.layers import LSTM, Bidirectional, Concatenate

lstm, forward_h, forward_c, backward_h, backward_c = Bidirectional(LSTM(128, return_sequences=True, return_state=True))(input)
state_h = Concatenate()([forward_h, backward_h])
state_c = Concatenate()([forward_c, backward_c])
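As a self-contained check, the sketch below (assuming TensorFlow 2.x with its bundled Keras; the batch size, 7 timesteps, and 10 features are arbitrary example values) runs the corrected unpacking eagerly and inspects the shapes of the returned tensors:

```python
import tensorflow as tf
from tensorflow.keras.layers import LSTM, Bidirectional, Concatenate

# Arbitrary example batch: 1 sample, 7 timesteps, 10 features.
x = tf.zeros((1, 7, 10))

lstm, forward_h, forward_c, backward_h, backward_c = Bidirectional(
    LSTM(128, return_sequences=True, return_state=True)
)(x)

# Merge the per-direction final states into single tensors.
state_h = Concatenate()([forward_h, backward_h])
state_c = Concatenate()([forward_c, backward_c])

print(lstm.shape)       # (1, 7, 256): forward and backward outputs concatenated
print(forward_h.shape)  # (1, 128): forward last hidden state
print(state_h.shape)    # (1, 256): merged final hidden state
```

Note that the sequence output's last dimension is 256 (2 × 128) because the wrapper's default merge mode concatenates the two directions.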