Question
I would like to apply layer normalization to a recurrent neural network using tf.keras. In TensorFlow 2.0, there is a LayerNormalization class in tf.layers.experimental, but it's unclear how to use it within a recurrent layer like LSTM at each time step (as it was designed to be used). Should I create a custom cell, or is there a simpler way?
For example, applying dropout at each time step is as easy as setting the recurrent_dropout argument when creating an LSTM layer, but there is no recurrent_layer_normalization argument.
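For reference, the dropout case is a one-liner (a minimal sketch; the layer size and rate below are arbitrary):

import tensorflow as tf

# recurrent_dropout drops units of the recurrent state at every time step
lstm = tf.keras.layers.LSTM(32, recurrent_dropout=0.2, return_sequences=True)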
Answer
TensorFlow Addons provides a pre-built LayerNormLSTMCell out of the box.
See the tensorflow-addons documentation for more details. You may have to install tensorflow-addons before you can import this cell:
pip install tensorflow-addons
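A minimal sketch of using the cell might look like this: it wraps the cell in a generic tf.keras.layers.RNN layer so it is unrolled over the time dimension (the layer sizes and input shape here are placeholders):

import tensorflow as tf
import tensorflow_addons as tfa

# LayerNormLSTMCell applies layer normalization inside the cell,
# so activations are normalized at every time step
cell = tfa.rnn.LayerNormLSTMCell(20)

model = tf.keras.Sequential([
    # tf.keras.layers.RNN unrolls the cell over the time dimension
    tf.keras.layers.RNN(cell, return_sequences=True, input_shape=(None, 10)),
    tf.keras.layers.Dense(1),
])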