Question
I read an example of using LSTM with Conv1D. (Taken from: CNN LSTM)
Conv1D(filters=64, kernel_size=1, activation='relu')
- I understand that the dimension of the convolution is 1 (one dim of size 1).
- What is the value of the convolution? (What is the value of the 1*1 matrix?)
- I can't figure out what filters=64 is. What does it mean?
- Does the relu activation function work on the output of the convolution? (From what I read it seems like that, but I'm not sure.)
- What is the motivation to use a convolution with kernel_size = 1, as we do here?
Answer
filters

filters = 64 means the number of separate filters used is 64. Each filter outputs 1 channel; i.e., here 64 filters operate on the input to produce 64 different channels (or vectors). Hence the filters parameter determines the number of output channels.
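A minimal sketch (assuming TensorFlow/Keras, which the Conv1D call above suggests) that checks this: each filter produces one output channel, so the output's channel count equals filters.

import tensorflow as tf

x = tf.random.normal((1, 10, 100))   # (batch, steps, in_channels)
y = tf.keras.layers.Conv1D(filters=64, kernel_size=1)(x)
print(y.shape)                       # (1, 10, 64): one output channel per filter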
kernel_size

kernel_size determines the size of the convolution window. Suppose kernel_size = 1; then each kernel has dimension in_channels x 1, so each kernel weight is an in_channels x 1 dimensional tensor.
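A short sketch of this, again assuming Keras (which stores the full kernel as kernel_size x in_channels x filters, so each filter's in_channels x 1 weight shows up under a length-1 leading axis):

import tensorflow as tf

layer = tf.keras.layers.Conv1D(filters=64, kernel_size=1)
layer.build((None, 10, 100))         # 100 input channels
kernel, bias = layer.get_weights()
print(kernel.shape)                  # (1, 100, 64): per filter, an in_channels x 1 weight
print(bias.shape)                    # (64,): one bias per filter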
activation

activation = 'relu' means that the relu activation is applied to the output of the convolution operation.
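A sketch to confirm that ordering, assuming Keras: a Conv1D with activation='relu' gives the same result as a linear Conv1D followed by relu when both use the same weights.

import tensorflow as tf

x = tf.random.normal((1, 10, 100))
fused = tf.keras.layers.Conv1D(64, kernel_size=1, activation='relu')
linear = tf.keras.layers.Conv1D(64, kernel_size=1)          # no activation
_ = fused(x)                                                # build the weights
linear.build(x.shape)
linear.set_weights(fused.get_weights())                     # copy the same weights
diff = tf.reduce_max(tf.abs(fused(x) - tf.nn.relu(linear(x))))
print(float(diff))                                          # 0.0: relu acts on the conv output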
A convolution with kernel_size = 1 is used to reduce the number of depth channels while applying a non-linearity. It does something like a weighted average across the channels while keeping the receptive field.
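One way to see that "weighted average across channels" view (my own illustration, not from the original answer): a Conv1D with kernel_size=1 is the same operation as a Dense layer applied independently at every time step, mixing only the channel dimension.

import tensorflow as tf

x = tf.random.normal((1, 10, 100))
conv = tf.keras.layers.Conv1D(64, kernel_size=1)
dense = tf.keras.layers.Dense(64)
_ = conv(x)                                   # build conv weights
dense.build(x.shape)
kernel, bias = conv.get_weights()             # kernel shape: (1, 100, 64)
dense.set_weights([kernel[0], bias])          # drop the length-1 window axis
diff = tf.reduce_max(tf.abs(conv(x) - dense(x)))
print(float(diff))                            # ~0: per-step channel mixing, nothing more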
In your example (filters = 64, kernel_size = 1, activation = relu): suppose the input feature map has size 100 x 10 (100 channels). Then the layer weights will have dimension 64 x 100 x 1, and the output size will be 64 x 10.
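Checking those numbers in code, assuming Keras (which is channels-last, so the channels-first 100 x 10 map is written as shape (10, 100), and the 64 x 100 x 1 weight is stored as (1, 100, 64)):

import tensorflow as tf

x = tf.random.normal((1, 10, 100))            # 100-channel input of length 10
layer = tf.keras.layers.Conv1D(filters=64, kernel_size=1, activation='relu')
y = layer(x)
print(layer.get_weights()[0].shape)           # (1, 100, 64): the 64 x 100 x 1 weight
print(y.shape)                                # (1, 10, 64): the 64 x 10 output, plus batch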