Did I get the comments right? As described below, are those the 5 layers of my model?

Model

    # input - conv - conv - linear - linear(fc)
    def model(data): # input Layer

        # 1 conv Layer
        conv = tf.nn.conv2d(data, layer1_weights, [1, 2, 2, 1], padding='SAME')
        hidden = tf.nn.relu(conv + layer1_biases) # Activation function

        # 1 conv Layer
        conv = tf.nn.conv2d(hidden, layer2_weights, [1, 2, 2, 1], padding='SAME')
        hidden = tf.nn.relu(conv + layer2_biases) # Activation function

        # not a layer ( just reshape)
        shape = hidden.get_shape().as_list()
        reshape = tf.reshape(hidden, [shape[0], shape[1] * shape[2] * shape[3]])

        # 1 linear layer - not fc due to relu
        hidden = tf.nn.relu(tf.matmul(reshape, layer3_weights) + layer3_biases)

        # 1 linear fully connected layer
        return tf.matmul(hidden, layer4_weights) + layer4_biases

Best answer

    # 1 linear layer - not fc due to relu
    hidden = tf.nn.relu(tf.matmul(reshape, layer3_weights) + layer3_biases)


This one is a fully connected layer whose output is passed through a ReLU activation function. The layer itself is this part of the code:

    tf.matmul(reshape, layer3_weights) + layer3_biases


and you are passing that layer through the relu activation function:

    tf.nn.relu(tf.matmul(reshape, layer3_weights) + layer3_biases)
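
To make that concrete, here is a minimal, self-contained sketch (the tensor values and shapes below are made up for illustration and are not from the question) showing that tf.nn.relu is just an element-wise activation applied to the output of the fully connected layer, so it adds no trainable parameters of its own:

    import tensorflow as tf  # assuming TensorFlow 1.x, as in the question

    # Illustrative values only: a "batch" of 2 examples with 3 features,
    # and a fully connected layer mapping 3 features to 2 units.
    reshape = tf.constant([[1.0, -2.0, 3.0],
                           [0.5,  0.0, -1.0]])
    layer3_weights = tf.constant([[ 1.0, -1.0],
                                  [ 0.5,  0.5],
                                  [-1.0,  2.0]])
    layer3_biases = tf.constant([0.1, -0.1])

    linear = tf.matmul(reshape, layer3_weights) + layer3_biases  # the layer (weights + biases)
    hidden = tf.nn.relu(linear)                                  # element-wise activation, no new weights

    with tf.Session() as sess:
        print(sess.run(linear))  # may contain negative values
        print(sess.run(hidden))  # same shape, negatives clamped to zero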


Other than that, everything looks fine.
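
For reference, here is one way the model above could be wired up end to end. This is only a sketch under assumed sizes (the 28x28 grayscale input, 5x5 filters, 16/32 feature maps, 64 hidden units, and 10 classes are illustrative choices, not values from the question), and it reuses the model() function defined in the question. Counting the trainable weight/bias pairs gives two convolutional layers and two fully connected layers; the reshape in between has no parameters and is not a layer.

    import tensorflow as tf  # assuming TensorFlow 1.x, as in the question

    # Assumed sizes for illustration only.
    batch_size, image_size, num_channels, num_labels = 16, 28, 1, 10

    layer1_weights = tf.Variable(tf.truncated_normal([5, 5, num_channels, 16], stddev=0.1))
    layer1_biases = tf.Variable(tf.zeros([16]))
    layer2_weights = tf.Variable(tf.truncated_normal([5, 5, 16, 32], stddev=0.1))
    layer2_biases = tf.Variable(tf.zeros([32]))
    # Two stride-2 'SAME' convolutions reduce 28x28 to 14x14 and then 7x7.
    layer3_weights = tf.Variable(tf.truncated_normal([7 * 7 * 32, 64], stddev=0.1))
    layer3_biases = tf.Variable(tf.zeros([64]))
    layer4_weights = tf.Variable(tf.truncated_normal([64, num_labels], stddev=0.1))
    layer4_biases = tf.Variable(tf.zeros([num_labels]))

    data = tf.placeholder(tf.float32, shape=[batch_size, image_size, image_size, num_channels])
    logits = model(data)  # model() as defined in the question; logits has shape [16, 10]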

A similar question about identifying the layers of a convolutional neural network in Python can be found on Stack Overflow: https://stackoverflow.com/questions/47536405/
