
Problem Description

I am trying to implement a neural network with ReLU.

input layer -> 1 hidden layer -> ReLU -> output layer -> softmax layer

Above is the architecture of my neural network. I am confused about the backpropagation of this ReLU. For the derivative of ReLU, if x <= 0 the output is 0, and if x > 0 the output is 1. So when I calculate the gradient, does that mean I kill gradient descent if x <= 0?

Can someone explain the backpropagation of my neural network architecture step by step?

Recommended Answer

The ReLU function is defined as: for x > 0 the output is x, i.e. f(x) = max(0, x).

So the derivative f'(x) is actually:

if x < 0, the output is 0; if x > 0, the output is 1.

The derivative f'(0) is not defined, so in practice it is usually set to 0, or the activation function is modified to f(x) = max(e, x) for some small e.
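As a minimal sketch of the two formulas above (assuming NumPy, and using the f'(0) = 0 convention just mentioned):

import numpy as np

def relu(x):
    # Rectifier: f(x) = max(0, x), applied element-wise
    return np.maximum(0.0, x)

def relu_grad(x):
    # Derivative: 1 where x > 0, 0 where x < 0,
    # and by convention 0 at the undefined point x = 0
    return (x > 0).astype(float)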

Generally: a ReLU is a unit that uses the rectifier activation function. That means it works exactly like any other hidden layer, except that instead of tanh(x), sigmoid(x), or whatever activation you would otherwise use, you use f(x) = max(0, x).

If you have already written code for a working multilayer network with sigmoid activation, it is literally a one-line change. Nothing about forward or backward propagation changes algorithmically. If you haven't got the simpler model working yet, go back and start with that first. Otherwise your question isn't really about ReLU but about implementing a neural network as a whole.
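For the "step by step" part of the question, here is a rough sketch (illustrative only, not taken from the original answer) of one forward and backward pass for the architecture described above: hidden layer -> ReLU -> output layer -> softmax, with a cross-entropy loss. The names x, y_onehot, W1, b1, W2, b2 are placeholders invented for this example:

import numpy as np

def softmax(z):
    # Numerically stable softmax over each row
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def forward_backward(x, y_onehot, W1, b1, W2, b2):
    # ---- forward pass ----
    z1 = x @ W1 + b1            # hidden layer pre-activation
    h  = np.maximum(0.0, z1)    # ReLU
    z2 = h @ W2 + b2            # output layer pre-activation
    p  = softmax(z2)            # class probabilities

    # ---- backward pass (mean cross-entropy loss) ----
    n   = x.shape[0]
    dz2 = (p - y_onehot) / n    # gradient at the softmax input
    dW2 = h.T @ dz2
    db2 = dz2.sum(axis=0)

    dh  = dz2 @ W2.T            # gradient flowing back into the ReLU
    dz1 = dh * (z1 > 0)         # ReLU derivative: gradient passes only where z1 > 0
    dW1 = x.T @ dz1
    db1 = dz1.sum(axis=0)

    loss = -np.sum(y_onehot * np.log(p + 1e-12)) / n
    return loss, (dW1, db1, dW2, db2)

Note that the mask (z1 > 0) does not kill gradient descent as a whole: it only zeroes the gradient for those hidden units whose pre-activation was <= 0 on this particular input, while every other unit still receives a gradient.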

