Neural network initialization - Nguyen Widrow implementation?

Problem description

I've had a go at implementing the Nguyen Widrow algorithm (below) and it appears to function correctly, but I have some follow-on questions:

  • Does this look like a correct implementation?

  • Does Nguyen Widrow initialization apply to any network topology/size? (i.e. a 5-layer AutoEncoder)

  • Is Nguyen Widrow initialization valid for any input range? (0/1, -1/+1, etc.)

  • Is Nguyen Widrow initialization valid for any activation function? (i.e. Logistic, Tanh, Linear)
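
For reference, the scale factor computed by the code below is the one given in the Nguyen-Widrow paper: with N network inputs and H hidden neurons,

        \beta = 0.7 \cdot H^{1/N}, \qquad
        w_{jk} \leftarrow \frac{\beta \, w_{jk}}{\lVert \mathbf{w}_j \rVert}, \qquad
        b_j \leftarrow \frac{\beta \, b_j}{\lVert \mathbf{w}_j \rVert}

where \lVert \mathbf{w}_j \rVert is the Euclidean norm of neuron j's randomized weight vector. Note that the code below also folds the bias into this norm, whereas descriptions of the algorithm usually set the bias to a uniform random value in [-\beta, +\beta] instead.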

The code below assumes that the network's weights have already been randomized to -1/+1:

        ' Calculate the number of hidden neurons (total neurons minus input and output neurons)
        Dim HiddenNeuronsCount As Integer = Me.TotalNeuronsCount - (Me.InputsCount + Me.OutputsCount)

        ' Calculate the Beta value for all hidden layers
        Dim Beta As Double = (0.7 * Math.Pow(HiddenNeuronsCount, (1.0 / Me.InputsCount)))

        ' Loop through each layer in neural network, skipping input layer
        For i As Integer = 1 To Layers.GetUpperBound(0)

            ' Loop through each neuron in layer
            For j As Integer = 0 To Layers(i).Neurons.GetUpperBound(0)

                Dim InputsNorm As Double = 0

                ' Loop through each weight in neuron inputs, add weight value to InputsNorm
                For k As Integer = 0 To Layers(i).Neurons(j).ConnectionWeights.GetUpperBound(0)
                    InputsNorm += Layers(i).Neurons(j).ConnectionWeights(k) * Layers(i).Neurons(j).ConnectionWeights(k)
                Next

                ' Add bias value to InputsNorm
                InputsNorm += Layers(i).Neurons(j).Bias * Layers(i).Neurons(j).Bias

                ' Finalize euclidean norm calculation
                InputsNorm = Math.Sqrt(InputsNorm)

                ' Loop through each weight in neuron inputs, scale the weight based on euclidean norm and beta
                For k As Integer = 0 To Layers(i).Neurons(j).ConnectionWeights.GetUpperBound(0)
                    Layers(i).Neurons(j).ConnectionWeights(k) = (Beta * Layers(i).Neurons(j).ConnectionWeights(k)) / InputsNorm
                Next

                ' Scale the bias based on euclidean norm and beta
                Layers(i).Neurons(j).Bias = (Beta * Layers(i).Neurons(j).Bias) / InputsNorm

            Next

        Next
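
As a side note, a minimal sketch of the pre-randomization step the code above assumes is shown below; the use of Randomize()/Rnd() and the loop structure are illustrative assumptions, since the original post does not show this step:

        ' Seed the pseudo-random number generator
        Randomize()

        ' Assign every weight and bias a uniform random value in [-1, +1]
        For i As Integer = 1 To Layers.GetUpperBound(0)
            For j As Integer = 0 To Layers(i).Neurons.GetUpperBound(0)
                For k As Integer = 0 To Layers(i).Neurons(j).ConnectionWeights.GetUpperBound(0)
                    Layers(i).Neurons(j).ConnectionWeights(k) = (2.0 * Rnd()) - 1.0
                Next
                Layers(i).Neurons(j).Bias = (2.0 * Rnd()) - 1.0
            Next
        Next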

Recommended answer

Nguyen & Widrow in their paper assume that the inputs are between -1 and +1. Nguyen Widrow initialization is valid for any activation function whose active region is finite in length (i.e. one that saturates, such as Logistic or Tanh). Again, in their paper they only talk about a 2-layer NN; not sure about a 5-layer one.
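
If the training data is not already in the -1 to +1 range, it can be rescaled first; below is a minimal sketch, where the function name ScaleToUnitRange and its parameters are illustrative and not part of the answer:

        ' Linearly rescale Value from [OldMin, OldMax] into [-1, +1]
        Function ScaleToUnitRange(ByVal Value As Double, ByVal OldMin As Double, ByVal OldMax As Double) As Double
            Return (2.0 * (Value - OldMin) / (OldMax - OldMin)) - 1.0
        End Function

For example, ScaleToUnitRange(0.5, 0.0, 1.0) returns 0.0 and ScaleToUnitRange(255.0, 0.0, 255.0) returns 1.0.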
