This article looks at a very simple Keras binary classifier whose loss stays constant during training, and explains how to fix it.

Problem description

I'm trying to get a very (over)simplified Keras binary classifier neural network running, without success. The loss just stays constant. So far I've played around with optimizers (SGD, Adam, RMSProp), learning rates, weight initializations, batch size and input data normalization.

Nothing changes at all. Am I doing something fundamentally wrong? Here is the code:

import numpy as np
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.optimizers import SGD

data = np.array(
    [
        [100,35,35,12,0],
        [101,46,35,21,0],
        [130,56,46,3412,1],
        [131,58,48,3542,1]
    ]
)

# Features are columns 1-3; the label is the last column
x = data[:,1:-1]
y_target = data[:,-1]

# Scale the features by the norm of the whole matrix
x = x / np.linalg.norm(x)

model = Sequential()
model.add(Dense(3, input_shape=(3,), activation='softmax', kernel_initializer='lecun_normal',
                bias_initializer='lecun_normal'))
model.add(Dense(1, activation='softmax', kernel_initializer='lecun_normal',
                bias_initializer='lecun_normal'))

model.compile(optimizer=SGD(learning_rate=0.1),
              loss='binary_crossentropy',
              metrics=['accuracy'])

model.fit(x, y_target, batch_size=2, epochs=10,
          verbose=1)

Recommended answer

Softmax is defined as:

exp(a) / sum(exp(a))

So when you use it with a single neuron, you get:

exp(a) / exp(a) = 1

That is why your classifier doesn't work with a single neuron.
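
A quick numerical check makes this concrete (a minimal sketch, not part of the original question; it assumes TensorFlow 2.x with eager execution):

import tensorflow as tf

# Softmax over a single output unit: the normalising sum contains only that
# unit, so the result is exactly 1 for any logit value.
logits = tf.constant([[0.3], [-2.0], [5.7]])      # a batch of three 1-unit outputs
print(tf.nn.softmax(logits, axis=-1).numpy())     # -> [[1.] [1.] [1.]]

Since every prediction is pinned to 1.0 regardless of the input, the gradient through the output activation is zero, so the optimizer has nothing to work with and the loss never moves.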

You can use sigmoid instead in this special case:

exp(a) / (exp(a) + 1)

Furthermore, the sigmoid function is for two-class classifiers. Softmax is an extension of sigmoid for multiclass classifiers.
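
The connection is easy to verify numerically: with two classes and logits [a, 0], the softmax probability of the first class is exp(a) / (exp(a) + 1), which is exactly the sigmoid of a. A small NumPy check (not from the original answer, just an illustration):

import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def softmax(v):
    e = np.exp(v - np.max(v))          # shift by the max for numerical stability
    return e / e.sum()

a = 1.7
print(sigmoid(a))                      # ~0.8455
print(softmax(np.array([a, 0.0]))[0])  # same value: two-class softmax == sigmoid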

For the first layer you should use relu or sigmoid instead of softmax.
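
Putting the advice together, a corrected version of the model could look like the sketch below (an illustration only, reusing the x and y_target arrays from the question; relu in the hidden layer and a single sigmoid output paired with binary_crossentropy):

from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.optimizers import SGD

model = Sequential()
# Hidden layer: relu (or sigmoid) instead of softmax
model.add(Dense(3, input_shape=(3,), activation='relu',
                kernel_initializer='lecun_normal', bias_initializer='lecun_normal'))
# Single output unit: sigmoid, so binary_crossentropy gets a real probability
model.add(Dense(1, activation='sigmoid',
                kernel_initializer='lecun_normal', bias_initializer='lecun_normal'))

model.compile(optimizer=SGD(learning_rate=0.1),
              loss='binary_crossentropy',
              metrics=['accuracy'])

model.fit(x, y_target, batch_size=2, epochs=10, verbose=1)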

