Problem Description
When using the following Keras network to train and classify 9 classes:
from keras.models import Model
from keras.layers import Convolution1D, Input, Dropout, GlobalMaxPooling1D, Dense, merge

# MEANLEN, W2VLEN, num_categories and f1_score are defined elsewhere in the asker's code
input_window3 = Input(shape=(MEANLEN, W2VLEN))
input_window4 = Input(shape=(MEANLEN, W2VLEN))

# Two parallel convolution branches with different window sizes, each followed by dropout and global max pooling
conv_w3 = Convolution1D(MEANLEN*2, 3, activation='tanh', border_mode='valid')(input_window3)
drop_w3 = Dropout(0.7)(conv_w3)
pool_w3 = GlobalMaxPooling1D(name='pool_w3')(drop_w3)

conv_w4 = Convolution1D(MEANLEN, 5, activation='tanh', border_mode='valid')(input_window4)
drop_w4 = Dropout(0.7)(conv_w4)
pool_w4 = GlobalMaxPooling1D(name='pool_w4')(drop_w4)
print(conv_w4.shape)

# Concatenate the two pooled branches and classify
x = merge([pool_w3, pool_w4], mode='concat', concat_axis=1)
print(x.shape)
x = Dense(MEANLEN*3, activation='relu')(x)
drop_dense = Dropout(0.5)(x)
main_output = Dense(num_categories, activation='sigmoid', name='main_output')(drop_dense)

model = Model(input=[input_window3, input_window4], output=[main_output])
model.compile(optimizer='adam', loss='mse', metrics=['accuracy', f1_score])
and predicting with:
result = model.predict([X_test, X_test])
it returns arrays of vectors similar to these:
array([[ 0.08401331, 0.1911521 , 0.14310306, 0.07138534, 0.19428432,
0.15808958, 0.16400988, 0.27708355, 0.09983496],
[ 0.02074078, 0.08897329, 0.03244834, 0.00112842, 0.04122255,
0.03494435, 0.17535761, 0.55671334, 0.04375785],
[ 0.04897207, 0.06169643, 0.00313113, 0.002085 , 0.00275023,
0.00131959, 0.09961601, 0.56414878, 0.02338091]], dtype=float32)
The values in these arrays, which I assume to be class probabilities, do not sum to 1. How can I get class probabilities?
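A quick check makes this concrete (a minimal sketch, assuming result holds the predictions shown above):

# Hypothetical sanity check: if these outputs were proper class probabilities,
# each row would sum to 1, but for the three rows shown they do not.
print(result.sum(axis=1))   # approx. [1.38, 1.00, 0.81]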
Recommended Answer
Based on the array that you posted, you have 9 categories. In that case, you should replace your final activation function: use softmax instead of sigmoid. In addition, if you haven't done so yet, you need to transform your labels into one-hot vectors; you can do that with to_categorical. Finally, as a loss function you should use categorical_crossentropy instead of mse. A tutorial on using Keras for classification (covering the functions mentioned above) is provided here.
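A minimal sketch of those changes, applied to the model definition from the question (assuming y_train holds integer labels 0-8; X_train3 and X_train4 are hypothetical training arrays; the Keras 1.x API from the question is kept):

from keras.utils.np_utils import to_categorical

# Assumption: y_train contains integer class labels in the range 0..8
y_train_onehot = to_categorical(y_train, num_categories)   # shape (n_samples, 9)

# Softmax output so the 9 class scores form a probability distribution
main_output = Dense(num_categories, activation='softmax', name='main_output')(drop_dense)

model = Model(input=[input_window3, input_window4], output=[main_output])
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])

# X_train3 / X_train4 are hypothetical placeholders for the two input arrays
model.fit([X_train3, X_train4], y_train_onehot, nb_epoch=10, batch_size=32)

With a softmax output and categorical_crossentropy loss, each row returned by model.predict sums to 1 and can be read directly as class probabilities.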