My model's accuracy quickly climbs to 94.3%, but then stays there for all of the following epochs.
Here are my model and code:

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout
from keras.optimizers import SGD

model = Sequential()
model.add(Conv2D(5, (3, 3), strides=(2, 2), kernel_initializer='normal', activation='sigmoid', input_shape=(dim, dim, 3)))
model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
model.add(Conv2D(5, (3, 3), strides=(2, 2), activation='sigmoid'))
model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))

# Create the feature vector
model.add(Flatten())
model.add(Dense(12288, activation='sigmoid'))
model.add(Dropout(0.2))
model.add(Dense(1536, activation='sigmoid'))
model.add(Dropout(0.3))
model.add(Dense(384, activation='sigmoid'))
model.add(Dropout(0.4))
model.add(Dense(1, activation='sigmoid'))

# Note: passing the string "sgd" to compile() makes Keras build a default SGD
# optimizer, so the custom instance defined below is never actually used.
sgd = SGD(lr=0.001, momentum=0.9)
model.compile(loss="binary_crossentropy", optimizer="sgd", metrics=["accuracy"])
model.fit(data, labels, epochs=20, batch_size=100, callbacks=callbacks_list, verbose=1)
CNN_output = model.predict(data)


The training output looks like this:

[screenshot: CNN training output]

Examining the CNN's output (via the predictions), I get the following (note that this is only a sample):

ACTUAL: train_0:
[ 1.]
PREDICTION: train_0:
[ 0.]
ACTUAL: train_1:
[ 0.]
PREDICTION: train_1:
[ 0.]
ACTUAL: train_2:
[ 0.]
PREDICTION: train_2:
[ 0.]
ACTUAL: train_3:
[ 0.]
PREDICTION: train_3:
[ 0.]
ACTUAL: train_4:
[ 0.]
PREDICTION: train_4:
[ 0.]
ACTUAL: train_5:
[ 1.]
PREDICTION: train_5:
[ 0.]
ACTUAL: train_6:
[ 0.]
PREDICTION: train_6:
[ 0.]
ACTUAL: train_7:
[ 1.]
PREDICTION: train_7:
[ 0.]
ACTUAL: train_8:
[ 0.]
PREDICTION: train_8:
[ 0.]
ACTUAL: train_9:
[ 0.]
PREDICTION: train_9:
[ 0.]
ACTUAL: train_10:
[ 0.]
PREDICTION: train_10:
[ 0.]
ACTUAL: train_11:
[ 0.]
PREDICTION: train_11:
[ 0.]
ACTUAL: train_12:
[ 0.]
PREDICTION: train_12:
[ 0.]
ACTUAL: train_13:
[ 0.]
PREDICTION: train_13:
[ 0.]
ACTUAL: train_14:
[ 0.]
PREDICTION: train_14:
[ 0.]
ACTUAL: train_15:
[ 0.]
PREDICTION: train_15:
[ 0.]
ACTUAL: train_16:
[ 0.]
PREDICTION: train_16:
[ 0.]
ACTUAL: train_17:
[ 0.]
PREDICTION: train_17:
[ 0.]
ACTUAL: train_18:
[ 0.]
PREDICTION: train_18:
[ 0.]
ACTUAL: train_19:
[ 0.]
PREDICTION: train_19:
[ 0.]
ACTUAL: train_20:
[ 0.]
PREDICTION: train_20:
[ 0.]
ACTUAL: train_21:
[ 0.]
PREDICTION: train_21:
[ 0.]
ACTUAL: train_22:
[ 0.]
PREDICTION: train_22:
[ 0.]
ACTUAL: train_23:
[ 0.]
PREDICTION: train_23:
[ 0.]
ACTUAL: train_24:
[ 0.]
PREDICTION: train_24:
[ 0.]
ACTUAL: train_25:
[ 0.]
PREDICTION: train_25:
[ 0.]
ACTUAL: train_26:
[ 0.]
PREDICTION: train_26:
[ 0.]
ACTUAL: train_27:
[ 0.]
PREDICTION: train_27:
[ 0.]
ACTUAL: train_28:
[ 0.]
PREDICTION: train_28:
[ 0.]
ACTUAL: train_29:
[ 0.]
PREDICTION: train_29:
[ 0.]
ACTUAL: train_30:
[ 0.]
PREDICTION: train_30:
[ 0.]
ACTUAL: train_31:
[ 0.]
PREDICTION: train_31:
[ 0.]
ACTUAL: train_32:
[ 0.]
PREDICTION: train_32:
[ 0.]
ACTUAL: train_33:
[ 0.]
PREDICTION: train_33:
[ 0.]
ACTUAL: train_34:
[ 0.]
PREDICTION: train_34:
[ 0.]
ACTUAL: train_35:
[ 0.]
PREDICTION: train_35:
[ 0.]
ACTUAL: train_36:
[ 0.]
PREDICTION: train_36:
[ 0.]
ACTUAL: train_37:
[ 0.]
PREDICTION: train_37:
[ 0.]
ACTUAL: train_38:
[ 0.]
PREDICTION: train_38:
[ 0.]
ACTUAL: train_39:
[ 0.]
PREDICTION: train_39:
[ 0.]
ACTUAL: train_40:
[ 0.]
PREDICTION: train_40:
[ 0.]
ACTUAL: train_41:
[ 0.]
PREDICTION: train_41:
[ 0.]
ACTUAL: train_42:
[ 0.]
PREDICTION: train_42:
[ 0.]
ACTUAL: train_43:
[ 1.]
PREDICTION: train_43:
[ 0.]
ACTUAL: train_44:
[ 0.]
PREDICTION: train_44:
[ 0.]
ACTUAL: train_45:
[ 0.]
PREDICTION: train_45:
[ 0.]
ACTUAL: train_46:
[ 0.]
PREDICTION: train_46:
[ 0.]
ACTUAL: train_47:
[ 0.]
PREDICTION: train_47:
[ 0.]
ACTUAL: train_48:
[ 0.]
PREDICTION: train_48:
[ 0.]
ACTUAL: train_49:
[ 0.]
PREDICTION: train_49:
[ 0.]
ACTUAL: train_50:
[ 0.]
PREDICTION: train_50:
[ 0.]
ACTUAL: train_51:
[ 0.]
PREDICTION: train_51:
[ 0.]
ACTUAL: train_52:
[ 0.]
PREDICTION: train_52:
[ 0.]
ACTUAL: train_53:
[ 0.]
PREDICTION: train_53:
[ 0.]
ACTUAL: train_54:
[ 0.]
PREDICTION: train_54:
[ 0.]
ACTUAL: train_55:
[ 0.]
PREDICTION: train_55:
[ 0.]
ACTUAL: train_56:
[ 0.]
PREDICTION: train_56:
[ 0.]
ACTUAL: train_57:
[ 0.]
PREDICTION: train_57:
[ 0.]
ACTUAL: train_58:
[ 0.]
PREDICTION: train_58:
[ 0.]
ACTUAL: train_59:
[ 0.]
PREDICTION: train_59:
[ 0.]
ACTUAL: train_60:
[ 1.]
PREDICTION: train_60:
[ 0.]
ACTUAL: train_61:
[ 0.]
PREDICTION: train_61:
[ 0.]
ACTUAL: train_62:
[ 0.]
PREDICTION: train_62:
[ 0.]
ACTUAL: train_63:
[ 0.]
PREDICTION: train_63:
[ 0.]
ACTUAL: train_64:
[ 0.]
PREDICTION: train_64:
[ 0.]
ACTUAL: train_65:
[ 0.]
PREDICTION: train_65:
[ 0.]
ACTUAL: train_66:
[ 0.]
PREDICTION: train_66:
[ 0.]
ACTUAL: train_67:
[ 0.]
PREDICTION: train_67:
[ 0.]
ACTUAL: train_68:
[ 0.]
PREDICTION: train_68:
[ 0.]
ACTUAL: train_69:
[ 0.]
PREDICTION: train_69:
[ 0.]
ACTUAL: train_70:
[ 0.]
PREDICTION: train_70:
[ 0.]
ACTUAL: train_71:
[ 0.]
PREDICTION: train_71:
[ 0.]
ACTUAL: train_72:
[ 0.]
PREDICTION: train_72:
[ 0.]
ACTUAL: train_73:
[ 0.]
PREDICTION: train_73:
[ 0.]
ACTUAL: train_74:
[ 0.]
PREDICTION: train_74:
[ 0.]
ACTUAL: train_75:
[ 0.]
PREDICTION: train_75:
[ 0.]
ACTUAL: train_76:
[ 0.]
PREDICTION: train_76:
[ 0.]
ACTUAL: train_77:
[ 0.]
PREDICTION: train_77:
[ 0.]
ACTUAL: train_78:
[ 0.]
PREDICTION: train_78:
[ 0.]
ACTUAL: train_79:
[ 0.]
PREDICTION: train_79:
[ 0.]
ACTUAL: train_80:
[ 0.]
PREDICTION: train_80:
[ 0.]
ACTUAL: train_81:
[ 0.]
PREDICTION: train_81:
[ 0.]
ACTUAL: train_82:
[ 0.]
PREDICTION: train_82:
[ 0.]
ACTUAL: train_83:
[ 0.]
PREDICTION: train_83:
[ 0.]
ACTUAL: train_84:
[ 0.]
PREDICTION: train_84:
[ 0.]
ACTUAL: train_85:
[ 0.]
PREDICTION: train_85:
[ 0.]
ACTUAL: train_86:
[ 0.]
PREDICTION: train_86:
[ 0.]
ACTUAL: train_87:
[ 0.]
PREDICTION: train_87:
[ 0.]
ACTUAL: train_88:
[ 0.]
PREDICTION: train_88:
[ 0.]
ACTUAL: train_89:
[ 0.]
PREDICTION: train_89:
[ 0.]
ACTUAL: train_90:
[ 0.]
PREDICTION: train_90:
[ 0.]
ACTUAL: train_91:
[ 0.]
PREDICTION: train_91:
[ 0.]
ACTUAL: train_92:
[ 0.]
PREDICTION: train_92:
[ 0.]
ACTUAL: train_93:
[ 0.]
PREDICTION: train_93:
[ 0.]
ACTUAL: train_94:
[ 0.]
PREDICTION: train_94:
[ 0.]
ACTUAL: train_95:
[ 0.]
PREDICTION: train_95:
[ 0.]
ACTUAL: train_96:
[ 0.]
PREDICTION: train_96:
[ 0.]
ACTUAL: train_97:
[ 0.]
PREDICTION: train_97:
[ 0.]
ACTUAL: train_98:
[ 0.]
PREDICTION: train_98:
[ 0.]
ACTUAL: train_99:
[ 0.]
PREDICTION: train_99:
[ 0.]
ACTUAL: train_100:
[ 0.]
PREDICTION: train_100:
[ 0.]
ACTUAL: train_101:
[ 0.]
PREDICTION: train_101:
[ 0.]
ACTUAL: train_102:
[ 0.]
PREDICTION: train_102:
[ 0.]
ACTUAL: train_103:
[ 0.]
PREDICTION: train_103:
[ 0.]
ACTUAL: train_104:
[ 1.]
PREDICTION: train_104:
[ 0.]
ACTUAL: train_105:
[ 0.]
PREDICTION: train_105:
[ 0.]
ACTUAL: train_106:
[ 0.]
PREDICTION: train_106:
[ 0.]
ACTUAL: train_107:
[ 0.]
PREDICTION: train_107:
[ 0.]
ACTUAL: train_108:
[ 0.]
PREDICTION: train_108:
[ 0.]
ACTUAL: train_109:
[ 0.]
PREDICTION: train_109:
[ 0.]
ACTUAL: train_110:
[ 0.]
PREDICTION: train_110:
[ 0.]
ACTUAL: train_111:
[ 0.]
PREDICTION: train_111:
[ 0.]
ACTUAL: train_112:
[ 0.]
PREDICTION: train_112:
[ 0.]
ACTUAL: train_113:
[ 0.]
PREDICTION: train_113:
[ 0.]
ACTUAL: train_114:
[ 0.]
PREDICTION: train_114:
[ 0.]
ACTUAL: train_115:
[ 0.]
PREDICTION: train_115:
[ 0.]
ACTUAL: train_116:
[ 0.]
PREDICTION: train_116:
[ 0.]
ACTUAL: train_117:
[ 0.]
PREDICTION: train_117:
[ 0.]
ACTUAL: train_118:
[ 0.]
PREDICTION: train_118:
[ 0.]
ACTUAL: train_119:
[ 0.]
PREDICTION: train_119:
[ 0.]
ACTUAL: train_120:
[ 0.]
PREDICTION: train_120:
[ 0.]
ACTUAL: train_121:
[ 0.]
PREDICTION: train_121:
[ 0.]
ACTUAL: train_122:
[ 0.]
PREDICTION: train_122:
[ 0.]
ACTUAL: train_123:
[ 0.]
PREDICTION: train_123:
[ 0.]
ACTUAL: train_124:
[ 0.]
PREDICTION: train_124:
[ 0.]
ACTUAL: train_125:
[ 0.]
PREDICTION: train_125:
[ 0.]
ACTUAL: train_126:
[ 0.]
PREDICTION: train_126:
[ 0.]
ACTUAL: train_127:
[ 0.]
PREDICTION: train_127:
[ 0.]
ACTUAL: train_128:
[ 0.]
PREDICTION: train_128:
[ 0.]
ACTUAL: train_129:
[ 0.]
PREDICTION: train_129:
[ 0.]
ACTUAL: train_130:
[ 0.]
PREDICTION: train_130:
[ 0.]
ACTUAL: train_131:
[ 0.]
PREDICTION: train_131:
[ 0.]
ACTUAL: train_132:
[ 1.]
PREDICTION: train_132:
[ 0.]
ACTUAL: train_133:
[ 1.]
PREDICTION: train_133:
[ 0.]
ACTUAL: train_134:
[ 0.]
PREDICTION: train_134:
[ 0.]
ACTUAL: train_135:
[ 0.]
PREDICTION: train_135:
[ 0.]
ACTUAL: train_136:
[ 1.]
PREDICTION: train_136:
[ 0.]
ACTUAL: train_137:
[ 0.]
PREDICTION: train_137:
[ 0.]
ACTUAL: train_138:
[ 0.]
PREDICTION: train_138:
[ 0.]
ACTUAL: train_139:
[ 0.]
PREDICTION: train_139:
[ 0.]
ACTUAL: train_140:
[ 0.]
PREDICTION: train_140:
[ 0.]
ACTUAL: train_141:
[ 0.]
PREDICTION: train_141:
[ 0.]
ACTUAL: train_142:
[ 0.]
PREDICTION: train_142:
[ 0.]
ACTUAL: train_143:
[ 0.]
PREDICTION: train_143:
[ 0.]
ACTUAL: train_144:
[ 0.]
PREDICTION: train_144:
[ 0.]
ACTUAL: train_145:
[ 0.]
PREDICTION: train_145:
[ 0.]
ACTUAL: train_146:
[ 0.]
PREDICTION: train_146:
[ 0.]
ACTUAL: train_147:
[ 0.]
PREDICTION: train_147:
[ 0.]
ACTUAL: train_148:
[ 0.]
PREDICTION: train_148:
[ 0.]
ACTUAL: train_149:
[ 0.]
PREDICTION: train_149:
[ 0.]
ACTUAL: train_150:
[ 0.]
PREDICTION: train_150:
[ 0.]
ACTUAL: train_151:
[ 0.]
PREDICTION: train_151:
[ 0.]
ACTUAL: train_152:
[ 1.]
PREDICTION: train_152:
[ 0.]
ACTUAL: train_153:
[ 0.]
PREDICTION: train_153:
[ 0.]
ACTUAL: train_154:
[ 0.]
PREDICTION: train_154:
[ 0.]
ACTUAL: train_155:
[ 0.]
PREDICTION: train_155:
[ 0.]
ACTUAL: train_156:
[ 0.]
PREDICTION: train_156:
[ 0.]
ACTUAL: train_157:
[ 0.]
PREDICTION: train_157:
[ 0.]
ACTUAL: train_158:
[ 0.]
PREDICTION: train_158:
[ 0.]
ACTUAL: train_159:
[ 0.]
PREDICTION: train_159:
[ 0.]
ACTUAL: train_160:
[ 0.]
PREDICTION: train_160:
[ 0.]
ACTUAL: train_161:
[ 0.]
PREDICTION: train_161:
[ 0.]
ACTUAL: train_162:
[ 0.]
PREDICTION: train_162:
[ 0.]
ACTUAL: train_163:
[ 0.]
PREDICTION: train_163:
[ 0.]
ACTUAL: train_164:
[ 0.]
PREDICTION: train_164:
[ 0.]
ACTUAL: train_165:
[ 0.]
PREDICTION: train_165:
[ 0.]
ACTUAL: train_166:
[ 0.]
PREDICTION: train_166:
[ 0.]
ACTUAL: train_167:
[ 0.]
PREDICTION: train_167:
[ 0.]
ACTUAL: train_168:
[ 0.]
PREDICTION: train_168:
[ 0.]
ACTUAL: train_169:
[ 0.]
PREDICTION: train_169:
[ 0.]
ACTUAL: train_170:
[ 0.]
PREDICTION: train_170:
[ 0.]
ACTUAL: train_171:
[ 0.]
PREDICTION: train_171:
[ 0.]
ACTUAL: train_172:
[ 0.]
PREDICTION: train_172:
[ 0.]
ACTUAL: train_173:
[ 0.]
PREDICTION: train_173:
[ 0.]
ACTUAL: train_174:
[ 0.]
PREDICTION: train_174:
[ 0.]
ACTUAL: train_175:
[ 0.]
PREDICTION: train_175:
[ 0.]
ACTUAL: train_176:
[ 0.]
PREDICTION: train_176:
[ 0.]
ACTUAL: train_177:
[ 0.]
PREDICTION: train_177:
[ 0.]
ACTUAL: train_178:
[ 0.]
PREDICTION: train_178:
[ 0.]
ACTUAL: train_179:
[ 0.]
PREDICTION: train_179:
[ 0.]
ACTUAL: train_180:
[ 1.]
PREDICTION: train_180:
[ 0.]
ACTUAL: train_181:
[ 0.]
PREDICTION: train_181:
[ 0.]
ACTUAL: train_182:
[ 0.]
PREDICTION: train_182:
[ 0.]
ACTUAL: train_183:
[ 0.]
PREDICTION: train_183:
[ 0.]
ACTUAL: train_184:
[ 0.]
PREDICTION: train_184:
[ 0.]
ACTUAL: train_185:
[ 0.]
PREDICTION: train_185:
[ 0.]
ACTUAL: train_186:
[ 0.]
PREDICTION: train_186:
[ 0.]
ACTUAL: train_187:
[ 0.]
PREDICTION: train_187:
[ 0.]
ACTUAL: train_188:
[ 0.]
PREDICTION: train_188:
[ 0.]
ACTUAL: train_189:
[ 0.]
PREDICTION: train_189:
[ 0.]
ACTUAL: train_190:
[ 0.]
PREDICTION: train_190:
[ 0.]
ACTUAL: train_191:
[ 0.]
PREDICTION: train_191:
[ 0.]
ACTUAL: train_192:
[ 0.]
PREDICTION: train_192:
[ 0.]
ACTUAL: train_193:
[ 0.]
PREDICTION: train_193:
[ 0.]
ACTUAL: train_194:
[ 0.]
PREDICTION: train_194:
[ 0.]
ACTUAL: train_195:
[ 0.]
PREDICTION: train_195:
[ 0.]
ACTUAL: train_196:
[ 1.]
PREDICTION: train_196:
[ 0.]
ACTUAL: train_197:
[ 0.]
PREDICTION: train_197:
[ 0.]
ACTUAL: train_198:
[ 0.]
PREDICTION: train_198:
[ 0.]
ACTUAL: train_199:
[ 0.]
PREDICTION: train_199:
[ 0.]

Best answer

Your dataset is highly unbalanced/skewed: roughly 94% of the samples have label 0 and only 6% have label 1. The neural network simply learns that it can score high accuracy by predicting 0 for everything.
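A quick way to confirm that is to count the labels directly. A minimal sketch, assuming labels is a NumPy array of 0/1 values as in your code:

import numpy as np

# Count how many samples carry each label
values, counts = np.unique(labels, return_counts=True)
print(dict(zip(values.tolist(), counts.tolist())))   # e.g. {0.0: 188, 1.0: 12}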

You can avoid this either by rebalancing the dataset so that it contains 50% of label 1 and 50% of label 0, or by using the class_weight argument of the fit function:


  class_weight: dictionary mapping classes to a weight value, used for scaling the loss function (during training only). (source)


In your case I would use

fit(..., class_weight = {0:1, 1:15.5})


since there are 15.5 times more samples of class 0 than of class 1. These numbers simply mean that the loss is multiplied by 1 whenever you misclassify a 0, and by 15.5 whenever you misclassify a 1. ... more info.
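If you prefer not to hard-code that ratio, it can be derived from the label counts. A minimal sketch, assuming labels is a 1-D NumPy array of 0/1 values:

import numpy as np

# counts[0] = number of 0-labelled samples, counts[1] = number of 1-labelled samples
counts = np.bincount(labels.astype(int).ravel())
class_weight = {0: 1.0, 1: float(counts[0]) / counts[1]}   # ~15.5 for a 94/6 split

model.fit(data, labels, epochs=20, batch_size=100,
          class_weight=class_weight, callbacks=callbacks_list, verbose=1)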

Also, I would not rely on the accuracy metric to evaluate the results in your case; look at the F1 score instead, which is much better suited to this kind of dataset. (here)
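For example, with scikit-learn, and assuming the sigmoid outputs are thresholded at 0.5 (a choice you may want to tune):

from sklearn.metrics import f1_score

# CNN_output holds sigmoid probabilities; threshold them into hard 0/1 predictions
predicted = (CNN_output > 0.5).astype(int).ravel()
print("F1 score:", f1_score(labels, predicted))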

I hope this helps.

Regarding machine-learning - accuracy improves but then stays the same for many epochs, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/44248861/
