I am training a model in Keras with the TensorFlow-GPU backend.
The task is to detect buildings in satellite images.
The loss is decreasing (which is good), but it is going into negative values, and the accuracy is dropping as well. The good part is that the model's predictions are improving. What worries me is why the loss is negative, and why the model keeps getting better while the accuracy goes down.
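A quick way to inspect the raw value range of the targets (a minimal sketch; sample_target is the mask array passed to fit() further down, the loading code is not shown here):

import numpy as np

# Check the dtype and the raw value range of the masks before training.
print(sample_target.dtype, sample_target.min(), sample_target.max())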
from tensorflow.keras.layers import Conv2D
from tensorflow.keras.layers import BatchNormalization
from tensorflow.keras.layers import Activation
from tensorflow.keras.layers import MaxPool2D as MaxPooling2D
from tensorflow.keras.layers import UpSampling2D
from tensorflow.keras.layers import concatenate
from tensorflow.keras.layers import Input
from tensorflow.keras import Model
from tensorflow.keras.optimizers import RMSprop
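# dice_coeff is referenced in model.compile() below but its definition was not
# included in the question; the following is only a sketch of a typical soft
# Dice coefficient, the asker's actual implementation may differ.
from tensorflow.keras import backend as K

def dice_coeff(y_true, y_pred, smooth=1.0):
    # 2 * |A intersect B| / (|A| + |B|), computed on flattened tensors
    y_true_f = K.flatten(y_true)
    y_pred_f = K.flatten(y_pred)
    intersection = K.sum(y_true_f * y_pred_f)
    return (2.0 * intersection + smooth) / (K.sum(y_true_f) + K.sum(y_pred_f) + smooth)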
# LAYERS
inputs = Input(shape=(300, 300, 3))
# 300
down0 = Conv2D(32, (3, 3), padding='same')(inputs)
down0 = BatchNormalization()(down0)
down0 = Activation('relu')(down0)
down0 = Conv2D(32, (3, 3), padding='same')(down0)
down0 = BatchNormalization()(down0)
down0 = Activation('relu')(down0)
down0_pool = MaxPooling2D((2, 2), strides=(2, 2))(down0)
# 150
down1 = Conv2D(64, (3, 3), padding='same')(down0_pool)
down1 = BatchNormalization()(down1)
down1 = Activation('relu')(down1)
down1 = Conv2D(64, (3, 3), padding='same')(down1)
down1 = BatchNormalization()(down1)
down1 = Activation('relu')(down1)
down1_pool = MaxPooling2D((2, 2), strides=(2, 2))(down1)
# 75
center = Conv2D(1024, (3, 3), padding='same')(down1_pool)
center = BatchNormalization()(center)
center = Activation('relu')(center)
center = Conv2D(1024, (3, 3), padding='same')(center)
center = BatchNormalization()(center)
center = Activation('relu')(center)
# center
up1 = UpSampling2D((2, 2))(center)
up1 = concatenate([down1, up1], axis=3)
up1 = Conv2D(64, (3, 3), padding='same')(up1)
up1 = BatchNormalization()(up1)
up1 = Activation('relu')(up1)
up1 = Conv2D(64, (3, 3), padding='same')(up1)
up1 = BatchNormalization()(up1)
up1 = Activation('relu')(up1)
up1 = Conv2D(64, (3, 3), padding='same')(up1)
up1 = BatchNormalization()(up1)
up1 = Activation('relu')(up1)
# 150
up0 = UpSampling2D((2, 2))(up1)
up0 = concatenate([down0, up0], axis=3)
up0 = Conv2D(32, (3, 3), padding='same')(up0)
up0 = BatchNormalization()(up0)
up0 = Activation('relu')(up0)
up0 = Conv2D(32, (3, 3), padding='same')(up0)
up0 = BatchNormalization()(up0)
up0 = Activation('relu')(up0)
up0 = Conv2D(32, (3, 3), padding='same')(up0)
up0 = BatchNormalization()(up0)
up0 = Activation('relu')(up0)
# 300x300 (back to input resolution)
classify = Conv2D(1, (1, 1), activation='sigmoid')(up0)
# 300x300x1
model = Model(inputs=inputs, outputs=classify)
model.compile(optimizer=RMSprop(lr=0.0001),
              loss='binary_crossentropy',
              metrics=[dice_coeff, 'accuracy'])
history = model.fit(sample_input, sample_target, batch_size=4, epochs=10)
OUTPUT:
Epoch 6/10
500/500 [==============================] - 76s 153ms/step - loss: -293.6920 - dice_coeff: 1.8607 - acc: 0.2653
Epoch 7/10
500/500 [==============================] - 75s 150ms/step - loss: -309.2504 - dice_coeff: 1.8730 - acc: 0.2618
Epoch 8/10
500/500 [==============================] - 75s 150ms/step - loss: -324.4123 - dice_coeff: 1.8810 - acc: 0.2659
Epoch 9/10
136/500 [=======>......................] - ETA: 55s - loss: -329.0757 - dice_coeff: 1.8940 - acc: 0.2757
[Image: predicted output]
[Image: target]
What is going wrong here? (dice_coeff is a custom loss.)
Best Answer
Your outputs are not normalized for binary classification. (Your data is probably not normalized either.)
If you load images, the values may range from 0 to 255, or even from 0 to 65535.
You should normalize y_train (divide it by y_train.max()) and use a 'sigmoid' activation at the end of the model.
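A minimal sketch of the suggested fix, assuming sample_input and sample_target are NumPy arrays holding the images and the building masks, as in the fit() call above (the loading code was not shown in the question):

import numpy as np

# Scale the inputs and the target masks into [0, 1]; binary_crossentropy
# and 'accuracy' only behave sensibly for targets in this range.
sample_input = sample_input.astype(np.float32) / sample_input.max()
sample_target = sample_target.astype(np.float32) / sample_target.max()

history = model.fit(sample_input, sample_target, batch_size=4, epochs=10)

The posted model already ends with a 'sigmoid' activation in its final Conv2D layer, so the remaining fix is normalizing the data.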
Regarding "tensorflow - Keras loss is negative and accuracy is going down, but the predictions are good?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/51658122/