I am writing a basic version of a custom face re-identification system (using the MNIST data as a building block and TensorFlow's semi-hard triplet loss function), but the loss/acc does not change at all after several epochs. Code below:
import numpy as np
import tensorflow as tf

def kerasTriplet( label, pred ):
    print('-------------------------')
    print( label )
    print( pred )
    def lossFunc( y_true, y_pred ):
        return tf.contrib.losses.metric_learning.triplet_semihard_loss( label, pred, 0.6 )
        #return nonTFTripletLoss.batch_hard_triplet_loss( label, pred, 0.6 )
    return lossFunc

def gen( trg, tgt ):
    batch_sz = BATCH_SZ
    start = np.random.randint( 0, len( trg ) - BATCH_SZ )
    return trg[ start: start+batch_sz ], tgt[ start: start+batch_sz ]
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
n_train, height, width = x_train.shape
x_train = x_train.reshape(n_train, height, width, 1).astype('float32')
x_train = x_train[ :(int(len(x_train)/BATCH_SZ))*BATCH_SZ ]
x_train /= 255
num_classes = 10
y_train_orig = y_train
y_train_orig = y_train_orig[ :(int(len(x_train)/BATCH_SZ))*BATCH_SZ ]
y_train = tf.keras.utils.to_categorical(y_train, num_classes)
input_shape = (28, 28, 1)
sequence_input = tf.keras.layers.Input(shape=input_shape , dtype='float32')
batch_inp, batch_tgt = gen( x_train, y_train_orig )
x = tf.keras.layers.Conv2D( 512, (3,3), activation='relu')( batch_inp )
x = tf.keras.layers.Conv2D( 256, (3,3), activation='relu')( x )
x = tf.keras.layers.Conv2D( 128, (3,3), activation='relu')( x )
x = tf.keras.layers.Flatten()(x)
img_embedding = tf.keras.layers.Dense( 128 )(x)
## since triplet loss requires embedding to be l2 normalized
l2_embed = tf.keras.backend.l2_normalize( img_embedding, -1 )
model = tf.keras.models.Model( sequence_input , l2_embed )
model.compile( loss=kerasTriplet( batch_tgt, img_embedding ) , optimizer='adam', metrics=['acc'] )
model.fit(x_train, y_train_orig, batch_size=BATCH_SZ, epochs=10 , verbose=1)
I would expect the loss and acc to move at least a little (since I'm only running 10 epochs), but they stay exactly the same. I'm sure it's something in my code, I just can't put my finger on it.
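For reference, here is a minimal standalone sketch (TF 1.x, where tf.contrib is still available) of the API being wrapped above: the loss expects labels as a 1-D tensor of integer class ids and embeddings as a 2-D (batch, dim) tensor of row-wise l2-normalized vectors. The batch values below are made up; margin=0.6 mirrors the value used in the question.

import tensorflow as tf  # TF 1.x, where tf.contrib is still available

# labels: 1-D int tensor of class ids, one per sample in the batch (toy values)
labels = tf.constant([0, 1, 0, 1], dtype=tf.int32)

# embeddings: 2-D float tensor (batch, dim), l2-normalized per row as the loss expects
embeddings = tf.nn.l2_normalize(tf.random_normal([4, 128]), axis=1)

# margin=0.6 matches the value used in the question's wrapper
loss = tf.contrib.losses.metric_learning.triplet_semihard_loss(
    labels=labels, embeddings=embeddings, margin=0.6)

with tf.Session() as sess:
    print(sess.run(loss))  # a scalar semi-hard triplet loss for this toy batch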
Best answer
You are computing the l2 embedding incorrectly. Try this:
l2_embed = tf.keras.backend.l2_normalize( img_embedding, axis=1 )
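To illustrate the fix, here is a small sketch with made-up numbers of what normalizing along axis=1 does to a (batch, embedding_dim) tensor: each row, i.e. each sample's embedding, ends up with unit length, which is what the triplet loss expects.

import tensorflow as tf

# Toy (batch, embedding_dim) matrix; values are arbitrary.
emb = tf.constant([[3.0, 4.0],
                   [0.0, 5.0]])

# axis=1 normalizes each row (each sample's embedding) to unit length.
l2_embed = tf.keras.backend.l2_normalize(emb, axis=1)

with tf.Session() as sess:        # TF 1.x style, matching the question's tf.contrib usage
    print(sess.run(l2_embed))     # [[0.6 0.8], [0. 1.]] -- each row now has norm 1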
A similar question about "python - tensorflow triplet_semihard_loss not changing after multiple epochs" can be found on Stack Overflow: https://stackoverflow.com/questions/58320998/