
tensorflow triplet_semihard_loss doesn't change after multiple epochs

I am writing a basic version of training a custom face re-identification system (using MNIST data as building blocks and TensorFlow's predefined semi-hard triplet loss function), but the loss/accuracy shows absolutely no change after multiple epochs. Code below:


import numpy as np
import tensorflow as tf

def kerasTriplet( label, pred ):
        print('-------------------------')
        print( label )
        print( pred )
        def lossFunc( y_true, y_pred ):
                return tf.contrib.losses.metric_learning.triplet_semihard_loss( label, pred, 0.6 )
                #return nonTFTripletLoss.batch_hard_triplet_loss( label, pred, 0.6 )

        return lossFunc

def gen( trg, tgt ):
        batch_sz = BATCH_SZ
        start = np.random.randint( 0, len( trg ) - BATCH_SZ )
        return trg[ start: start+batch_sz] , tgt[ start: start+batch_sz ]

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
n_train, height, width = x_train.shape
x_train = x_train.reshape(n_train, height, width, 1).astype('float32')
x_train = x_train[ :(int(len(x_train)/BATCH_SZ))*BATCH_SZ ]
x_train /= 255
num_classes = 10
y_train_orig = y_train
y_train_orig = y_train_orig[ :(int(len(x_train)/BATCH_SZ))*BATCH_SZ ]
y_train = tf.keras.utils.to_categorical(y_train, num_classes)

input_shape = (28, 28, 1)

sequence_input = tf.keras.layers.Input(shape=input_shape , dtype='float32')
batch_inp, batch_tgt = gen( x_train, y_train_orig )


x = tf.keras.layers.Conv2D( 512, (3,3), activation='relu')( batch_inp )
x = tf.keras.layers.Conv2D( 256, (3,3), activation='relu')( x )
x = tf.keras.layers.Conv2D( 128, (3,3), activation='relu')( x )

x = tf.keras.layers.Flatten()(x)
img_embedding = tf.keras.layers.Dense( 128 )(x)

## since triplet loss requires embedding to be l2 normalized
l2_embed = tf.keras.backend.l2_normalize( img_embedding, -1 )

model = tf.keras.models.Model( sequence_input , l2_embed )

model.compile( loss=kerasTriplet( batch_tgt, img_embedding ) , optimizer='adam', metrics=['acc'] )

model.fit(x_train, y_train_orig, batch_size=BATCH_SZ,  epochs=10 , verbose=1)



I would expect the loss and accuracy to move, even if not by much (since I'm only running 10 epochs), but they stay exactly the same. I am sure it has something to do with my code; I just can't put my finger on it.

You are calculating l2_embed incorrectly. Try this:

l2_embed = tf.keras.backend.l2_normalize( img_embedding, axis = 1 )
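
For context, a minimal sketch of how that fix could be wired into the question's model (this is an illustrative assumption, not part of the answer above): it assumes TF 1.x, where tf.contrib is still available, reuses sequence_input, img_embedding, x_train, y_train_orig and BATCH_SZ from the question, and additionally rewrites the loss closure to use the y_true/y_pred tensors that Keras actually passes in, rather than the label/pred tensors captured at compile time.

def tripletLoss( margin=0.6 ):
        ## illustrative closure: use the labels and embeddings Keras feeds into the loss
        def lossFunc( y_true, y_pred ):
                labels = tf.reshape( tf.cast( y_true, tf.int32 ), [-1] )
                return tf.contrib.losses.metric_learning.triplet_semihard_loss( labels, y_pred, margin )
        return lossFunc

## l2-normalize along the feature axis (axis=1) of the (batch, 128) embedding
l2_embed = tf.keras.layers.Lambda( lambda t: tf.keras.backend.l2_normalize( t, axis=1 ) )( img_embedding )

model = tf.keras.models.Model( sequence_input, l2_embed )
model.compile( loss=tripletLoss( 0.6 ), optimizer='adam' )
model.fit( x_train, y_train_orig, batch_size=BATCH_SZ, epochs=10, verbose=1 )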
