How to implement triplet loss in my project?

I'm doing a project on speaker recognition. I have a data set of audio files, and from each file I extract a sound feature (a 1x13 array) as the input; as the output I assign an integer to each speaker (say speaker 1 gets output = 1, speaker 2 gets output = 2, and so on).

It was suggested that I use triplet loss to achieve better accuracy, but I don't understand how to implement it in TensorFlow (there are no examples at all).

From what I have understood so far, I take two positive values and one negative value for each speaker (for example, two features from two audio files of the same speaker, and one feature from a synthetic audio file of another speaker, which I will create with WaveNet).

But what do I do with these features to get the triplet loss? In other words, how do I actually implement it in TensorFlow?

How the triplet loss works is explained in the steps below.

Training Data:
Like you said, you need a triplet for each training sample: Anchor, Positive, Negative.
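For your data that just means sampling two recordings of the same speaker (anchor and positive) and one recording of a different speaker (negative). A minimal sketch, assuming your features sit in a NumPy array of shape (N, 13) and labels is an array of speaker ids (both names are placeholders, and each speaker needs at least two recordings):

```python
import numpy as np

def sample_triplet(features, labels, rng=None):
    """Pick one (anchor, positive, negative) triplet from speaker-labelled features.

    features: array of shape (N, 13), one 1x13 feature vector per audio file
    labels:   array of shape (N,), integer speaker id for each row
    """
    rng = rng or np.random.default_rng()
    speakers = np.unique(labels)
    anchor_spk, negative_spk = rng.choice(speakers, size=2, replace=False)

    same = np.flatnonzero(labels == anchor_spk)    # all recordings of the anchor speaker
    diff = np.flatnonzero(labels == negative_spk)  # all recordings of the negative speaker

    anchor_idx, positive_idx = rng.choice(same, size=2, replace=False)
    negative_idx = rng.choice(diff)
    return features[anchor_idx], features[positive_idx], features[negative_idx]
```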

Architecture of the model:

The idea is to have three identical networks with the same neural-net architecture, and they should share weights.

[Figure: three identical embedding networks (anchor, positive, negative branches) sharing the same weights]
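In Keras the easiest way to get that weight sharing is to build one embedding model and call it on all three inputs. This is only a sketch; the layer sizes, the 64-dimensional embedding and the L2 normalisation are arbitrary choices on my part:

```python
import tensorflow as tf

def build_embedding_net(input_dim=13, embedding_dim=64):
    # One small network that maps a 1x13 feature vector to an embedding.
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(input_dim,)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(embedding_dim),
        # L2-normalise so distances between embeddings stay bounded.
        tf.keras.layers.Lambda(lambda x: tf.math.l2_normalize(x, axis=1)),
    ])

embedding_net = build_embedding_net()

anchor_in   = tf.keras.Input(shape=(13,), name="anchor")
positive_in = tf.keras.Input(shape=(13,), name="positive")
negative_in = tf.keras.Input(shape=(13,), name="negative")

# Calling the same model object three times is what shares the weights.
triplet_model = tf.keras.Model(
    inputs=[anchor_in, positive_in, negative_in],
    outputs=[embedding_net(anchor_in),
             embedding_net(positive_in),
             embedding_net(negative_in)],
)
```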

Model Learning:

The model not only learns to form clusters for the different classes, it also succeeds in projecting similar-looking data into the same neighbourhood. A classification architecture, by contrast, learns a decision boundary between pairs of classes but does not preserve the relationship between similar and dissimilar samples within a class.
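The loss that drives this behaviour is simply max(d(anchor, positive) - d(anchor, negative) + margin, 0): it pulls the anchor towards the positive and pushes it away from the negative until the gap is at least the margin. A sketch using squared Euclidean distances (the margin value here is arbitrary):

```python
import tensorflow as tf

def triplet_loss(anchor_emb, positive_emb, negative_emb, margin=0.2):
    """Triplet loss over batches of embeddings of shape (batch, embedding_dim)."""
    pos_dist = tf.reduce_sum(tf.square(anchor_emb - positive_emb), axis=1)
    neg_dist = tf.reduce_sum(tf.square(anchor_emb - negative_emb), axis=1)
    # Loss is zero once the negative is farther away than the positive by `margin`.
    return tf.reduce_mean(tf.maximum(pos_dist - neg_dist + margin, 0.0))
```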

You can follow this link and this link, which have implementations of the triplet loss in TensorFlow; you can use the same architecture for your data with some changes.
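Putting the pieces above together, one way (again just a sketch, reusing the hypothetical triplet_model and triplet_loss from the earlier snippets) is a custom training step:

```python
import tensorflow as tf

optimizer = tf.keras.optimizers.Adam(1e-3)

@tf.function
def train_step(anchor_batch, positive_batch, negative_batch):
    # Embed the three batches with the shared network and minimise the triplet loss.
    with tf.GradientTape() as tape:
        a_emb, p_emb, n_emb = triplet_model(
            [anchor_batch, positive_batch, negative_batch], training=True)
        loss = triplet_loss(a_emb, p_emb, n_emb)
    grads = tape.gradient(loss, triplet_model.trainable_variables)
    optimizer.apply_gradients(zip(grads, triplet_model.trainable_variables))
    return loss
```

If you would rather not build the triplets yourself, TensorFlow Addons also provides tfa.losses.TripletSemiHardLoss, which mines triplets inside each batch from plain (feature, label) pairs.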

Hope this helps you, Happy Learning!
