
How to reduce overfitting while using VGG16 for regression?

I'm using transfer learning from VGG16 for a regression task, but it overfits very quickly. I want to reduce the number of parameters in the regression head (the last layer). How can I do that?

Assuming you're not retraining the VGG16 base layers and are just adding an output layer, there's no real way to reduce the number of trainable parameters further. You could, however, try to fight overfitting by adding a Dropout layer between the base and the output, or something similar.
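A minimal sketch of that setup in Keras, assuming a frozen VGG16 base with global average pooling and a single-unit regression head (the input shape and dropout rate are illustrative; `weights=None` is used here so the snippet runs without downloading anything, but in practice you would pass `weights="imagenet"`):

```python
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models

# Pretrained feature extractor; use weights="imagenet" in practice.
base = VGG16(weights=None, include_top=False,
             input_shape=(224, 224, 3), pooling="avg")
base.trainable = False  # freeze the convolutional layers

model = models.Sequential([
    base,
    layers.Dropout(0.5),  # regularize the head to fight overfitting
    layers.Dense(1),      # single linear unit for regression
])
model.compile(optimizer="adam", loss="mse")
```

The `Dropout(0.5)` rate is a common starting point; you can tune it, or additionally reduce overfitting with data augmentation and early stopping.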

Bear in mind, though, that VGG16's weights were trained with a loss suited to classification (categorical cross-entropy), so the pretrained features mostly encode what an object *is*, not how big it is. My impression is that your model will end up guessing the length from the object's identity. That may even be the idea, but a large car that looks a bit like an insect might be assigned a smaller size than a small car that looks like a building.
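If the classification-tuned features do turn out to be a poor fit for a size target, one hedge is to fine-tune only the last convolutional block at a low learning rate so the features can adapt without being destroyed. A sketch under the same assumptions as above (Keras, `weights=None` for an offline-runnable example, `block5` layer names as in the stock Keras VGG16):

```python
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models, optimizers

base = VGG16(weights=None, include_top=False,
             input_shape=(224, 224, 3), pooling="avg")
# Unfreeze only the last conv block; keep earlier blocks frozen.
for layer in base.layers:
    layer.trainable = layer.name.startswith("block5")

model = models.Sequential([base, layers.Dropout(0.5), layers.Dense(1)])
# A small learning rate limits how far the pretrained weights drift.
model.compile(optimizer=optimizers.Adam(learning_rate=1e-5), loss="mse")
```

With real `imagenet` weights, it is usually best to train the head first with the whole base frozen, then unfreeze `block5` for a second, slower training phase.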

