
Keras custom sigmoid with added bias

I need to add a bias to my custom sigmoid function and use it as the last activation layer in my NN. But my recall goes strictly to 1, which suggests something is wrong with the formula.

Custom sigmoid function

[image: custom sigmoid formula]

Recall goes strictly into 1

[image: recall metric converging to 1 during training]

from keras import backend as K

def custom_sigmoid(x):
    return 1 / (1 + K.exp(-20 * x - 0.5))

At the same time, a custom sigmoid without the multiplier and bias works fine (note that 1 / (1 + exp(x)) equals sigmoid(-x), i.e. a mirrored sigmoid):

def custom_sigmoid(x):
    return 1 / (1 + K.exp(x))

as can be seen here

[image: recall metric with the unmodified sigmoid]

self.model_.add(keras.layers.Dense(1, activation=custom_sigmoid))
self.model_.compile(loss='binary_crossentropy', optimizer=optimizer, metrics=[precision_threshold(0.7), recall_threshold(0.7)])

How to modify the custom sigmoid function to make it work?

Your formula has no apparent problem, but -20*x - 0.5 is likely to cause arithmetic overflow in exp, so you should check the range of x. For example, if x is in [-100, 100], the original sigmoid won't overflow while your customized sigmoid will. You can run a simple experiment in numpy:

import numpy as np

def original_sigmoid(x):
    return 1 / (1 + np.exp(x))


def custom_sigmoid(x):
    return 1 / (1 + np.exp(-20 * x - 0.5))


x = np.linspace(-100, 100)
print(original_sigmoid(x)) 
print(custom_sigmoid(x)) # this will output a warning: "RuntimeWarning: overflow encountered in exp"
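One way to avoid the overflow (a sketch of my own, not from the original post) is to clip the argument of exp before exponentiating. float64 exp overflows only beyond roughly ±709, and for any clipped input the result is already indistinguishable from 0 or 1 in float64, so the output is unchanged:

```python
import numpy as np

def stable_custom_sigmoid(x):
    # Clip the exponent so np.exp never overflows (float64 overflows
    # around exp(709)); at |z| = 500 the sigmoid is already saturated
    # to 0 or 1 within float64 precision, so clipping loses nothing.
    z = np.clip(-20 * x - 0.5, -500, 500)
    return 1 / (1 + np.exp(z))

x = np.linspace(-100, 100)
y = stable_custom_sigmoid(x)
print(np.isfinite(y).all())  # all values finite, no RuntimeWarning
```

In the Keras model itself, the same effect can be had via the identity 1 / (1 + exp(-z)) = sigmoid(z): define the activation as `K.sigmoid(20 * x + 0.5)`, since the backend's built-in sigmoid is typically implemented in a numerically stable way.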

