
How does ReLU relate to Sigmoid?

Given a network: input -> hidden layer with ReLU activation -> output layer with Sigmoid activation.

I wonder how that can output values between 0 and 0.5. ReLU returns non-negative values by definition, and the sigmoid is defined by 1/(1+e^-x), so the smallest value it could return would be 1/(1+e^-0) = 0.5.

Is the sigmoid activation shifted or something? I do get outputs between 0 and 1 (as the documentation says), but why?

Edit

Here is a code snippet that prints the function output to the console. It shows what I expected: the output starts at 0.5 and increases for x > 0.

    import numpy as np
    import tensorflow as tf

    # inputs from -2.0 to 1.9 in steps of 0.1
    x = np.array([i / 10 for i in range(-20, 20)])

    # apply ReLU first, then sigmoid, element-wise
    y = tf.keras.activations.sigmoid(
        tf.keras.activations.relu(x)
    )
    for a, b in zip(x, y):
        print(a, "\t", float(b))

Output:

-0.2     0.5
-0.1     0.5
0.0      0.5
0.1      0.52497918747894
0.2      0.549833997312478
0.3      0.574442516811659
0.4      0.5986876601124521

But if I do

    x = Dense(64, activation="relu")(input)
    output = Dense(1, activation="sigmoid")(x)

I receive values between 0 and 1.
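For reference, a minimal runnable version of that model could look like this (the input dimension and the random test data are arbitrary assumptions for illustration). Even with untrained, randomly initialized weights, it already produces predictions on both sides of 0.5:

    import numpy as np
    import tensorflow as tf
    from tensorflow.keras.layers import Dense, Input

    inp = Input(shape=(8,))                   # 8 input features (arbitrary choice)
    x = Dense(64, activation="relu")(inp)     # hidden layer: outputs are >= 0
    out = Dense(1, activation="sigmoid")(x)   # output layer: its weights can be negative
    model = tf.keras.Model(inp, out)

    preds = model.predict(np.random.randn(5, 8))
    print(preds)  # values anywhere in (0, 1), not only [0.5, 1)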

    x = Dense(64, activation="relu")(input)
    output = Dense(1, activation="sigmoid")(x)

Here, the hidden layer x is wrapped with the ReLU activation, so its output lies in the range 0 to ∞. But when it is passed on to the layer output, it is first multiplied by that layer's weights (which can be negative) and a bias is added, so the pre-activation that reaches the sigmoid can be both negative and positive (a range from -∞ to ∞). Because the sigmoid receives input from -∞ to ∞, it produces values from 0 to 1.

In conclusion, wrapping a sigmoid directly around a ReLU will return a value in the range 0.5 to 1. Wrapping a sigmoid on top of a weighted ReLU output will return a value in the full range 0 to 1, because the weighted sum feeding the sigmoid ranges from -∞ to ∞.
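To make the difference concrete, here is a minimal NumPy sketch (the weight and bias values below are made up for illustration, not learned parameters) contrasting sigmoid applied directly to a ReLU output with sigmoid applied to a weighted ReLU output:

    import numpy as np

    def relu(z):
        return np.maximum(0.0, z)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    h = relu(np.array([0.0, 0.5, 1.0, 2.0]))  # hidden activations, all >= 0

    # sigmoid directly on the ReLU output: never below 0.5
    print(sigmoid(h))          # [0.5, 0.622..., 0.731..., 0.880...]

    # a Dense(1) output layer computes w*h + b before the sigmoid; with a
    # negative weight the pre-activation goes negative, so sigmoid drops below 0.5
    w, b = -1.5, 0.2           # hypothetical weight and bias
    print(sigmoid(w * h + b))  # [0.549..., 0.365..., 0.214..., 0.057...]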

It is wrong to apply a sigmoid directly on top of the ReLU function. In the implementation you are seeing, the output layer most likely has its own weights, which get multiplied with the ReLU output, so theoretically the sigmoid receives input between -inf and inf.
