
Problem with keras functional api and leaky relu

I'm trying to use leaky ReLU. I tried the method given in

Keras Functional API and activations

It doesn't work. I got the error:

TypeError: activation() missing 1 required positional argument: 'activation_type'

Also, should Activation be capitalized throughout or not?

I use it as:

def activation(x, activation_type):
    if activation_type == 'leaky_relu':
        return activations.relu(x, alpha=0.3)
    else:
        return activations.get(activation_type)(x)
...

input_data = layers.Input(shape=(3,))
...
hiddenOut = Dense(units=2)(input_data)
hiddenOut = activation(lambda hiddenOut: activation(hiddenOut, 'LeakyReLU'))(hiddenOut)
u_out = Dense(1, activation='linear', name='u')(hiddenOut)   
...

You're overcomplicating this. You can simply use the built-in LeakyReLU layer:

hiddenOut = keras.layers.LeakyReLU(alpha=0.3)(hiddenOut)
Or, if you want to keep the string-based dispatch, here is a full working version:

import keras

def my_activation(x, activation_type):
    # Leaky ReLU is ReLU with a non-zero slope (alpha) on the negative side
    if activation_type == 'LeakyReLU':
        return keras.activations.relu(x, alpha=0.3)
    else:
        # fall back to the standard lookup for built-in activations
        return keras.activations.get(activation_type)(x)

input_data = keras.layers.Input(shape=(3,))
hiddenOut = keras.layers.Dense(units=2)(input_data)
hiddenOut = keras.layers.Activation(lambda x: my_activation(x, 'LeakyReLU'))(hiddenOut)

Why

  • Activation is a layer, while activations is the module of available activation functions.
  • To emulate Leaky ReLU we change the slope of the negative part. The slope is 0 for plain ReLU, and it can be changed via the alpha parameter.
  • We write a wrapper function, my_activation, that returns a Leaky ReLU with a negative slope of 0.3 when the parameter is 'LeakyReLU', and the named standard activation otherwise.
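The slope change the bullets describe can be checked with a quick NumPy sketch, independent of Keras (the 0.3 slope mirrors the answer's alpha; the function name is my own):

```python
import numpy as np

def leaky_relu(x, alpha=0.3):
    """Leaky ReLU: identity for x > 0, slope `alpha` for x <= 0."""
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * x)

# negative inputs are scaled by alpha instead of being zeroed
print(leaky_relu([-2.0, -1.0, 0.0, 1.0, 2.0]))
# → [-0.6 -0.3  0.   1.   2. ]
```

This is exactly what keras.activations.relu(x, alpha=0.3) computes for the negative part.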

Example:

input_data = keras.layers.Input(shape=(3,))
a = keras.layers.Dense(units=2)(input_data)
a = keras.layers.Activation(lambda x: my_activation(x, 'LeakyReLU'))(a)
a = keras.layers.Activation(lambda x: my_activation(x, 'sigmoid'))(a)
a = keras.layers.Activation(lambda x: my_activation(x, 'tanh'))(a)
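The dispatch pattern used above can be sketched framework-free to make the idea concrete (a minimal illustration; get_activation and its lookup table are my own names, not Keras APIs):

```python
import math

def get_activation(name):
    """Map an activation name to a plain-Python function (illustrative only)."""
    table = {
        'LeakyReLU': lambda x: x if x > 0 else 0.3 * x,  # alpha = 0.3, as in the answer
        'sigmoid':   lambda x: 1.0 / (1.0 + math.exp(-x)),
        'tanh':      math.tanh,
    }
    return table[name]

for name in ('LeakyReLU', 'sigmoid', 'tanh'):
    print(name, get_activation(name)(-1.0))
```

The wrapper in the answer does the same thing, except that the 'else' branch delegates the lookup to keras.activations.get.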
