Error using 'selu' activation function with Keras

I'm using Keras with the TensorFlow backend. When I try to use the 'selu' activation function with:

model.add(Dense(32, input_shape=(input_length - 1,)))
model.add(Activation('selu'))

The error I get is:

ValueError: Unknown activation function:selu

Is there any solution to this?

selu is not in your copy of Keras's activations.py (most likely because it was only added on Jun 14, 2017, 22 days ago). You can either add the missing code to activations.py or define your own selu activation in your script.

Example code

from keras.activations import elu

def selu(x):
    """Scaled Exponential Linear Unit. (Klambauer et al., 2017)
    # Arguments
        x: A tensor or variable to compute the activation function for.
    # References
        - [Self-Normalizing Neural Networks](https://arxiv.org/abs/1706.02515)
    """
    alpha = 1.6732632423543772848170429916717
    scale = 1.0507009873554804934193349852946
    return scale * elu(x, alpha)

model.add(Dense(32, input_shape=(input_length - 1,), activation=selu))
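
If you want the original Activation('selu') call from the question to work by its string name, you could also register the custom function as a custom object. This is a minimal sketch assuming Keras 2.x, where get_custom_objects lives in keras.utils.generic_utils:

from keras.utils.generic_utils import get_custom_objects

# Register the custom selu function under the name 'selu' so that
# Activation('selu') and Dense(..., activation='selu') can resolve it
get_custom_objects().update({'selu': selu})

model.add(Dense(32, input_shape=(input_length - 1,)))
model.add(Activation('selu'))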

NOTE:

With TensorFlow 2.0, Keras is included and selu ships with it. You can get the selu activation with:

from tensorflow.keras.activations import selu
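
For example, a minimal sketch with TensorFlow 2.x (the input shape here is just a placeholder, not taken from the question):

import tensorflow as tf
from tensorflow.keras.activations import selu

model = tf.keras.Sequential()
# Pass the built-in function directly, or equivalently the string 'selu'
model.add(tf.keras.layers.Dense(32, input_shape=(10,), activation=selu))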
