"Could not interpret activation function identifier: 256" error in Keras

I'm trying to run the following code, but I get an error. Did I miss something in the code?

from keras.layers.core import Dense, Activation, Dropout
from keras.layers.recurrent import LSTM
from keras.models import Sequential
from keras.callbacks import ModelCheckpoint
from keras.models import load_model
from keras.optimizers import Adam
from keras.regularizers import l2
from keras.activations import relu, elu, linear, sigmoid

def build_fc_model(layers):
    fc_model = Sequential()
    for i in range(len(layers)-1):
        fc_model.add( Dense(layers[i],layers[i+1]) )#, W_regularizer=l2(0.1)) )
        fc_model.add( Dropout(0.5) )
        if i < (len(layers) - 2):
            fc_model.add( Activation('relu') )
    fc_model.summary()
    return fc_model
fc_model_1 = build_fc_model([2, 256, 512, 1024, 1])

and here is the error message:

TypeError: Could not interpret activation function identifier: 256

This error indicates that you have defined an activation function that is not interpretable. In your definition of a Dense layer you have passed two arguments, layers[i] and layers[i+1].
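A minimal sketch reproduces the same failure as soon as the layer is constructed (assuming a recent Keras, where activations.get raises a TypeError for an unrecognized identifier):

from keras.layers import Dense

# The second positional argument of Dense is `activation`, so an
# integer there cannot be interpreted as an activation function:
Dense(256, 512)
# TypeError: Could not interpret activation function identifier: 512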

Based on the docs for the Dense layer, the first argument is the number of units (neurons) and the second is the activation function. So Keras treats layers[i+1] as an activation function, which the Dense layer cannot recognize.
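For contrast, here is a minimal sketch of correct Dense usage, with the unit count first and the activation passed by name or as a callable:

from keras.layers import Dense

hidden = Dense(256, activation='relu')  # 256 units, ReLU activation
output = Dense(1)                       # defaults to a linear activation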

Inference: you do not need to pass the next layer's neuron count to your Dense layer, so remove the layers[i+1] argument. Each Dense layer then only needs its own unit count, which means the loop should walk over layers[1:] once an input layer supplies layers[0].

Furthermore, you have to define an input layer for your model and pass the input shape to it.
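One common alternative (my addition, not part of the original answer) is to skip the separate InputLayer and give the first Dense layer an input_shape instead:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
# The first layer declares the input shape; 2 matches layers[0] below.
model.add(Dense(256, activation='relu', input_shape=(2,)))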

Therefore, the modified code should look like this:

from keras.layers.core import Dense, Activation, Dropout
from keras.layers.recurrent import LSTM
from keras.models import Sequential
from keras.callbacks import ModelCheckpoint
from keras.models import load_model
from keras.optimizers import Adam
from keras.regularizers import l2
from keras.activations import relu, elu, linear, sigmoid
from keras.layers import InputLayer  # import the input layer

def build_fc_model(layers):
    fc_model = Sequential()
    # Add an input layer whose shape matches the first entry of `layers`.
    fc_model.add(InputLayer(input_shape=(layers[0],)))
    for i in range(1, len(layers)):
        # Pass only the unit count; a second positional argument would be
        # interpreted as the activation, which caused the original error.
        fc_model.add(Dense(layers[i]))
        if i < len(layers) - 1:
            # Activation and dropout between hidden layers only,
            # so the single output unit is never rectified or dropped.
            fc_model.add(Activation('relu'))
            fc_model.add(Dropout(0.5))
    fc_model.summary()
    return fc_model

fc_model_1 = build_fc_model([2, 256, 512, 1024, 1])
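As a quick smoke test (with made-up random data, purely for illustration), the returned model can be compiled and fit on inputs whose width matches layers[0]:

import numpy as np

fc_model_1.compile(optimizer=Adam(), loss='mse')
x = np.random.rand(32, 2)  # 32 samples, 2 features (matches layers[0])
y = np.random.rand(32, 1)  # 32 targets, 1 output (matches layers[-1])
fc_model_1.fit(x, y, epochs=1, batch_size=8)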
