
Grid search hyperparameter tuning for LSTM

I get various errors when trying to implement grid search for my LSTM model. I am attempting something similar to the following.

# imports assumed by this snippet
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, RepeatVector, TimeDistributed, Dense
from tensorflow.keras.optimizers import SGD
from tensorflow.keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import GridSearchCV

# train the model
def build_model(train, n_back=1, n_predict=1, epochs=10, batch_size=10, neurons=100, activation='relu', optimizer='adam'):
    # define an encoder-decoder LSTM
    model = Sequential()
    model.add(LSTM(neurons, activation=activation, input_shape=(n_timesteps, n_features)))
    model.add(RepeatVector(n_outputs))
    model.add(LSTM(neurons, activation=activation, return_sequences=True))
    model.add(TimeDistributed(Dense(neurons)))
    model.add(TimeDistributed(Dense(1)))
    model.compile(loss='mse', optimizer=optimizer)
    # fit network
    model.fit(train_x, train_y, epochs=epochs, batch_size=batch_size, verbose=1)
    return model
#### Epochs and Batch Size
batch_size = [10, 20]
epochs = [1, 10]

# Optimizer: Select!
#### Optimizer
optimizer = ['Adam', 'Adamax'] #'SGD', 'RMSprop', 'Adagrad', 'Adadelta', 'Adam', 'Adamax', 'Nadam'

#### Learning Rate and Momentum
learn_rate = [0.01, 0.2] #0.001, 0.01, 0.1, 0.2, 0.3
momentum = [0.0, 0.2, 0.9] #0.0, 0.2, 0.4, 0.6, 0.8, 0.9
lr_optimizer = SGD(lr=learn_rate, momentum=momentum)

#### Tune Network Weight Initialization
init_mode = ['lecun_uniform','zero', 'he_normal'] #'uniform', 'lecun_uniform', 'normal', 'zero', 'glorot_normal', 'glorot_uniform', 'he_normal', 'he_uniform'

#### Neuron Activation Function
activation = ['relu', 'softmax'] #'softmax', 'softplus', 'softsign', 'relu', 'tanh', 'sigmoid', 'hard_sigmoid', 'linear'

#### Tune Dropout Regularization
weight_constraint = [2, 3] #1, 2, 3, 4, 5
dropout_rate = [0.0, 0.1, 0.5, 0.9] #0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9

#### Tune the Number of Neurons in the Hidden Layer
neurons = [100, 200] #10, 50, 100, 200
# create model
model = KerasClassifier(build_fn=build_model(train, n_back, n_predict, epochs, batch_size, neurons, activation, optimizer), verbose=1)
param_grid = dict(batch_size=batch_size, epochs=epochs, optimizer=optimizer,
                  activation=activation,neurons=neurons)

grid = GridSearchCV(estimator=model, param_grid=param_grid, n_jobs=-1)
grid_result = grid.fit(train_x, train_y)

One of the errors, for example, is:

('Could not interpret activation function identifier:', ['relu', 'softmax'])

What exactly am I doing wrong?

Is there a better way to tune my LSTM?

In short: GridSearchCV only works with 2D data, not 3D; in other words, only with 3D rather than 4D (once time is added). In that case you have to set up your own grid search.
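A manual grid search along those lines can be sketched as below. The `manual_grid_search` driver and the `evaluate` function are hypothetical names, not part of Keras or scikit-learn; in practice `evaluate` would call `build_model(...)` on the 3D `train_x`/`train_y` arrays and return the validation loss, but here a toy scoring function stands in so the sketch is self-contained.

```python
from itertools import product

def manual_grid_search(param_grid, evaluate):
    """Try every combination in param_grid and return the best one.

    param_grid: dict mapping parameter name -> list of candidate values.
    evaluate:   callable taking the parameters as keyword arguments and
                returning a loss (lower is better).
    """
    names = sorted(param_grid)
    best_params, best_loss = None, float("inf")
    # product(...) enumerates the Cartesian product of all candidate lists
    for values in product(*(param_grid[n] for n in names)):
        params = dict(zip(names, values))
        loss = evaluate(**params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

# Toy stand-in for the real objective; with Keras this would train the
# model via build_model(...) and return its validation loss.
def evaluate(batch_size, epochs, neurons, activation):
    penalty = 0.0 if activation == "relu" else 1.0
    return penalty + abs(neurons - 100) / 100 + 1.0 / epochs + batch_size / 100

param_grid = dict(batch_size=[10, 20], epochs=[1, 10],
                  neurons=[100, 200], activation=["relu", "softmax"])
best_params, best_loss = manual_grid_search(param_grid, evaluate)
print(best_params)
```

Because you control the loop yourself, the shape of the training data never passes through scikit-learn's 2D validation, which is what trips up `GridSearchCV` here.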

Feel free to contact me if you have any questions about this. Hope this helps.


Disclaimer: the technical posts on this site are licensed under CC BY-SA 4.0; if you need to repost, please credit this site or the original source. For any questions contact: yoyou2525@163.com.

 