
How to set hidden_layer_sizes in sklearn MLPRegressor using optuna trial

I would like to use [OPTUNA][1] with the sklearn [MLPRegressor][1] model.

For almost all hyperparameters it is quite straightforward to set up OPTUNA. For example, to set the learning rate: learning_rate_init = trial.suggest_float('learning_rate_init', 0.0001, 0.1001, step=0.005)

My problem is how to set it for hidden_layer_sizes, since it is a tuple. So let's say I would like to have two hidden layers, where the first has 100 neurons and the second has 50. Without OPTUNA I would do:

MLPRegressor(hidden_layer_sizes=(100, 50))

But what if I want OPTUNA to try a different number of neurons in each layer, e.g. from 100 to 500? How can I set that, given that MLPRegressor expects a tuple?

You could set up your objective function as follows:

import optuna
import warnings
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error
warnings.filterwarnings('ignore')

X, y = make_regression(random_state=1)

X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=1)

def objective(trial):

    params = {
        'learning_rate_init': trial.suggest_float('learning_rate_init', 0.0001, 0.1, step=0.005),
        'first_layer_neurons': trial.suggest_int('first_layer_neurons', 10, 100, step=10),
        'second_layer_neurons': trial.suggest_int('second_layer_neurons', 10, 100, step=10),
        'activation': trial.suggest_categorical('activation', ['identity', 'tanh', 'relu']),
    }

    model = MLPRegressor(
        hidden_layer_sizes=(params['first_layer_neurons'], params['second_layer_neurons']),
        learning_rate_init=params['learning_rate_init'],
        activation=params['activation'],
        random_state=1,
        max_iter=100
    )

    model.fit(X_train, y_train)

    # squared=False returns the root mean squared error (RMSE)
    return mean_squared_error(y_valid, model.predict(X_valid), squared=False)

study = optuna.create_study(direction='minimize')
study.optimize(objective, n_trials=3)
# [I 2021-11-11 18:04:02,216] A new study created in memory with name: no-name-14c92e38-b8cd-4b8d-8a95-77158d996f20
# [I 2021-11-11 18:04:02,283] Trial 0 finished with value: 161.8347337123744 and parameters: {'learning_rate_init': 0.0651, 'first_layer_neurons': 20, 'second_layer_neurons': 40, 'activation': 'tanh'}. Best is trial 0 with value: 161.8347337123744.
# [I 2021-11-11 18:04:02,368] Trial 1 finished with value: 159.55535852658082 and parameters: {'learning_rate_init': 0.0551, 'first_layer_neurons': 90, 'second_layer_neurons': 70, 'activation': 'relu'}. Best is trial 1 with value: 159.55535852658082.
# [I 2021-11-11 18:04:02,440] Trial 2 finished with value: 161.73980822730888 and parameters: {'learning_rate_init': 0.0051, 'first_layer_neurons': 100, 'second_layer_neurons': 30, 'activation': 'identity'}. Best is trial 1 with value: 159.55535852658082.
