
Do the simple parameters also change during hyper-parameter tuning?

During hyper-parameter tuning, are the parameters (the weights already learned during model training) also optimized, or are they fixed while optimal values are found only for the hyper-parameters? Please explain.

The short answer is no, they are not fixed.

Hyper-parameters directly influence your simple parameters. For a neural network, the number of hidden layers to use is a hyper-parameter, while the weights and biases in each layer can be called simple parameters. Of course, you can't keep the weights of individual layers constant when the number of layers in the network (a hyper-parameter) is itself variable. Similarly, in linear regression, your regularization hyper-parameter directly impacts the weights that are learned.
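The linear-regression case can be checked directly. A minimal sketch (the closed-form ridge solution, not any particular library's implementation): fitting with two different values of the regularization hyper-parameter `alpha` yields two different sets of learned weights.

```python
import numpy as np

def ridge_weights(X, y, alpha):
    """Closed-form ridge regression: w = (X^T X + alpha*I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

w_weak = ridge_weights(X, y, alpha=0.01)     # weak regularization
w_strong = ridge_weights(X, y, alpha=100.0)  # strong regularization

# Changing only the hyper-parameter alpha changes the simple parameters:
assert not np.allclose(w_weak, w_strong)
# Stronger regularization shrinks the weights toward zero:
assert np.abs(w_strong).sum() < np.abs(w_weak).sum()
```

The weights are a function of the hyper-parameter, so holding them fixed while varying `alpha` would not even be well defined.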

So the goal of tuning a hyper-parameter is to find a value that leads to the best set of those simple parameters. The simple parameters are the ones you actually care about, and they are the ones used in final prediction/deployment. Tuning hyper-parameters while keeping them fixed would therefore be meaningless.
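Concretely, a simple grid search makes this loop explicit: for every candidate hyper-parameter value, the simple parameters are re-learned from scratch, and the weights belonging to the winning candidate are what gets deployed. This is only a sketch (reusing a hypothetical closed-form `ridge_weights` solver and a plain train/validation split, not any specific tuning library):

```python
import numpy as np

def ridge_weights(X, y, alpha):
    """Closed-form ridge regression: w = (X^T X + alpha*I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.5, size=200)
X_train, y_train = X[:150], y[:150]   # used to learn simple parameters
X_val, y_val = X[150:], y[150:]       # used to score hyper-parameters

best_alpha, best_w, best_err = None, None, float("inf")
for alpha in [0.01, 0.1, 1.0, 10.0, 100.0]:
    w = ridge_weights(X_train, y_train, alpha)   # weights re-learned per candidate
    err = np.mean((X_val @ w - y_val) ** 2)      # validation MSE for this alpha
    if err < best_err:
        best_alpha, best_w, best_err = alpha, w, err

# best_w holds the simple parameters that go into final prediction/deployment.
```

Each iteration produces a different weight vector `w`; the tuning procedure never reuses a fixed set of weights across candidates.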

