Keras hyperparameter tuning with hyperas using manual metric
I'm using the hyperas document example to tune the network parameters, but based on f1 score instead of accuracy.
I'm using the following implementation for f1 score:
from keras import backend as K

def f1(y_true, y_pred):
    def recall(y_true, y_pred):
        """Recall metric.
        Only computes a batch-wise average of recall.
        Computes the recall, a metric for multi-label classification of
        how many relevant items are selected.
        """
        true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
        possible_positives = K.sum(K.round(K.clip(y_true, 0, 1)))
        recall = true_positives / (possible_positives + K.epsilon())
        return recall

    def precision(y_true, y_pred):
        """Precision metric.
        Only computes a batch-wise average of precision.
        Computes the precision, a metric for multi-label classification of
        how many selected items are relevant.
        """
        true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
        predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
        precision = true_positives / (predicted_positives + K.epsilon())
        return precision

    precision = precision(y_true, y_pred)
    recall = recall(y_true, y_pred)
    return 2*((precision*recall)/(precision+recall+K.epsilon()))
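As a quick sanity check on the metric's math, here is a NumPy mirror of the same batch-wise computation (purely illustrative; `f1_numpy` and the sample arrays are mine, not part of the question):

```python
import numpy as np

EPSILON = 1e-7  # stand-in for K.epsilon()

def f1_numpy(y_true, y_pred):
    # Mirrors the Keras-backend version above term by term.
    true_positives = np.sum(np.round(np.clip(y_true * y_pred, 0, 1)))
    possible_positives = np.sum(np.round(np.clip(y_true, 0, 1)))
    predicted_positives = np.sum(np.round(np.clip(y_pred, 0, 1)))
    recall = true_positives / (possible_positives + EPSILON)
    precision = true_positives / (predicted_positives + EPSILON)
    return 2 * ((precision * recall) / (precision + recall + EPSILON))

# One true positive, one false positive, one false negative:
y_true = np.array([1.0, 1.0, 0.0, 0.0])
y_pred = np.array([0.9, 0.2, 0.8, 0.1])
print(round(f1_numpy(y_true, y_pred), 3))  # precision = recall = 0.5, so f1 = 0.5
```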
and updating the metrics parameter of the compile function in the following line:
model.compile(loss='categorical_crossentropy', metrics=['accuracy'],
              optimizer={{choice(['rmsprop', 'adam', 'sgd'])}})
to
model.compile(loss='categorical_crossentropy', metrics=[f1],
              optimizer={{choice(['rmsprop', 'adam', 'sgd'])}})
The above metric works perfectly without hyperas, but when I try to use it in the tuning process, I get the following error:
Traceback (most recent call last):
File "D:/path/test.py", line 96, in <module>
trials=Trials())
File "C:\Python35\lib\site-packages\hyperas\optim.py", line 67, in minimize
verbose=verbose)
File "C:\Python35\lib\site-packages\hyperas\optim.py", line 133, in base_minimizer
return_argmin=True),
File "C:\Python35\lib\site-packages\hyperopt\fmin.py", line 367, in fmin
return_argmin=return_argmin,
File "C:\Python35\lib\site-packages\hyperopt\base.py", line 635, in fmin
return_argmin=return_argmin)
File "C:\Python35\lib\site-packages\hyperopt\fmin.py", line 385, in fmin
rval.exhaust()
File "C:\Python35\lib\site-packages\hyperopt\fmin.py", line 244, in exhaust
self.run(self.max_evals - n_done, block_until_done=self.asynchronous)
File "C:\Python35\lib\site-packages\hyperopt\fmin.py", line 218, in run
self.serial_evaluate()
File "C:\Python35\lib\site-packages\hyperopt\fmin.py", line 137, in serial_evaluate
result = self.domain.evaluate(spec, ctrl)
File "C:\Python35\lib\site-packages\hyperopt\base.py", line 840, in evaluate
rval = self.fn(pyll_rval)
File "D:\path\temp_model.py", line 86, in keras_fmin_fnct
NameError: name 'f1' is not defined
If you are following the code example you linked to, you are not making hyperas aware of the custom f1 function. The package author provides an example of how to do that as well.
In short, you need to add an additional functions argument to your optim.minimize() call. Something like:
best_run, best_model = optim.minimize(model=model,
                                      data=data,
                                      functions=[f1],
                                      algo=tpe.suggest,
                                      max_evals=5,
                                      trials=Trials())
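One detail worth noting when tuning on f1 rather than accuracy: hyperopt always minimizes the 'loss' key that the model function returns, so the function should return the negated f1 score. A minimal sketch of that return value (the `objective_tail` helper and its fake evaluate result are illustrative, not from the hyperas example):

```python
STATUS_OK = 'ok'  # the same string constant hyperopt exposes as hyperopt.STATUS_OK

def objective_tail(evaluate_result):
    # With compile(metrics=[f1]), model.evaluate() returns [loss, f1].
    _, f1_score = evaluate_result
    # hyperopt minimizes 'loss', so negate f1 to maximize it.
    return {'loss': -f1_score, 'status': STATUS_OK}

print(objective_tail([0.31, 0.87]))  # {'loss': -0.87, 'status': 'ok'}
```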
I actually just implemented it today, so I'm confident you can get it working, too :)