
No hparams data was found when using tensorboard with keras-tuner

Versions: tensorboard==2.9.0, keras-tuner==1.1.2

This is a simple binary classification model, and keras-tuner is used to add the hyperparameters to be searched to the model.

def build_model(hp):
    n_layers = 4

    n_features = len(X_train.columns)
    inputs = tf.keras.Input(shape=(n_features,))

    dense = tf.keras.layers.Dense(hp.Int("input_units", min_value=128, max_value=256, step=32),
                                  activation=hp.Choice("activation", ['relu', 'tanh'])
                                  )(inputs)
    dense = tf.keras.layers.Dropout(0.2)(dense)

    # num_layer as hyperparameter
    for i in range(hp.Int("dense_layer", 1, n_layers)):
        dense = tf.keras.layers.Dense(hp.Int(f"hidden_unit_{i}", 128, 256, 32),
                                      activation=hp.Choice("activation", ['relu', 'tanh'])
                                      )(dense)

    output = tf.keras.layers.Dense(1, activation='sigmoid')(dense)
    model = tf.keras.Model(inputs=inputs, outputs=output)

    lr = hp.Float("lr", min_value=1e-4, max_value=1e-1, sampling="log")
    model.compile(optimizer=tf.keras.optimizers.Adam(lr),
                  loss=tf.keras.losses.BinaryCrossentropy(),
                  metrics=metrics)
    return model

The hyperparameter search space will be

{neurons: [128, 160, 192, 224, 256],
 num_hidden_layers: [1, 2, 3],
 activation_function: ['relu', 'tanh'],
 learning_rate: [0.0001, 0.001, 0.01]}

Now start the search:

import keras_tuner as kt
from keras_tuner import RandomSearch

# LOG_DIR is the tuner's working directory, defined earlier in the notebook
tuner = RandomSearch(
    build_model,
    objective=kt.Objective("val_binary_accuracy", direction="max"),
    max_trials=3,
    executions_per_trial=1,
    directory=LOG_DIR
)
tensorboard_cb = tf.keras.callbacks.TensorBoard('logs/hyp_tune/')

tuner.search(X_train, y_train, epochs=10, batch_size=512,
             validation_data=(X_test, y_test),
             callbacks=[tensorboard_cb]
             )
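As a quick sanity check that the tuner actually registered the intended search space, keras-tuner's built-in summary helpers can be printed before and after the search (a minimal sketch, not part of the original post; it assumes the tuner object defined above):

# Show the hyperparameters the tuner will search over
# (should match the neurons / num_hidden_layers / activation / lr space listed above).
tuner.search_space_summary()

# After tuner.search(...) has finished, list the best trials and their hyperparameter values.
tuner.results_summary(num_trials=3)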

According to the keras-tuner guide https://keras.io/guides/keras_tuner/visualize_tuning/ this should work and show the HParams when TensorBoard is opened.

However, when I select the HPARAMS tab, it shows the following message:

No hparams data was found.
Probable causes:

You haven’t written any hparams data to your event files.
Event files are still being loaded (try reloading this page).
TensorBoard can’t find your event files.
If you’re new to using TensorBoard, and want to find out how to add data and set up your event files, check out the README and perhaps the TensorBoard tutorial.

If you think TensorBoard is configured properly, please see the section of the README devoted to missing data problems and consider filing an issue on GitHub.
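One of the causes listed above is that TensorBoard cannot find the event files at all. A minimal sketch for ruling that out (not from the original question; it assumes the 'logs/hyp_tune/' directory passed to the TensorBoard callback above) is to list what was actually written there:

import os

# Walk the callback's log directory and print every file that was written.
# If no "events.out.tfevents..." files show up here, TensorBoard has nothing to load.
for root, _, files in os.walk('logs/hyp_tune'):
    for name in files:
        print(os.path.join(root, name))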

I have already tried re-running the search and restarting the notebook, but still no luck.

[EDIT] When I load TensorBoard with tensorboard --logdir='logs/t1', it should show logs/t1 under Runs on the left side of the screen, but instead it shows logs/t0, which is a previous run (a plain model run without hyperparameter tuning). I think that because it shows the previous run without hyperparameter tuning, no data appears in the HPARAMS tab. How can I delete the previous logs and load the new ones? (Overwriting 'logs/t0' with the hyperparameter-tuned model works fine.)
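On the question of removing the previous logs: one option (a minimal sketch, assuming the old run lives in 'logs/t0' as described above; shutil is used here as an assumption, not something from the original post) is to delete the stale run directory before restarting TensorBoard:

import os
import shutil

old_run_dir = 'logs/t0'  # previous run without hyperparameter tuning (assumed path)

# Remove the stale event files so TensorBoard only picks up the new tuning run.
if os.path.isdir(old_run_dir):
    shutil.rmtree(old_run_dir)

Alternatively, pointing --logdir directly at the new run's directory (for example tensorboard --logdir logs/hyp_tune) avoids loading the old runs at all.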

I wrote this code and it runs correctly:

Finally, run these two commands to get the output:

%load_ext tensorboard
%tensorboard --logdir /logs/hyp_tune/

Full code:

# !pip install keras-tuner -q

import numpy as np
import keras_tuner
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

(x_train, y_train), (x_test, y_test) = (np.random.rand(1000,4), np.random.rand(1000)) , (np.random.rand(100,4), np.random.rand(100))



def build_model(hp):
    n_layers = 4
    n_features = x_train.shape[1]
    inputs = tf.keras.Input(shape=(n_features,))
    dense = tf.keras.layers.Dense(hp.Int("input_units", min_value=128, max_value=256, step=32),
                                activation=hp.Choice("activation", ['relu', 'tanh'])
                                )(inputs)
    dense = tf.keras.layers.Dropout(0.2)(dense)

    # num_layer as hyperparameter
    for i in range(hp.Int("dense_layer", 1, n_layers)):
        dense = tf.keras.layers.Dense(hp.Int(f"hidden_unit_{i}", 128, 256, 32),
                                    activation=hp.Choice("activation", ['relu', 'tanh'])
                                    )(dense)
        
    output = tf.keras.layers.Dense(1, activation='sigmoid')(dense)
    model = tf.keras.Model(inputs=inputs, outputs=output)

    lr = hp.Float("lr", min_value=1e-4, max_value=1e-1, sampling="log")
    model.compile(optimizer=tf.keras.optimizers.Adam(lr),
            loss=tf.keras.losses.BinaryCrossentropy(),
            metrics=["accuracy"])
    return model



hp = keras_tuner.HyperParameters()
model = build_model(hp)
model.summary()
tuner = keras_tuner.RandomSearch(
    build_model,
    max_trials=10,
    overwrite=True,
    objective="val_accuracy",
    # Set a directory to store the intermediate results.
    directory="/logs/hyp_tune/",
)

tensorboard_cb = tf.keras.callbacks.TensorBoard('/logs/hyp_tune/')
tuner.search(
    x_train,
    y_train,
    validation_data=(x_test, y_test),
    batch_size=512, 
    epochs=10,
    callbacks=[tensorboard_cb],
)

Output:

%load_ext tensorboard
%tensorboard --logdir /logs/hyp_tune/

[screenshot: TensorBoard HParams dashboard showing the logged tuning runs]
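As an optional follow-up (not part of the original answer), the finished tuner can also report the winning configuration directly, which is a handy cross-check against what the HPARAMS tab shows; this assumes the tuner object from the full code above:

# Best hyperparameter values found by the search.
best_hp = tuner.get_best_hyperparameters(num_trials=1)[0]
print(best_hp.values)

# Rebuild the best model if it needs to be inspected or retrained.
best_model = tuner.get_best_models(num_models=1)[0]
best_model.summary()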

