
How to prevent Weights & Biases from saving unnecessary parameters

I am using Weights & Biases (link) to manage hyperparameter optimization and log the results. I am training with Keras on a TensorFlow backend, using the out-of-the-box logging functionality of Weights & Biases, in which I run

wandb.init(project='project_name', entity='username', config=config)

and then add a WandbCallback() to the callbacks of classifier.fit(). By default, Weights & Biases appears to save the model parameters (i.e., the model's weights and biases) and store them in the cloud. This eats up my account's storage quota, and it is unnecessary --- I only care about tracking the model loss/accuracy as a function of the hyperparameters.

Is it possible for me to train a model and log the loss and accuracy using Weights & Biases, but not store the model parameters in the cloud? How can I do this?

To avoid saving the trained model weights during hyperparameter optimization, you can do something like this:

classifier.fit(..., callbacks=[WandbCallback(..., save_model=False)])

This will only track the metrics (train/validation loss/accuracy, etc.).

