How to control verbosity in TensorFlow 2.0

In TensorFlow 1.x I had great freedom in choosing how and when to print accuracy/loss scores during training. For example, if I wanted to print the training loss every 100 epochs, inside a tf.Session() I'd write:

if epoch % 100 == 0:
    print(str(epoch) + '. Training Loss: ' + str(loss))
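
A fuller sketch of that pattern in TF 1.x might look like this (train_op, loss_op, x, y and the batch variables here are hypothetical placeholders, not names from an actual graph):

import tensorflow as tf  # TensorFlow 1.x style

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for epoch in range(training_epochs):
        # train_op, loss_op, x, y and the batches are hypothetical placeholders
        _, loss = sess.run([train_op, loss_op],
                           feed_dict={x: batch_features, y: batch_labels})
        if epoch % 100 == 0:
            print(str(epoch) + '. Training Loss: ' + str(loss))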

After the release of TF 2.0 (alpha), it seems that the Keras API forces you to stick with its standard output. Is there a way to get that flexibility back?

If you don't use the Keras Model methods (.fit, .train_on_batch, ...) and you write your own training loop using eager execution (optionally wrapping it in a tf.function to convert it to its graph representation, as sketched after the loop below), you can control the verbosity just as you did in 1.x:

import tensorflow as tf

# model, dataset, compute_loss and optimizer are assumed to be defined already
training_epochs = 10
step = 0
for epoch in range(training_epochs):
    print("starting ", epoch)
    for features, labels in dataset:
        # record the forward pass on the tape so gradients can be computed
        with tf.GradientTape() as tape:
            loss = compute_loss(model(features), labels)
        # back-propagate and update the trainable parameters
        gradients = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(gradients, model.trainable_variables))
        step += 1
        if step % 10 == 0:
            # measure other metrics if needed
            print("loss: ", loss)
    print("Epoch ", epoch, " finished.")
