
Does the TensorFlow backend of Keras rely on eager execution?

If it isn't the case, can I build a TensorFlow graph based on Keras and TensorFlow operations, then train the whole model using the Keras high-level API?

It is for a research purpose that I can't present here.

That makes it really difficult to answer your question. It would be better if you could find a toy example of what you want -- unrelated to your research -- and we can try to build something from there.

Does the TensorFlow backend of Keras rely on eager execution?

No, it doesn't. Keras was built before eager execution was introduced. However, Keras (the one inside tf) can work in eager execution mode (see fchollet's answer).
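
For context, eager execution is opt-in in TF 1.x and must be enabled once at program startup; a minimal sketch (not from the original answer) to verify which mode you are in:

import tensorflow as tf

tf.enable_eager_execution()    # must run before any other TensorFlow ops
print(tf.executing_eagerly())  # True here; False in default graph mode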

Can I build a TensorFlow graph and combine it with a Keras model, then train them jointly using the Keras high-level API?

I'm not sure what you mean by "build a TensorFlow graph", because a graph already exists whenever you use Keras. If you are talking about adding a bunch of operations to the existing graph, then it's definitely possible. You just need to wrap them in a Lambda layer, just as you would when using Keras in symbolic mode:

import tensorflow as tf
from sacred import Experiment

ex = Experiment('test-18')

# Run tf.keras in eager execution mode (TF 1.x API).
tf.enable_eager_execution()


@ex.config
def my_config():
    pass


@ex.automain
def main():
    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()

    # Flatten 28x28 images to 784-dim vectors and one-hot encode the labels.
    x_train, x_test = (e.reshape(e.shape[0], -1) for e in (x_train, x_test))
    y_train, y_test = (tf.keras.utils.to_categorical(e) for e in (y_train, y_test))

    def complex_tf_fn(x):
        # Arbitrary TensorFlow ops: standardize each sample to
        # zero mean and unit variance.
        u, v = tf.nn.moments(x, axes=[1], keep_dims=True)
        return (x - u) / tf.sqrt(v)

    with tf.device('/cpu:0'):
        model = tf.keras.Sequential([
            # Lambda layers wrap the raw TF ops so they fit in the Keras model.
            tf.keras.layers.Lambda(complex_tf_fn, input_shape=[784]),
            tf.keras.layers.Dense(1024, activation='relu'),
            tf.keras.layers.Lambda(complex_tf_fn),
            tf.keras.layers.Dense(10, activation='softmax')
        ])
        model.compile(optimizer=tf.train.AdamOptimizer(),
                      loss='categorical_crossentropy')

        model.fit(x_train, y_train,
                  epochs=10,
                  validation_data=(x_test, y_test),
                  batch_size=1024,
                  verbose=2)

python test-18.py with seed=21

INFO - test-18 - Running command 'main'
INFO - test-18 - Started
Train on 60000 samples, validate on 10000 samples
Epoch 1/10
 - 9s - loss: 3.4012 - val_loss: 1.3575
Epoch 2/10
 - 9s - loss: 0.9870 - val_loss: 0.7270
Epoch 3/10
 - 9s - loss: 0.6097 - val_loss: 0.6071
Epoch 4/10
 - 9s - loss: 0.4459 - val_loss: 0.4824
Epoch 5/10
 - 9s - loss: 0.3352 - val_loss: 0.4436
Epoch 6/10
 - 9s - loss: 0.2661 - val_loss: 0.3997
Epoch 7/10
 - 9s - loss: 0.2205 - val_loss: 0.4048
Epoch 8/10
 - 9s - loss: 0.1877 - val_loss: 0.3788
Epoch 9/10
 - 9s - loss: 0.1511 - val_loss: 0.3506
Epoch 10/10
 - 9s - loss: 0.1304 - val_loss: 0.3330
INFO - test-18 - Completed after 0:01:31

Process finished with exit code 0
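
For comparison, the same Lambda-wrapped ops also run in symbolic (graph) mode. The sketch below is an assumed minimal graph-mode variant (not from the original answer), obtained by dropping the tf.enable_eager_execution() call and the sacred scaffolding:

import tensorflow as tf
# No tf.enable_eager_execution() call: tf.keras builds and trains
# the model on the default symbolic graph (TF 1.x).

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(x_train.shape[0], -1)
y_train = tf.keras.utils.to_categorical(y_train)

def complex_tf_fn(x):
    # Same arbitrary TF ops as above: per-sample standardization.
    u, v = tf.nn.moments(x, axes=[1], keep_dims=True)
    return (x - u) / tf.sqrt(v)

model = tf.keras.Sequential([
    tf.keras.layers.Lambda(complex_tf_fn, input_shape=[784]),
    tf.keras.layers.Dense(1024, activation='relu'),
    tf.keras.layers.Lambda(complex_tf_fn),
    tf.keras.layers.Dense(10, activation='softmax')
])
model.compile(optimizer=tf.train.AdamOptimizer(),
              loss='categorical_crossentropy')
model.fit(x_train, y_train, epochs=10, batch_size=1024, verbose=2)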
