
_SymbolicException when using intermediate model outputs in tensorflow keras

I am using tensorflow to train a VAE on the MNIST dataset. Training a basic AE with similar code worked, and compiling this model also completed successfully. But when I try to fit the model, I get the error message below. I suspect the problem is that self.log_var is a symbolic tensor, but other examples ( https://keras.io/examples/variational_autoencoder/ ) implement a VAE in a similar way without running into this.

_SymbolicException: Inputs to eager execution function cannot be Keras symbolic tensors, but found [<tf.Tensor 'dense_1/Identity:0' shape=(None, 2) dtype=float32>, <tf.Tensor 'dense/Identity:0' shape=(None, 2) dtype=float32>]


import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import (Input, Conv2D, Conv2DTranspose, Dense, Flatten,
                                     Reshape, BatchNormalization, LeakyReLU, Activation, Lambda)

class VAE():
  def __init__(self,input_dim, dec_input_dim, enc_channels, enc_kernel_size, enc_strides, dec_channels, dec_kernel_size, dec_strides, z):
    n_enc_layers=len(enc_channels)
    n_dec_layers=len(dec_channels)

    # Encoder: stacked Conv2D -> BatchNormalization -> LeakyReLU blocks
    model_input=Input(shape=input_dim)
    e=model_input
    for x in range(n_enc_layers):
      e=Conv2D(enc_channels[x], enc_kernel_size[x], strides=enc_strides[x], padding='same')(e)
      e=BatchNormalization()(e)
      e=LeakyReLU()(e)

    e=Flatten()(e)
    self.mu=Dense(z)(e)
    self.log_var=Dense(z)(e)

    # Reparameterization trick: z = mu + epsilon * exp(log_var / 2)
    def reparameterize(args):
      mu,log_var=args
      epsilon=tf.random.normal(tf.shape(mu))
      return mu+epsilon*tf.exp(log_var/2)

    encoder_output=Lambda(reparameterize)([self.mu,self.log_var])

    # Decoder: Dense -> Reshape -> stacked Conv2DTranspose blocks, tanh on the last layer
    decoder_input=Input(shape=(z,))
    d=decoder_input
    d=Dense(np.prod(dec_input_dim))(d)
    d=Reshape(dec_input_dim)(d)
    for x in range(n_dec_layers):
      d=Conv2DTranspose(dec_channels[x], dec_kernel_size[x], strides=dec_strides[x], padding='same')(d)
      if x==n_dec_layers-1:
        d=Activation(tf.nn.tanh)(d)
      else:
        d=LeakyReLU()(d)
    dec_output=d

    self.encoder=tf.keras.models.Model(model_input,encoder_output)
    self.decoder=tf.keras.models.Model(decoder_input, dec_output)
    self.model=tf.keras.models.Model(model_input,self.decoder(encoder_output))

  def compile(self,lr,r_loss_factor):
    def r_loss(y_true,y_pred):
      return tf.reduce_mean(tf.square(y_true-y_pred))
    def kl_loss(y_true,y_pred):
      # closes over the Keras symbolic tensors self.log_var and self.mu from the encoder graph
      return -0.5*tf.reduce_sum(1+self.log_var-self.mu**2-tf.exp(self.log_var),axis=1)
    def vae_loss(y_true,y_pred):
      return r_loss(y_true,y_pred)+kl_loss(y_true,y_pred)

    optimizer=tf.keras.optimizers.Adam(lr)
    self.model.compile(optimizer=optimizer,loss=vae_loss,metrics = [r_loss, kl_loss])
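
For context, here is a hypothetical way to instantiate and fit this class on MNIST; every hyperparameter value below is an assumption (the original post does not show them). With the setup described above, the fit call is what raises the _SymbolicException quoted at the top.

# Hypothetical usage; all values are assumptions, not from the original post
(x_train, _), _ = tf.keras.datasets.mnist.load_data()
x_train = (x_train.astype('float32') / 255.0).reshape(-1, 28, 28, 1)

vae = VAE(input_dim=(28, 28, 1), dec_input_dim=(7, 7, 64),
          enc_channels=[32, 64], enc_kernel_size=[3, 3], enc_strides=[2, 2],
          dec_channels=[64, 1], dec_kernel_size=[3, 3], dec_strides=[2, 2], z=2)
vae.compile(lr=0.0005, r_loss_factor=1000)
vae.model.fit(x_train, x_train, batch_size=32, epochs=10)  # raises the error shown above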

If you are using Keras 2.4.0, try downgrading to 2.3.0; for me this solved the error.
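
A minimal sketch of what that looks like, plus a commonly used TF 2.x workaround that keeps the KL term inside the model graph with add_loss, so no Keras symbolic tensor reaches the eagerly executed loss function. The rewritten compile() below is an assumption about how the poster's class could be adapted, not part of the original answer, and the use of r_loss_factor to weight the reconstruction term is a guess.

# Downgrade the standalone keras package, as suggested above:
#   pip install keras==2.3.0
# Another commonly cited workaround is tf.compat.v1.disable_eager_execution(),
# at the cost of losing eager mode.

# Alternative sketch (assumption): replace VAE.compile with a version that registers the
# KL term via add_loss, so the compiled loss no longer closes over self.mu / self.log_var.
def compile(self, lr, r_loss_factor):
  def r_loss(y_true, y_pred):
    # assumption: r_loss_factor weights the reconstruction term
    return r_loss_factor * tf.reduce_mean(tf.square(y_true - y_pred))

  kl = -0.5 * tf.reduce_mean(
      tf.reduce_sum(1 + self.log_var - tf.square(self.mu) - tf.exp(self.log_var), axis=1))
  self.model.add_loss(kl)  # KL term stays in the graph instead of a Python closure
  self.model.compile(optimizer=tf.keras.optimizers.Adam(lr), loss=r_loss)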
