
How to update a parameter at each epoch within an intermediate layer between training runs? (TensorFlow eager execution)

I have a sequential Keras model, and in it I have a custom layer similar to the following example, named 'CounterLayer'. I am using TensorFlow 2.0 (eager execution):

class CounterLayer(tf.keras.layers.Layer):
  def __init__(self, stateful=False, **kwargs):
    self.stateful = stateful
    super(CounterLayer, self).__init__(**kwargs)

  def build(self, input_shape):
    # backend variable meant to hold the running count
    self.count = tf.keras.backend.variable(0, name="count")
    super(CounterLayer, self).build(input_shape)

  def call(self, input):
    # register a (variable, new_value) update pair (graph-mode / Keras 1 style)
    updates = []
    updates.append((self.count, self.count + 1))
    self.add_update(updates)
    tf.print('-------------')
    tf.print(self.count)
    return input

When I run this for, say, epochs=5, the value of self.count does not get updated with each run; it always stays the same. I got this example from https://stackoverflow.com/a/41710515/10645817. I need something very similar to it, but I was wondering whether this works under TensorFlow's eager execution, or what I would have to do to get the expected output.

I have been trying to implement this for quite a while but could not figure it out. Can somebody help me, please? Thank you.
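As a side note, one way to observe whether self.count actually changes between epochs is a small Keras callback. This is only a sketch; the layer name "counter" and the count attribute are assumptions mirroring the example above:

import tensorflow as tf

# Sketch: print the layer's counter at the end of every epoch.
# "counter" is an assumed layer name, not from the original code.
class CountLogger(tf.keras.callbacks.Callback):
  def on_epoch_end(self, epoch, logs=None):
    layer = self.model.get_layer("counter")
    tf.print("epoch", epoch, "count =", layer.count)

# Illustrative usage: model.fit(x, y, epochs=5, callbacks=[CountLogger()])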

Yes, my issue got resolved. I came across some of the built-in ways to update this sort of variable (that is, to maintain persistent state between epochs, as in my case above). Basically, what I needed to do was, for example:

  def build(self, input_shape):
    # a non-trainable tf.Variable keeps its value across batches and epochs
    self.count = tf.Variable(0, dtype=tf.float32, trainable=False)
    super(CounterLayer, self).build(input_shape)

  def call(self, input):
    # ...
    self.count.assign_add(1)
    # ...
    return input

One can compute the updated value inside the call function and also assign it directly by calling self.count.assign(some_updated_value). The details of this sort of operation are available at https://www.tensorflow.org/api_docs/python/tf/Variable. Thanks.
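For completeness, here is a short end-to-end sketch of the fixed layer, assuming a toy Sequential model with made-up layer sizes and random data (the Dense layers, the shapes, and the layer name "counter" are illustrative assumptions, not from the original post):

import tensorflow as tf

class CounterLayer(tf.keras.layers.Layer):
  def build(self, input_shape):
    # a non-trainable tf.Variable persists across batches and epochs
    self.count = tf.Variable(0, dtype=tf.float32, trainable=False)
    super(CounterLayer, self).build(input_shape)

  def call(self, inputs):
    # increments roughly once per training batch
    self.count.assign_add(1)
    return inputs

# toy model and data, purely for illustration
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, input_shape=(3,)),
    CounterLayer(name="counter"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = tf.random.normal((32, 3))
y = tf.random.normal((32, 1))
model.fit(x, y, epochs=5, batch_size=8, verbose=0)

# the counter now reflects all training steps across epochs,
# e.g. roughly 5 epochs * 4 batches per epoch
print(model.get_layer("counter").count.numpy())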
