
TensorFlow train batches for multiple epochs?

I don't understand how to run the result of tf.train.batch for multiple epochs. Of course it runs out after one pass, and I don't know how to restart it.

  • Maybe I can repeat it using tile, which is complicated but described in full here.
  • If I could redraw a batch each time, that would be fine -- I would need batch_size random integers between 0 and num_examples. (My examples all sit in local RAM.) I haven't found an easy way to get these random draws all at once.
  • Ideally there would also be a reshuffle when the batch is repeated, but it makes more sense to me to run an epoch and then reshuffle, etc., instead of joining the training space to itself num_epochs times and then shuffling.

I think this is confusing because I'm not really building an input pipeline -- my input fits in memory -- yet I still need to build out batching, shuffling and multiple epochs, which possibly requires more knowledge of input pipelines.
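The run-an-epoch-then-reshuffle scheme described above doesn't actually require a TensorFlow pipeline when the data fits in RAM -- it can be done with a plain index generator. This is a minimal pure-Python sketch; the function name `epoch_batches` is made up for illustration:

```python
import random

def epoch_batches(num_examples, batch_size, num_epochs, seed=0):
    """Yield lists of example indices: each epoch is reshuffled, then split into batches."""
    rng = random.Random(seed)
    for _ in range(num_epochs):
        indices = list(range(num_examples))
        rng.shuffle(indices)  # fresh shuffle at the start of every epoch
        for start in range(0, num_examples, batch_size):
            yield indices[start:start + batch_size]

batches = list(epoch_batches(num_examples=10, batch_size=4, num_epochs=2))
# 3 batches per epoch (sizes 4, 4, 2), 2 epochs -> 6 batches in total
```

The indices can then be used to slice the in-memory arrays and feed each batch to the training step (e.g. via feed_dict), so every example is seen exactly once per epoch with a different order each time.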

tf.train.batch simply groups upstream samples into batches, and nothing more. It is meant to be used at the end of an input pipeline. Data and epochs are dealt with upstream.

For example, if your training data fits into a tensor, you could use tf.train.slice_input_producer to produce individual samples. This function has arguments for shuffling and epochs (shuffle and num_epochs).
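A sketch of such a pipeline, assuming the TF 1.x queue-runner API (in TF 2.x these calls live under tf.compat.v1); the array shapes and batch size here are illustrative:

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF 1.x queue API; under compat.v1 in TF 2.x
tf.disable_eager_execution()

features = np.arange(20, dtype=np.float32).reshape(10, 2)  # hypothetical in-memory data
labels = np.arange(10, dtype=np.int64)

# slice_input_producer handles shuffling and epoch counting upstream;
# tf.train.batch then only groups the resulting samples into batches.
feature, label = tf.train.slice_input_producer(
    [features, labels], num_epochs=3, shuffle=True, seed=0)
batch_f, batch_l = tf.train.batch([feature, label], batch_size=5)

with tf.Session() as sess:
    sess.run([tf.global_variables_initializer(),
              tf.local_variables_initializer()])  # num_epochs keeps a local counter
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)
    n_batches = 0
    try:
        while True:
            sess.run([batch_f, batch_l])
            n_batches += 1
    except tf.errors.OutOfRangeError:
        pass  # raised once all epochs are exhausted
    finally:
        coord.request_stop()
        coord.join(threads)
print(n_batches)  # 10 samples * 3 epochs / batch_size 5 = 6 batches
```

After the configured number of epochs, the producer's queue closes and the next sess.run raises OutOfRangeError, which is the normal end-of-input signal for this style of pipeline.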

