
In TensorFlow, how to clear the GPU memory of an intermediate variable in a CNN model?

I am using TensorFlow to implement a CNN model. During training, there is an intermediate variable that occupies a large amount of GPU memory, and I would like to clear the memory used by this variable.

This variable is called 'rgb_concat'. I tried using 'rgb_concat = []' to clear its memory, but I am not sure whether this actually works in TensorFlow.

How can I achieve this in TensorFlow? Thanks in advance!

An intermediate variable called 'rgb_concat' occupies a large amount of GPU memory, and I want to clear it so the memory can be used by other layers in the CNN model. How can I do this in TensorFlow?

rgb_concat = []                       # collects the output of each sub-layer
x = input_image
for j in range(n_sub_layers):
    nn = Conv2dLayer(x, j)            # convolution for sub-layer j (arguments simplified)
    rgb_concat.append(nn)
    x = nn

# Concatenate the first two sub-layer outputs, then fold in the remaining ones.
# 'i' comes from an enclosing loop (not shown here).
rgb_concat_sublayer = ConcatLayer([rgb_concat[0], rgb_concat[1]], concat_dim=3,
                                  name='rgb_concat_sublayer_{}_{}'.format(i, 1))
for sub_layer in range(2, n_sub_layers):  # second 'for' loop
    rgb_concat_sublayer = ConcatLayer([rgb_concat_sublayer, rgb_concat[sub_layer]], concat_dim=3,
                                      name='rgb_concat_sublayer_{}_{}'.format(i, sub_layer))

Since 'rgb_concat' is no longer needed after the second 'for' loop, it should be cleared once that loop finishes.

Have you tried the del keyword?

del rgb_concat

You could also just set the variable to None.

rgb_concat = None
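
For context, here is a minimal sketch of where either option could go in the question's code, assuming 'rgb_concat' is an ordinary Python list holding the sub-layer objects (the names n_sub_layers, rgb_concat_sublayer, ConcatLayer, and i follow the question's snippet):

for sub_layer in range(2, n_sub_layers):  # the second 'for' loop from the question
    rgb_concat_sublayer = ConcatLayer([rgb_concat_sublayer, rgb_concat[sub_layer]], concat_dim=3,
                                      name='rgb_concat_sublayer_{}_{}'.format(i, sub_layer))

# The list is not used after this point, so drop the Python reference to it.
del rgb_concat          # or equivalently: rgb_concat = None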
