
Removing 'initializer' layers in Tensorflow

I am trying to save my trained model to a file so I can ship it to my inference server and then benchmark it. The only issue is that my model (built with Keras) still has init layers in it (such as conv1/weights/RandomUniform, for example), which cannot be run on the device I am benchmarking.

So, how do I go about removing all of these layers from my graph?

I've tried using tfgraph.finalize(), convert_variables_to_constants, and remove_training_nodes, and none of them seems to remove these nodes.

The exact layer it breaks on is: 'res_net50/conv1/kernel/Initializer/random_uniform/RandomUniform'

The strip_unused_nodes transformation allows you to provide a set of input and output nodes. The Graph Transform Tool will then remove all branches of the graph that do not feed into the output nodes.
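In case it helps to see the idea, here is a minimal, TensorFlow-free sketch of what strip_unused_nodes does conceptually: keep only the nodes that are backward-reachable from the requested outputs. The node names and graph shape below are invented for illustration; the real tool operates on a serialized GraphDef, not a Python dict.

```python
def strip_unused(graph, outputs):
    """Keep only nodes backward-reachable from `outputs`.

    graph: dict mapping node name -> list of input node names.
    Returns the set of node names that survive pruning.
    """
    keep = set()
    stack = list(outputs)
    while stack:
        node = stack.pop()
        if node in keep:
            continue
        keep.add(node)
        stack.extend(graph.get(node, []))
    return keep


# Toy graph: the initializer branch feeds the variable's Assign op,
# not the inference path, so pruning from the output drops it.
graph = {
    "conv1/kernel/Initializer/random_uniform/RandomUniform": [],
    "conv1/kernel/Assign": ["conv1/kernel/Initializer/random_uniform/RandomUniform"],
    "conv1/kernel": [],
    "input": [],
    "conv1/Conv2D": ["input", "conv1/kernel"],
    "predictions": ["conv1/Conv2D"],
}

kept = strip_unused(graph, ["predictions"])
print(sorted(kept))
```

Since the initializer and Assign nodes only ever feed the variable-assignment side of the graph, neither survives pruning from the inference output.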

Having said that, my gut feeling is that removing the initialization nodes will not have a significant impact on inference time, because those operations are never evaluated. They do, however, consume some memory if they are not pruned.
