
Trouble converting keras/tensorflow h5/json into tensorflow pb

I trained a network with keras (tensorflow backend) and saved the model as json and the weights as h5. I am now trying to convert it into a single tensorflow pb file, but it complains about the name of the output node.

System info: Tensorflow 2.3.0, Keras 2.4.3, Cuda 10.1, Cudnn 7
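For context, the json/h5 pair was presumably produced with the usual Keras split-save pattern. The following is only a hedged sketch; the original post does not show this step, and the toy model here is just a stand-in:

import tensorflow as tf

# Hypothetical stand-in model; the real network is the one in the model summary below.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

# The architecture goes into the json file, the weights into the h5 file.
with open("my-trained-model.json", "w") as f:
    f.write(model.to_json())
model.save_weights("my-trained-model.h5")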

The conversion script is quite simple:

import json
from tensorflow import keras
from keras import backend as K
import tensorflow as tf

json_file = "my-trained-model.json"
h5_file = "my-trained-model.h5"
Output_Path = "./trained_models/"
Frozen_pb_File = "my-trained-model.pb"

def freeze_session(session, keep_var_names=None, output_names=None, clear_devices=True):
    """Freeze the state of a session into a pruned GraphDef with variables converted to constants."""
    from tensorflow.python.framework.graph_util import convert_variables_to_constants
    graph = session.graph
    with graph.as_default():
        freeze_var_names = list(set(v.op.name for v in tf.compat.v1.global_variables()).difference(keep_var_names or []))
        output_names = output_names or []
        output_names += [v.op.name for v in tf.compat.v1.global_variables()]
        # Graph -> GraphDef ProtoBuf
        input_graph_def = graph.as_graph_def()
        if clear_devices:
            for node in input_graph_def.node:
                node.device = ""
        frozen_graph = convert_variables_to_constants(session, input_graph_def,
                                                      output_names, freeze_var_names)
        return frozen_graph


with open(json_file, 'r') as f:
    model = keras.models.model_from_json(f.read())
model.load_weights(h5_file)

model.summary()

# get output node names
OutputNames = [out.op.name for out in model.outputs]
print("\nOutput Names:\n", OutputNames)  # this prints "concatenate/concat" as the only output node name


# freeze the model
frozen_graph = freeze_session(tf.compat.v1.keras.backend.get_session(), output_names=OutputNames)

# save the output files
# this is the .pb file (a binary file)
tf.io.write_graph(frozen_graph, Output_Path, Frozen_pb_File, as_text=False)

When I run this, I get:

AssertionError: concatenate/concat is not in graph

So for some reason it is picking up "concatenate/concat" as the output node name. The model summary is below, and you can see that the output node is "concatenate"; however, even if I hard-code the output node name to "concatenate", I get a similar assertion error:

AssertionError: concatenate is not in graph
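One way to see what freeze_session is actually working with (this diagnostic is not part of the original question) is to list the node names in the compat.v1 session graph it receives; the assertion suggests that, with TF 2.x building the Keras model eagerly, the model's ops are simply not present in that graph under the expected names:

import tensorflow as tf

# Diagnostic sketch: print the node names that exist in the compat.v1 session graph.
sess = tf.compat.v1.keras.backend.get_session()
node_names = [n.name for n in sess.graph.as_graph_def().node]
print(len(node_names), "nodes in the session graph")
print([name for name in node_names if "concat" in name])  # likely empty here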

Here is the keras model summary:

Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input (InputLayer)              [(None, None, None,  0                                            
__________________________________________________________________________________________________
conv2d (Conv2D)                 (None, None, None, 1 448         input[0][0]                      
__________________________________________________________________________________________________
batch_normalization (BatchNorma (None, None, None, 1 64          conv2d[0][0]                     
__________________________________________________________________________________________________
activation (Activation)         (None, None, None, 1 0           batch_normalization[0][0]        
__________________________________________________________________________________________________
conv2d_1 (Conv2D)               (None, None, None, 1 2320        activation[0][0]                 
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, None, None, 1 64          conv2d_1[0][0]                   
__________________________________________________________________________________________________
activation_1 (Activation)       (None, None, None, 1 0           batch_normalization_1[0][0]      
__________________________________________________________________________________________________
max_pooling2d (MaxPooling2D)    (None, None, None, 1 0           activation_1[0][0]               
__________________________________________________________________________________________________
conv2d_2 (Conv2D)               (None, None, None, 3 4640        max_pooling2d[0][0]              
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, None, None, 3 128         conv2d_2[0][0]                   
__________________________________________________________________________________________________
activation_2 (Activation)       (None, None, None, 3 0           batch_normalization_2[0][0]      
__________________________________________________________________________________________________
conv2d_3 (Conv2D)               (None, None, None, 3 9248        activation_2[0][0]               
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, None, None, 3 128         conv2d_3[0][0]                   
__________________________________________________________________________________________________
activation_3 (Activation)       (None, None, None, 3 0           batch_normalization_3[0][0]      
__________________________________________________________________________________________________
conv2d_4 (Conv2D)               (None, None, None, 3 9248        activation_3[0][0]               
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, None, None, 3 128         conv2d_4[0][0]                   
__________________________________________________________________________________________________
add (Add)                       (None, None, None, 3 0           batch_normalization_4[0][0]      
                                                                 activation_2[0][0]               
__________________________________________________________________________________________________
activation_4 (Activation)       (None, None, None, 3 0           add[0][0]                        
__________________________________________________________________________________________________
max_pooling2d_1 (MaxPooling2D)  (None, None, None, 3 0           activation_4[0][0]               
__________________________________________________________________________________________________
conv2d_5 (Conv2D)               (None, None, None, 6 18496       max_pooling2d_1[0][0]            
__________________________________________________________________________________________________
batch_normalization_5 (BatchNor (None, None, None, 6 256         conv2d_5[0][0]                   
__________________________________________________________________________________________________
activation_5 (Activation)       (None, None, None, 6 0           batch_normalization_5[0][0]      
__________________________________________________________________________________________________
conv2d_6 (Conv2D)               (None, None, None, 6 36928       activation_5[0][0]               
__________________________________________________________________________________________________
batch_normalization_6 (BatchNor (None, None, None, 6 256         conv2d_6[0][0]                   
__________________________________________________________________________________________________
activation_6 (Activation)       (None, None, None, 6 0           batch_normalization_6[0][0]      
__________________________________________________________________________________________________
conv2d_7 (Conv2D)               (None, None, None, 6 36928       activation_6[0][0]               
__________________________________________________________________________________________________
batch_normalization_7 (BatchNor (None, None, None, 6 256         conv2d_7[0][0]                   
__________________________________________________________________________________________________
add_1 (Add)                     (None, None, None, 6 0           batch_normalization_7[0][0]      
                                                                 activation_5[0][0]               
__________________________________________________________________________________________________
activation_7 (Activation)       (None, None, None, 6 0           add_1[0][0]                      
__________________________________________________________________________________________________
conv2d_8 (Conv2D)               (None, None, None, 6 36928       activation_7[0][0]               
__________________________________________________________________________________________________
batch_normalization_8 (BatchNor (None, None, None, 6 256         conv2d_8[0][0]                   
__________________________________________________________________________________________________
activation_8 (Activation)       (None, None, None, 6 0           batch_normalization_8[0][0]      
__________________________________________________________________________________________________
conv2d_9 (Conv2D)               (None, None, None, 6 36928       activation_8[0][0]               
__________________________________________________________________________________________________
batch_normalization_9 (BatchNor (None, None, None, 6 256         conv2d_9[0][0]                   
__________________________________________________________________________________________________
add_2 (Add)                     (None, None, None, 6 0           batch_normalization_9[0][0]      
                                                                 activation_7[0][0]               
__________________________________________________________________________________________________
activation_9 (Activation)       (None, None, None, 6 0           add_2[0][0]                      
__________________________________________________________________________________________________
max_pooling2d_2 (MaxPooling2D)  (None, None, None, 6 0           activation_9[0][0]               
__________________________________________________________________________________________________
conv2d_10 (Conv2D)              (None, None, None, 6 36928       max_pooling2d_2[0][0]            
__________________________________________________________________________________________________
batch_normalization_10 (BatchNo (None, None, None, 6 256         conv2d_10[0][0]                  
__________________________________________________________________________________________________
activation_10 (Activation)      (None, None, None, 6 0           batch_normalization_10[0][0]     
__________________________________________________________________________________________________
conv2d_11 (Conv2D)              (None, None, None, 6 36928       activation_10[0][0]              
__________________________________________________________________________________________________
batch_normalization_11 (BatchNo (None, None, None, 6 256         conv2d_11[0][0]                  
__________________________________________________________________________________________________
activation_11 (Activation)      (None, None, None, 6 0           batch_normalization_11[0][0]     
__________________________________________________________________________________________________
conv2d_12 (Conv2D)              (None, None, None, 6 36928       activation_11[0][0]              
__________________________________________________________________________________________________
batch_normalization_12 (BatchNo (None, None, None, 6 256         conv2d_12[0][0]                  
__________________________________________________________________________________________________
add_3 (Add)                     (None, None, None, 6 0           batch_normalization_12[0][0]     
                                                                 activation_10[0][0]              
__________________________________________________________________________________________________
activation_12 (Activation)      (None, None, None, 6 0           add_3[0][0]                      
__________________________________________________________________________________________________
conv2d_13 (Conv2D)              (None, None, None, 6 36928       activation_12[0][0]              
__________________________________________________________________________________________________
batch_normalization_13 (BatchNo (None, None, None, 6 256         conv2d_13[0][0]                  
__________________________________________________________________________________________________
activation_13 (Activation)      (None, None, None, 6 0           batch_normalization_13[0][0]     
__________________________________________________________________________________________________
conv2d_14 (Conv2D)              (None, None, None, 6 36928       activation_13[0][0]              
__________________________________________________________________________________________________
batch_normalization_14 (BatchNo (None, None, None, 6 256         conv2d_14[0][0]                  
__________________________________________________________________________________________________
add_4 (Add)                     (None, None, None, 6 0           batch_normalization_14[0][0]     
                                                                 activation_12[0][0]              
__________________________________________________________________________________________________
activation_14 (Activation)      (None, None, None, 6 0           add_4[0][0]                      
__________________________________________________________________________________________________
max_pooling2d_3 (MaxPooling2D)  (None, None, None, 6 0           activation_14[0][0]              
__________________________________________________________________________________________________
conv2d_15 (Conv2D)              (None, None, None, 1 73856       max_pooling2d_3[0][0]            
__________________________________________________________________________________________________
batch_normalization_15 (BatchNo (None, None, None, 1 512         conv2d_15[0][0]                  
__________________________________________________________________________________________________
activation_15 (Activation)      (None, None, None, 1 0           batch_normalization_15[0][0]     
__________________________________________________________________________________________________
conv2d_16 (Conv2D)              (None, None, None, 1 147584      activation_15[0][0]              
__________________________________________________________________________________________________
batch_normalization_16 (BatchNo (None, None, None, 1 512         conv2d_16[0][0]                  
__________________________________________________________________________________________________
activation_16 (Activation)      (None, None, None, 1 0           batch_normalization_16[0][0]     
__________________________________________________________________________________________________
conv2d_17 (Conv2D)              (None, None, None, 1 147584      activation_16[0][0]              
__________________________________________________________________________________________________
batch_normalization_17 (BatchNo (None, None, None, 1 512         conv2d_17[0][0]                  
__________________________________________________________________________________________________
add_5 (Add)                     (None, None, None, 1 0           batch_normalization_17[0][0]     
                                                                 activation_15[0][0]              
__________________________________________________________________________________________________
activation_17 (Activation)      (None, None, None, 1 0           add_5[0][0]                      
__________________________________________________________________________________________________
conv2d_18 (Conv2D)              (None, None, None, 1 147584      activation_17[0][0]              
__________________________________________________________________________________________________
batch_normalization_18 (BatchNo (None, None, None, 1 512         conv2d_18[0][0]                  
__________________________________________________________________________________________________
activation_18 (Activation)      (None, None, None, 1 0           batch_normalization_18[0][0]     
__________________________________________________________________________________________________
conv2d_19 (Conv2D)              (None, None, None, 1 147584      activation_18[0][0]              
__________________________________________________________________________________________________
batch_normalization_19 (BatchNo (None, None, None, 1 512         conv2d_19[0][0]                  
__________________________________________________________________________________________________
add_6 (Add)                     (None, None, None, 1 0           batch_normalization_19[0][0]     
                                                                 activation_17[0][0]              
__________________________________________________________________________________________________
activation_19 (Activation)      (None, None, None, 1 0           add_6[0][0]                      
__________________________________________________________________________________________________
conv2d_20 (Conv2D)              (None, None, None, 1 147584      activation_19[0][0]              
__________________________________________________________________________________________________
batch_normalization_20 (BatchNo (None, None, None, 1 512         conv2d_20[0][0]                  
__________________________________________________________________________________________________
activation_20 (Activation)      (None, None, None, 1 0           batch_normalization_20[0][0]     
__________________________________________________________________________________________________
conv2d_21 (Conv2D)              (None, None, None, 1 147584      activation_20[0][0]              
__________________________________________________________________________________________________
batch_normalization_21 (BatchNo (None, None, None, 1 512         conv2d_21[0][0]                  
__________________________________________________________________________________________________
add_7 (Add)                     (None, None, None, 1 0           batch_normalization_21[0][0]     
                                                                 activation_19[0][0]              
__________________________________________________________________________________________________
activation_21 (Activation)      (None, None, None, 1 0           add_7[0][0]                      
__________________________________________________________________________________________________
conv2d_22 (Conv2D)              (None, None, None, 1 147584      activation_21[0][0]              
__________________________________________________________________________________________________
batch_normalization_22 (BatchNo (None, None, None, 1 512         conv2d_22[0][0]                  
__________________________________________________________________________________________________
activation_22 (Activation)      (None, None, None, 1 0           batch_normalization_22[0][0]     
__________________________________________________________________________________________________
conv2d_23 (Conv2D)              (None, None, None, 1 147584      activation_22[0][0]              
__________________________________________________________________________________________________
batch_normalization_23 (BatchNo (None, None, None, 1 512         conv2d_23[0][0]                  
__________________________________________________________________________________________________
add_8 (Add)                     (None, None, None, 1 0           batch_normalization_23[0][0]     
                                                                 activation_21[0][0]              
__________________________________________________________________________________________________
activation_23 (Activation)      (None, None, None, 1 0           add_8[0][0]                      
__________________________________________________________________________________________________
conv2d_24 (Conv2D)              (None, None, None, 2 2306        activation_23[0][0]              
__________________________________________________________________________________________________
conv2d_25 (Conv2D)              (None, None, None, 6 6918        activation_23[0][0]              
__________________________________________________________________________________________________
concatenate (Concatenate)       (None, None, None, 8 0           conv2d_24[0][0]                  
                                                                 conv2d_25[0][0]                  
==================================================================================================
Total params: 1,648,184
Trainable params: 1,644,344
Non-trainable params: 3,840
__________________________________________________________________________________________________

I must be overlooking something simple, but I have stared at this for so long that I can no longer see the forest for the trees. :-(

Thanks for any suggestions. I'm happy to share the json/h5 with anyone who is interested.

It turns out this was all caused by trying to freeze a tensorflow 2.3 model. Apparently Tensorflow 2.0+ has deprecated the "freeze" concept in favor of the "saved model" concept. Once I discovered this, I was able to convert the h5/json into a saved model pb right away.

I'm still not sure whether this format is optimized for inference, so I will follow up on that, but since my question was about the error I was seeing, I figured I would post what was causing it.

For reference, here is my python script for converting from keras h5/json files into the Tensorflow saved model format.

import os
from keras.models import model_from_json

def load_model(path, custom_objects={}, verbose=0):
    # Accept "model", "model.json", or "model.h5": strip any extension, then
    # read the architecture from <path>.json and the weights from <path>.h5.
    path = os.path.splitext(path)[0]
    with open('%s.json' % path, 'r') as json_file:
        model_json = json_file.read()
    model = model_from_json(model_json, custom_objects=custom_objects)
    model.load_weights('%s.h5' % path)
    if verbose:
        print('Loaded from %s' % path)
    return model


json_file = "model.json"  # the h5 file should be "model.h5"

model = load_model(json_file)  # load the json/h5 pair
model.save('my_saved_model')   # this is a directory name to store the saved model
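
For completeness, if a single frozen .pb file is still needed downstream (for example, for tooling that expects a frozen GraphDef), TF 2.x provides convert_variables_to_constants_v2 as the replacement for the old session-based freeze. The sketch below is not part of the original answer; it assumes the 'my_saved_model' directory produced above and is only one way to do it:

import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2

# Load the SavedModel produced above back as a Keras model.
model = tf.keras.models.load_model('my_saved_model')

# Wrap the model in a concrete function and fold its variables into constants.
full_model = tf.function(lambda x: model(x))
concrete_func = full_model.get_concrete_function(
    tf.TensorSpec(model.inputs[0].shape, model.inputs[0].dtype))
frozen_func = convert_variables_to_constants_v2(concrete_func)

# Write the frozen GraphDef to a single binary .pb file.
tf.io.write_graph(frozen_func.graph.as_graph_def(),
                  logdir='./trained_models/',
                  name='my-trained-model.pb',
                  as_text=False)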
