
Save a Tensorflow Model and Load it in Tensorflow.js

In my regular Python code, I have a CNN implemented. I save it using model.save, which produces four files (checkpoint, meta, index, and one other file). However, I cannot load these four files directly into tensorflow.js. Here is the sample CNN:

import numpy as np
import tflearn
from tflearn.layers.conv import conv_2d, max_pool_2d
from tflearn.layers.core import input_data, dropout, fully_connected
from tflearn.layers.estimator import regression
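
# IMG_SIZE, FIRST_NUM_CHANNEL, FILTER_SIZE, NUM_OUTPUT, LR, NUM_EPOCHS,
# MODEL_NAME and train_data are assumed to be defined earlier in the script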

convnet = input_data(shape=[None, IMG_SIZE, IMG_SIZE, 1], name='input')

convnet = conv_2d(convnet, FIRST_NUM_CHANNEL, FILTER_SIZE, activation='relu')
convnet = max_pool_2d(convnet, 2)

convnet = conv_2d(convnet, FIRST_NUM_CHANNEL*2, FILTER_SIZE, activation='relu')
convnet = max_pool_2d(convnet, 2)

convnet = conv_2d(convnet, FIRST_NUM_CHANNEL*4, FILTER_SIZE, activation='relu')
convnet = max_pool_2d(convnet, 2)

convnet = fully_connected(convnet, FIRST_NUM_CHANNEL*8, activation='relu')
convnet = dropout(convnet, 0.7)

convnet = fully_connected(convnet, NUM_OUTPUT, activation='softmax')
convnet = regression(convnet, optimizer='adam', learning_rate=LR, loss='categorical_crossentropy', name='targets')

model = tflearn.DNN(convnet, tensorboard_dir='log')

train = train_data[:7000]
test = train_data[-1000:]


X = np.array([i[0] for i in train]).reshape(-1,IMG_SIZE,IMG_SIZE,1)
Y = [i[1] for i in train]

test_x = np.array([i[0] for i in test]).reshape(-1,IMG_SIZE,IMG_SIZE,1)
test_y = [i[1] for i in test]


model.fit({'input': X}, {'targets': Y}, n_epoch=NUM_EPOCHS, validation_set=({'input': test_x}, {'targets': test_y}), 
    snapshot_step=500, show_metric=True, run_id=MODEL_NAME)

model.save(MODEL_NAME)
print('MODEL SAVED:', MODEL_NAME)

The last two lines of this code snippet save the model. I can load the model in a Flask app, but I want to port it to tensorflow.js. Can anyone give me a tutorial on how to do this?

tensorflowjs_converter outputs, among other files, a weights file weights_manifest.json and a model topology file tensorflowjs_model.pb. To load this model in tensorflow.js, follow the steps outlined below.
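
For reference, a hedged sketch of how these converted files can be produced in the first place. The tfjs-converter CLI that emits tensorflowjs_model.pb and weights_manifest.json takes a frozen GraphDef as input, so the tflearn checkpoint has to be frozen into a single .pb first (for example with tf.graph_util.convert_variables_to_constants); the paths and the output node name below are assumptions, not values taken from the question:

# install the converter, then point it at the frozen graph
# (./frozen_model.pb, ./web_model and the output node name are assumptions)
pip install tensorflowjs

tensorflowjs_converter \
    --input_format=tf_frozen_model \
    --output_node_names='FullyConnected_1/Softmax' \
    ./frozen_model.pb \
    ./web_model

The ./web_model folder is what gets served in the next step.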

  • serve the folder containing the files using a server
# cd to the directory containing the files

# then launch Python's built-in HTTP server (on port 8080, to match the URLs below)
python3 -m http.server 8080

# or install and launch the npm module http-server
npm install -g http-server
http-server --cors -c1 .
  • create a JS script to load the model
(async () => {
   ...
   const model = await tf.loadFrozenModel('http://localhost:8080/tensorflowjs_model.pb', 'http://localhost:8080/weights_manifest.json')
 })()
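
Once the model is loaded, inference runs entirely in the browser. A minimal sketch of what could follow the load call, assuming tf.js is already included via a script tag, that an <img id="img"> element holds a grayscale picture, that the graph has a single input and output, and that IMG_SIZE and any pixel normalization match what was used in training (all of these are assumptions, not part of the answer):

(async () => {
  const model = await tf.loadFrozenModel(
    'http://localhost:8080/tensorflowjs_model.pb',
    'http://localhost:8080/weights_manifest.json')

  // build a [1, IMG_SIZE, IMG_SIZE, 1] float tensor from the image element
  const IMG_SIZE = 50  // assumption: must match the value used during training
  const pixels = tf.fromPixels(document.getElementById('img'), 1)
  const resized = tf.image.resizeBilinear(pixels, [IMG_SIZE, IMG_SIZE])
  const input = resized.toFloat().expandDims(0)
  // reproduce here whatever normalization was applied to the training data

  // single-input / single-output graph, so the tensor can be passed directly
  const scores = model.predict(input)
  scores.print()
})()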

There is a difference between loadModel and loadFrozenModel.

  • loadModel is for loading models that were saved locally. The model can be retrieved either from the browser's IndexedDB or from localStorage. It might also be usable to retrieve a model saved by a TensorFlow API other than the JS one, but the user will be required to select the files using tf.io.browserFiles (I haven't tried it). A sketch of these load paths is shown after this list.

  • loadFrozenModel is for loading models that are served by a server.
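
A minimal sketch of the loadModel paths described above, assuming a Layers-format model (a model.json file plus binary weight shards); the storage key 'my-model' and the element ids are made-up names, not part of the answer:

(async () => {
  // from browser storage (the model must have been saved there beforehand,
  // e.g. with model.save('localstorage://my-model'))
  const fromLocalStorage = await tf.loadModel('localstorage://my-model')
  const fromIndexedDB = await tf.loadModel('indexeddb://my-model')

  // from files the user picks with <input type="file"> elements
  const jsonFile = document.getElementById('json-upload').files[0]
  const weightsFile = document.getElementById('weights-upload').files[0]
  const fromFiles = await tf.loadModel(tf.io.browserFiles([jsonFile, weightsFile]))
})()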
