
How to save Tensorflow model using Tensorflow C-API

Using TF_GraphToGraphDef one can export a graph, and using TF_GraphImportGraphDef one can import a Tensorflow graph. There is also a method TF_LoadSessionFromSavedModel which seems to offer loading of a Tensorflow model (i.e., a graph including variables). But how does one save a Tensorflow model (graph including variables) using the C API?

Model saving in Tensorflow is one of the worst programming experiences I have encountered. Never in my life have I been so frustrated by such horrible documentation; I would not wish it on the worst of my enemies.

All actions in the C API are executed via the TF_SessionRun() function. This function has 12 arguments:

TF_CAPI_EXPORT extern void TF_SessionRun(
  TF_Session *session,            // Pointer to a TF session
  const TF_Buffer *run_options,   // Serialized tensorflow.RunOptions proto; may be NULL
  const TF_Output *inputs,        // Your model inputs (not the actual input data)
  TF_Tensor* const* input_values, // Your input tensors (the actual data)
  int ninputs,                    // Number of inputs
  const TF_Output* outputs,       // Your model outputs (not the actual output data)
  TF_Tensor** output_values,      // Your output tensors (the actual data)
  int noutputs,                   // Number of outputs
  const TF_Operation* const* target_opers, // The operations to run (the actual computation to be performed, for example training (fitting), computing a metric, saving)
  int ntargets,                            // Number of target operations
  TF_Buffer* run_metadata,        // Output buffer for a serialized RunMetadata proto; may be NULL
  TF_Status*);                    // Status for when all fails with some cryptic error no one will help you debug

So what you want is to tell TF_SessionRun to execute an operation that will "save" the current model to a file.
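If the graph was exported from Python with a default tf.train.Saver, that save operation already exists in the graph under well-known default names. A minimal sketch of looking them up (the names "save/Const", "save/control_dependency", and "save/restore_all" are the Saver's defaults; that the graph was built this way is an assumption):

```c
#include <tensorflow/c/c_api.h>

/* Assumes `graph` was loaded from a GraphDef that was built in Python
   with a default tf.train.Saver(); the op names below are the Saver's
   default names and will differ if a custom name scope was used. */
void lookup_saver_ops(TF_Graph *graph) {
  // The filename placeholder that the Saver feeds its checkpoint path into.
  TF_Output checkpoint_file = {TF_GraphOperationByName(graph, "save/Const"), 0};
  // The operation that writes the checkpoint when run as a target op.
  TF_Operation *save_op = TF_GraphOperationByName(graph, "save/control_dependency");
  // The operation that restores all variables from a checkpoint.
  TF_Operation *restore_op = TF_GraphOperationByName(graph, "save/restore_all");
  (void)checkpoint_file; (void)save_op; (void)restore_op;
}
```

TF_GraphOperationByName returns NULL when no operation with that name exists, so it is worth checking each pointer before running the session.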

The way I do it is by allocating a tensor and feeding it the name of the file to save the model to. This saves the weights of the model; I am not sure whether it saves the graph itself.
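For reference, a helper like the Belly_ScalarStringTensor used below can be sketched with the string encoding of the pre-2.4 TF C API, where a scalar TF_STRING tensor is an 8-byte offset table followed by the TF_StringEncode'd bytes. Newer releases replaced this with TF_TString, so the library version here is an assumption:

```c
#include <string.h>
#include <tensorflow/c/c_api.h>

/* Sketch of a scalar-string-tensor helper (pre-2.4 C API string layout). */
static TF_Tensor *ScalarStringTensor(const char *str, TF_Status *status) {
  size_t len = strlen(str);
  // 8-byte offset table (one element) + the encoded string itself.
  size_t nbytes = 8 + TF_StringEncodedSize(len);
  // A scalar has zero dimensions, hence dims = NULL, num_dims = 0.
  TF_Tensor *t = TF_AllocateTensor(TF_STRING, NULL, 0, nbytes);
  char *data = (char *)TF_TensorData(t);
  memset(data, 0, 8); // offset of the single (and only) string element
  TF_StringEncode(str, len, data + 8, nbytes - 8, status);
  return t;
}
```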

Here is an example execution of TF_SessionRun. I know it's quite cryptic; I'll provide a whole script in a couple of hours.

  TF_Output inputs[1] = {model->checkpoint_file}; // Input: the checkpoint-filename placeholder
  TF_Tensor *t = Belly_ScalarStringTensor(str, model->status); // This does the tensor allocation with the output filename
  TF_Tensor *input_values[1] = {t}; // Input data, the actual tensor
  const TF_Operation *op[1] = {model->save_op}; // The "save" operation
  // Run and pray
  TF_SessionRun(model->session,
                NULL,                    // run_options
                inputs, input_values, 1,
                NULL, NULL, 0,           // no outputs
                op, 1,                   // the target operation
                NULL,                    // run_metadata
                model->status);
  TF_DeleteTensor(t);
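When the run fails, the only diagnostic you get is in the status object, so it is worth checking it right after the call. A short sketch, assuming the same model struct as above:

```c
#include <stdio.h>
#include <tensorflow/c/c_api.h>

/* Check the status after TF_SessionRun before trusting the checkpoint.
   TF_GetCode and TF_Message are the standard C-API status accessors. */
void check_save_status(TF_Status *status) {
  if (TF_GetCode(status) != TF_OK) {
    fprintf(stderr, "Save failed: %s\n", TF_Message(status));
  }
}
```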

This is an incomplete answer; I promise I will edit it in a couple of hours.

