
How to make features for serving_input_receiver_fn BERT Tensorflow

I have created a binary classifier with the Tensorflow BERT language model. Here is the link to the sample code. I am able to make predictions. Now I want to export this model. I am not sure if I have defined feature_spec correctly.

Code to export the model:

feature_spec = {'x': tf.VarLenFeature(tf.string)}  

def serving_input_receiver_fn():  
  serialized_tf_example = tf.placeholder(dtype=tf.string, shape=[1],name='input_example_tensor')
  receiver_tensors = {'examples': serialized_tf_example}
  features = tf.parse_example(serialized_tf_example, feature_spec)
  return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

# Export the estimator
export_path = f'/content/drive/My Drive/binary_class/bert/export'

estimator.export_saved_model(
    export_path,
    serving_input_receiver_fn=serving_input_receiver_fn)

Error

---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-71-56ff3fb3e002> in <module>()
     16 estimator.export_saved_model(
     17     export_path,
---> 18     serving_input_receiver_fn=serving_input_receiver_fn)

4 frames
/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/estimator.py in export_saved_model(self, export_dir_base, serving_input_receiver_fn, assets_extra, as_text, checkpoint_path, experimental_mode)
    730         as_text=as_text,
    731         checkpoint_path=checkpoint_path,
--> 732         strip_default_attrs=True)
    733 
    734   def experimental_export_all_saved_models(

/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/estimator.py in _export_all_saved_models(self, export_dir_base, input_receiver_fn_map, assets_extra, as_text, checkpoint_path, strip_default_attrs)
    854             builder, input_receiver_fn_map, checkpoint_path,
    855             save_variables, mode=ModeKeys.PREDICT,
--> 856             strip_default_attrs=strip_default_attrs)
    857         save_variables = False
    858 

/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/estimator.py in _add_meta_graph_for_mode(self, builder, input_receiver_fn_map, checkpoint_path, save_variables, mode, export_tags, check_variables, strip_default_attrs)
    927           labels=getattr(input_receiver, 'labels', None),
    928           mode=mode,
--> 929           config=self.config)
    930 
    931       export_outputs = export_lib.export_outputs_for_mode(

/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/estimator.py in _call_model_fn(self, features, labels, mode, config)
   1144 
   1145     logging.info('Calling model_fn.')
-> 1146     model_fn_results = self._model_fn(features=features, **kwargs)
   1147     logging.info('Done calling model_fn.')
   1148 

<ipython-input-17-119a3167bf33> in model_fn(features, labels, mode, params)
      5     """The `model_fn` for TPUEstimator."""
      6 
----> 7     input_ids = features["input_ids"]
      8     input_mask = features["input_mask"]
      9     segment_ids = features["segment_ids"]

KeyError: 'input_ids'

The create_model function in the notebook takes several arguments; these are the features that get passed to the model. The model_fn looks up features named input_ids, input_mask, segment_ids and label_ids, so a feature_spec containing only the key 'x' fails with KeyError: 'input_ids'.
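For reference, the three sequence-level features the BERT model_fn reads can be built from token ids in plain Python. This is a minimal sketch, assuming a tokenizer has already produced the token ids; the helper name `make_bert_features` is hypothetical:

```python
def make_bert_features(token_ids, max_seq_length):
    """Pad/truncate token ids and derive the mask and segment ids
    expected under the keys input_ids / input_mask / segment_ids.
    Illustrative helper, not part of the BERT sample code."""
    ids = token_ids[:max_seq_length]
    input_mask = [1] * len(ids)                # 1 = real token, 0 = padding
    padding = [0] * (max_seq_length - len(ids))
    return {
        "input_ids": ids + padding,
        "input_mask": input_mask + padding,
        "segment_ids": [0] * max_seq_length,   # all zeros for a single sentence
    }

# Example: ids for "[CLS] hello [SEP]" padded to length 8
features = make_bert_features([101, 7592, 102], max_seq_length=8)
```

Each list has length MAX_SEQ_LENGTH, matching the FixedLenFeature shapes used in the serving function below.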

After updating the serving_input_fn function to the following, the serving function works properly.

Updated Code

def serving_input_fn():
  feature_spec = {
      "input_ids" : tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
      "input_mask" : tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
      "segment_ids" : tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
      "label_ids" :  tf.FixedLenFeature([], tf.int64)

  }
  serialized_tf_example = tf.placeholder(dtype=tf.string, 
                                         shape=[None],
                                         name='input_example_tensor')
  receiver_tensors = {'example': serialized_tf_example}
  features = tf.parse_example(serialized_tf_example, feature_spec)
  return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)
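With this receiver, a client has to send a serialized tf.train.Example whose features match the feature_spec, already padded to MAX_SEQ_LENGTH. A minimal sketch of building that payload (the helper name and the sample ids are illustrative, not from the notebook):

```python
import tensorflow as tf

def serialize_example(input_ids, input_mask, segment_ids, label_id=0):
    """Build a serialized tf.train.Example matching the feature_spec above.
    The three sequence features must already have length MAX_SEQ_LENGTH."""
    def int64_feature(values):
        return tf.train.Feature(int64_list=tf.train.Int64List(value=list(values)))

    example = tf.train.Example(features=tf.train.Features(feature={
        "input_ids": int64_feature(input_ids),
        "input_mask": int64_feature(input_mask),
        "segment_ids": int64_feature(segment_ids),
        "label_ids": int64_feature([label_id]),  # scalar FixedLenFeature
    }))
    return example.SerializeToString()
```

The returned bytes go into the 'example' receiver tensor, for instance through the predictor returned by tf.contrib.predictor.from_saved_model in TF 1.x.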
