How to load a model with tf.saved_model and call the prediction function [TensorFlow 2.0 API]

I'm very new to TensorFlow, especially version 2.0, and there aren't many examples of this API around, but it seems much more convenient than 1.x. So far I've managed to train a linear model with the tf.estimator API, and then to save it with tf.estimator.exporter.

After that, I wanted to load this model with the tf.saved_model API. I think I managed to do it, but I have some doubts about my process, so here is a quick look at my code:

So I have an array of feature columns created with the tf.feature_column API, which looks like this:

feature_columns =
[NumericColumn(key='geoaccuracy', shape=(1,), default_value=None, dtype=tf.float32, normalizer_fn=None),
 NumericColumn(key='longitude', ...),
 NumericColumn(key='latitude', ...),
 NumericColumn(key='bidfloor', ...),
 VocabularyListCategoricalColumn(key='adid', vocabulary_list=('115','124','139','122','121','146','113','103','123','104','147','114','149','148'), dtype=tf.string, default_value=-1, num_oov_buckets=0),
 VocabularyListCategoricalColumn(key='campaignid', vocabulary_list=('36','31','33','28'), ...),
 VocabularyListCategoricalColumn(key='exchangeid', vocabulary_list=('1241','823','1240','1238'), ...),
 ...]
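
For context, a rough sketch of how columns like these are typically built with the tf.feature_column API (the column names come from the repr above; the vocabularies are abbreviated and the remaining columns elided):

import tensorflow as tf

feature_columns = [
    tf.feature_column.numeric_column('geoaccuracy'),
    tf.feature_column.numeric_column('longitude'),
    tf.feature_column.numeric_column('latitude'),
    tf.feature_column.numeric_column('bidfloor'),
    tf.feature_column.categorical_column_with_vocabulary_list(
        'adid', ['115', '124', '139', '122']),  # abbreviated vocabulary
    tf.feature_column.categorical_column_with_vocabulary_list(
        'campaignid', ['36', '31', '33', '28']),
    tf.feature_column.categorical_column_with_vocabulary_list(
        'exchangeid', ['1241', '823', '1240', '1238']),
    # ... remaining columns ...
]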

After that, I define an estimator this way, using the array of feature columns, and train it. Up to this point, no problem.

linear_est = tf.estimator.LinearClassifier(feature_columns=feature_columns)
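
Training might then look roughly like this. This is only a sketch: train_df and the 'label' column name are hypothetical stand-ins for the real training DataFrame.

import tensorflow as tf

def make_input_fn(df, label_col='label', epochs=10, batch_size=32):
    # df is assumed to be a pandas DataFrame holding the raw features plus the label.
    def input_fn():
        features = dict(df.drop(columns=[label_col]))
        labels = df[label_col]
        ds = tf.data.Dataset.from_tensor_slices((features, labels))
        return ds.shuffle(1000).repeat(epochs).batch(batch_size)
    return input_fn

linear_est.train(make_input_fn(train_df))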

After training the model I wanted to save it, and this is where the doubts start. Here is how I did it, but I'm not sure it's the right way:

serving_input_parse = tf.feature_column.make_parse_example_spec(feature_columns=feature_columns)

""" view of the variable : serving_input_parse = 
 {'adid': VarLenFeature(dtype=tf.string),'at': VarLenFeature(dtype=tf.string),'basegenres': VarLenFeature(dtype=tf.string),'bestkw': VarLenFeature(dtype=tf.string),'besttopic': VarLenFeature(dtype=tf.string),'bidfloor': FixedLenFeature(shape=(1,default_value=None),'browserid': VarLenFeature(dtype=tf.string),'browserlanguage': VarLenFeature(dtype=tf.string)
 ...} """

# exporting the model:
linear_est.export_saved_model(
    export_dir_base='./saved',
    serving_input_receiver_fn=tf.estimator.export.build_parsing_serving_input_receiver_fn(serving_input_parse),
    as_text=True)

Now I'm trying to load it, and I don't know how to call the loaded model on raw data, for example from a pandas DataFrame.

loaded = tf.saved_model.load('saved/1573144361/')
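
As a quick check (just a sketch), the signatures attached to the loaded object can also be listed directly from Python:

print(list(loaded.signatures.keys()))
# e.g. ['classification', 'predict', 'regression', 'serving_default']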

One more thing: I tried to look at the model's signatures, but I really can't make sense of what the input shapes mean.

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['classification']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['inputs'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: input_example_tensor:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['classes'] tensor_info:
        dtype: DT_STRING
        shape: (-1,2)
        name: head/Tile:0
    outputs['scores'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1,2)
        name: head/predictions/probabilities:0
  Method name is: tensorflow/serving/classify

signature_def['predict']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['examples'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: input_example_tensor:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['all_class_ids'] tensor_info:
        dtype: DT_INT32
        shape: (-1,2)
        name: head/predictions/Tile:0
    outputs['all_classes'] tensor_info:
        dtype: DT_STRING
        shape: (-1,2)
        name: head/predictions/Tile_1:0
    outputs['class_ids'] tensor_info:
        dtype: DT_INT64
        shape: (-1,1)
        name: head/predictions/ExpandDims:0
    outputs['classes'] tensor_info:
        dtype: DT_STRING
        shape: (-1,1)
        name: head/predictions/str_classes:0
    outputs['logistic'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1,1)
        name: head/predictions/logistic:0
    outputs['logits'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1,1)
        name: linear/linear_model/linear/linear_model/linear/linear_model/weighted_sum:0
    outputs['probabilities'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1,2)
        name: head/predictions/probabilities:0
  Method name is: tensorflow/serving/predict

signature_def['regression']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['inputs'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: input_example_tensor:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['outputs'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1,1)
        name: head/predictions/logistic:0
  Method name is: tensorflow/serving/regress

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['inputs'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: input_example_tensor:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['classes'] tensor_info:
        dtype: DT_STRING
        shape: (-1,2)
        name: head/predictions/probabilities:0
  Method name is: tensorflow/serving/classify
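
Given the 'predict' signature above (a single DT_STRING examples input holding serialized tf.train.Example protos, which is what build_parsing_serving_input_receiver_fn produces), a call from a pandas DataFrame might look roughly like the sketch below. The DataFrame df and the row_to_example helper are hypothetical, and the dtype handling is deliberately simplified:

import tensorflow as tf

def row_to_example(row):
    # Hypothetical helper: wrap one DataFrame row as a serialized tf.train.Example.
    feats = {}
    for name, value in row.items():
        if isinstance(value, str):
            feats[name] = tf.train.Feature(bytes_list=tf.train.BytesList(value=[value.encode()]))
        else:
            feats[name] = tf.train.Feature(float_list=tf.train.FloatList(value=[float(value)]))
    return tf.train.Example(features=tf.train.Features(feature=feats)).SerializeToString()

loaded = tf.saved_model.load('saved/1573144361/')
predict_fn = loaded.signatures['predict']
serialized = tf.constant([row_to_example(row) for _, row in df.iterrows()])
outputs = predict_fn(examples=serialized)
print(outputs['probabilities'].numpy())
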
Answer from nielovezxy:

The saved_model.load(...) documentation demonstrates the basic mechanics like this:

imported = tf.saved_model.load(path)
f = imported.signatures["serving_default"]
print(f(x=tf.constant([[1.]])))

I'm still new to this myself, but serving_default seems to be the default signature when using saved_model.save(...).

(My understanding is that saved_model.save(...) doesn't save the model so much as the graph. To interpret the graph, you need to explicitly store 'signatures' defining operations on it. If you don't do this explicitly, 'serving_default' will be your only signature.)
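
To illustrate that point, here is a minimal, self-contained sketch using a toy tf.Module (not the linear model from the question) that attaches explicit signatures when saving:

import tensorflow as tf

class Adder(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None, 1], tf.float32)])
    def add_one(self, x):
        return x + 1.0

model = Adder()
# Without the `signatures` argument, only 'serving_default' gets exported.
tf.saved_model.save(model, '/tmp/adder',
                    signatures={'serving_default': model.add_one,
                                'add_one': model.add_one})

reloaded = tf.saved_model.load('/tmp/adder')
print(reloaded.signatures['add_one'](tf.constant([[2.0]])))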

I've provided an implementation below. A few details are worth noting:

  1. The input needs to be a tensor, so I had to convert it manually.
  2. The output is a dictionary. The documentation describes it as "a trackable object with a signatures attribute mapping from signature keys to functions."

In my case the dictionary key was a fairly arbitrary 'dense_83'. That seemed a bit... specific, so I generalized the solution to ignore the key by using an iterator:

import tensorflow as tf

def predict(signature_collection, input_data):
    # The signature function expects a tensor, so convert the raw input first.
    input_data = tf.constant(input_data, dtype=tf.float32)
    prediction_tensors = signature_collection.signatures["serving_default"](input_data)
    # The result is a dict keyed by a fairly arbitrary layer name; ignore the key.
    for _, values in prediction_tensors.items():
        predictions = values.numpy()[0]
        return predictions
    raise Exception("Expected a response from predict(...).")
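
A call might then look like this. The path and the single float feature are hypothetical, matching the float-input model this answer assumes rather than the estimator export from the question:

loaded = tf.saved_model.load('/path/to/saved_model')  # hypothetical path
print(predict(loaded, [[0.5]]))
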
Another answer:

It looks like you used the saved_model_cli command-line tool for the last part of your output. From it you get a 'predict' signature that shows the input types, columns, and so on. When I do this, I see all of my input columns. In your case it shows only a single input, a string named examples, which doesn't look right.

Here is an excerpt of the output of $ saved_model_cli show --dir /somedir/export/exporter/123456789 --all. In the output, the dots mark lines that were removed because they looked similar.

signature_def['predict']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['feature_num_1'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1)
        name: Placeholder_29:0
...
...
 The given SavedModel SignatureDef contains the following output(s):
    outputs['all_class_ids'] tensor_info:
        dtype: DT_INT32
        shape: (-1,2)
        name: dnn/head/predictions/Tile:0
    outputs['all_classes'] tensor_info:
        dtype: DT_STRING
        shape: (-1,2)
        name: dnn/head/predictions/Tile_1:0
...
...
  Method name is: tensorflow/serving/predict
