I followed this great tutorial and successfully trained a model (on CloudML). My code also makes predictions offline, but now I am trying to use Cloud ML for prediction and am running into some problems.
To deploy my model I followed this tutorial. I now have code that generates TFRecords via apache_beam.io.WriteToTFRecord (sketched below), and I want to run predictions on those TFRecords. To do this I am following this post, and my command looks like this:
gcloud ml-engine jobs submit prediction $JOB_ID --model $MODEL --input-paths gs://"$FILE_INPUT".gz --output-path gs://"$OUTPUT"/predictions --region us-west1 --data-format TF_RECORD_GZIP
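For context, the TFRecords I am feeding in are written roughly like this (a minimal sketch with an assumed feature name and placeholder input data, not my exact pipeline):

import apache_beam as beam
import tensorflow as tf

def to_serialized_example(row):
    # Pack one record into a tf.train.Example proto and serialize it
    # (the feature name "x" is an assumption).
    example = tf.train.Example(features=tf.train.Features(feature={
        "x": tf.train.Feature(float_list=tf.train.FloatList(value=row)),
    }))
    return example.SerializeToString()

with beam.Pipeline() as p:
    _ = (p
         | "ReadRows" >> beam.Create([[0.1, 0.2], [0.3, 0.4]])  # placeholder input data
         | "ToExample" >> beam.Map(to_serialized_example)
         | "WriteTFRecords" >> beam.io.WriteToTFRecord(
             "gs://my-bucket/predict/examples",   # assumed output prefix
             file_name_suffix=".tfrecord.gz"))    # .gz suffix -> gzip under the default AUTO compression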
But all I get is the error: 'Exception during running the graph: Expected serialized to be a scalar, got shape: [64]'
It seems that it expects the data in a different format. I found the format spec for JSON here, but could not find how to do this with TFRecords.
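For comparison, the newline-delimited JSON input described by that spec would, if I read it correctly, look something like this for a single serialized-Example string input (the alias example_proto comes from the signature below; the base64 payload is just a placeholder):

{"example_proto": {"b64": "<base64-encoded serialized tf.train.Example>"}}
{"example_proto": {"b64": "<base64-encoded serialized tf.train.Example>"}}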
Update: this is the output of saved_model_cli show --all --dir
MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['prediction']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['example_proto'] tensor_info:
        dtype: DT_STRING
        shape: unknown_rank
        name: input:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['probability'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, 1)
        name: probability:0
  Method name is: tensorflow/serving/predict

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['example_proto'] tensor_info:
        dtype: DT_STRING
        shape: unknown_rank
        name: input:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['probability'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, 1)
        name: probability:0
  Method name is: tensorflow/serving/predict
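For reference, an export along these lines (a TF 1.x-style sketch; the graph construction and feature spec are assumptions, not my actual code) would produce a signature like the one above. Note that tf.parse_single_example only accepts a scalar serialized string, which seems to match the 'Expected serialized to be a scalar, got shape: [64]' error when a batch of records is fed in:

import tensorflow as tf

graph = tf.Graph()
with graph.as_default():
    # Scalar string placeholder holding one serialized tf.train.Example
    # (shape=None gives the unknown_rank seen in the signature).
    serialized = tf.placeholder(tf.string, shape=None, name="input")
    # parse_single_example requires a scalar string, so a [64] batch would fail here.
    features = tf.parse_single_example(
        serialized, {"x": tf.FixedLenFeature([2], tf.float32)})  # assumed feature spec
    x = tf.reshape(features["x"], [1, 2])
    logits = tf.layers.dense(x, 1, activation=tf.nn.sigmoid)
    probability = tf.identity(logits, name="probability")  # shape (1, 1)

    with tf.Session(graph=graph) as sess:
        sess.run(tf.global_variables_initializer())
        signature = tf.saved_model.signature_def_utils.predict_signature_def(
            inputs={"example_proto": serialized},
            outputs={"probability": probability})
        builder = tf.saved_model.builder.SavedModelBuilder("export_dir")
        builder.add_meta_graph_and_variables(
            sess,
            tags=[tf.saved_model.tag_constants.SERVING],  # the 'serve' tag-set
            signature_def_map={"serving_default": signature,
                               "prediction": signature})
        builder.save()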