Speed Up Multiple Model Inference on EDGE TPU

I have retrained a ResNet50 model for re-identification on the Edge TPU. However, there seems to be no way to feed a batch of images to the Edge TPU.

I have come up with a workaround: running multiple copies of the same model, one per image.

However, is there any way to speed up inference when running multiple models? Threading is currently even slower than single-model inference.
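For context, the multi-model threading setup looks roughly like this (a sketch, assuming the tflite_runtime Edge TPU delegate; the model path and device indices are placeholders):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor
import tflite_runtime.interpreter as tflite

MODEL_PATH = 'resnet50_reid_edgetpu.tflite'  # placeholder path

def make_interpreter(device):
    # One interpreter per Edge TPU; device strings like ':0', ':1'
    # select among the attached TPUs.
    return tflite.Interpreter(
        model_path=MODEL_PATH,
        experimental_delegates=[
            tflite.load_delegate('libedgetpu.so.1', {'device': device})])

def infer(interpreter, image):
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp['index'], image[np.newaxis, ...])
    interpreter.invoke()
    return interpreter.get_tensor(out['index']).copy()

interpreters = [make_interpreter(d) for d in (':0', ':1')]
for interp in interpreters:
    interp.allocate_tensors()

images = [np.zeros((224, 224, 3), dtype=np.uint8) for _ in interpreters]
with ThreadPoolExecutor(max_workers=len(interpreters)) as pool:
    results = list(pool.map(infer, interpreters, images))
```

If only one Edge TPU is attached, both interpreters contend for the same device, which may be part of why the threaded version comes out slower than a single model.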

Yeah, the Edge TPU's architecture won't allow processing in batches. Have you tried model pipelining? https://coral.ai/docs/edgetpu/pipeline/

Unfortunately it's only available in C++ right now, but we're looking to extend it to Python in mid Q4.
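For reference, once the Python API is available, the flow should look roughly like the sketch below (assuming the pycoral pipeline module; segment file names, device indices, and input shape are placeholders). The model is first split into per-TPU segments with `edgetpu_compiler --num_segments=2`:

```python
import numpy as np
from pycoral.pipeline import pipelined_model_runner
from pycoral.utils import edgetpu

# Segments produced with: edgetpu_compiler --num_segments=2 model.tflite
# (file names are placeholders); each segment is assigned its own TPU.
segments = ['model_segment_0_of_2_edgetpu.tflite',
            'model_segment_1_of_2_edgetpu.tflite']
interpreters = [edgetpu.make_interpreter(path, device=':{}'.format(i))
                for i, path in enumerate(segments)]
for interpreter in interpreters:
    interpreter.allocate_tensors()

runner = pipelined_model_runner.PipelinedModelRunner(interpreters)
input_name = interpreters[0].get_input_details()[0]['name']

# In a real application, push and pop usually run on separate threads
# so results are drained while new inputs keep streaming in.
images = [np.zeros((224, 224, 3), dtype=np.uint8) for _ in range(4)]
for image in images:
    runner.push({input_name: image})
runner.push({})  # an empty push signals the end of the input stream

results = []
while True:
    result = runner.pop()
    if not result:
        break
    results.append(result)
```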

Because batch inference is not available now, pipelining is a secondary option. However, after experimenting with my model, another option is to make a pseudo batch by feeding the Edge TPU multiple single inputs, one at a time.
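A minimal sketch of that pseudo batch, assuming a tflite_runtime interpreter with the Edge TPU delegate (the model path and input size are placeholders):

```python
import numpy as np
import tflite_runtime.interpreter as tflite

# One Edge TPU interpreter, invoked once per image in the batch.
interpreter = tflite.Interpreter(
    model_path='resnet50_reid_edgetpu.tflite',
    experimental_delegates=[tflite.load_delegate('libedgetpu.so.1')])
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def pseudo_batch_infer(batch):
    # batch: (N, H, W, C) uint8 array. Each image is fed separately;
    # as long as the model fits the TPU's on-chip cache, the weights
    # stay resident between invocations, so repeated calls are cheap.
    outputs = []
    for image in batch:
        interpreter.set_tensor(inp['index'], image[np.newaxis, ...])
        interpreter.invoke()
        outputs.append(interpreter.get_tensor(out['index']).copy())
    return np.concatenate(outputs, axis=0)

embeddings = pseudo_batch_infer(np.zeros((8, 224, 224, 3), dtype=np.uint8))
```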

