
Tensorflow Lite on Nvidia Jetson

Has anyone used Tensorflow Lite on any Nvidia Jetson product? I want to use my Jetson Nano for inference and would like to do so with tf-lite, utilizing the GPU.

Confusingly, there does not seem to be a Python API for creating a GPU delegate in tf-lite.

Is there a clear reason for this?

Is the alternative to use the full Tensorflow library (I would prefer not to use the Nvidia TensorRT engine)?
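For context on the Python side: the tflite_runtime package does expose a load_delegate() helper for loading a delegate that has been compiled as a shared library, but there is no built-in GPU delegate constructor. A minimal sketch, assuming you had built the GPU delegate as a .so yourself (the library name and model path below are placeholders, nothing that ships for Jetson):

# Sketch only: assumes a self-built TF Lite GPU delegate shared library;
# 'libtensorflowlite_gpu_delegate.so' and 'model.tflite' are placeholders.
import tflite_runtime.interpreter as tflite

gpu_delegate = tflite.load_delegate('libtensorflowlite_gpu_delegate.so')
interpreter = tflite.Interpreter(
    model_path='model.tflite',
    experimental_delegates=[gpu_delegate])
interpreter.allocate_tensors()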

Yes, I have tried to use TF Lite on the Jetson Nano before.

You can refer to my previous article on Medium (PS: I am sorry that the article was written in Chinese).

The article covers how to set up the TF Lite environment on the Jetson Nano.

Notice:

You should change the following command according to your own environment.

pip3 install https://dl.google.com/coral/python/tflite_runtime-2.1.0.post1-cp36-cp36m-linux_aarch64.whl

Setting up TF lite on Jetson Nano
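Once the wheel is installed, a quick way to check that the runtime works is a small CPU inference script like the one below ('model.tflite' is a placeholder for your own converted model):

# Minimal check of the tflite_runtime wheel installed above.
# 'model.tflite' is a placeholder for your own converted model.
import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a random tensor of the expected shape and dtype, then run inference.
dummy = np.random.random_sample(input_details[0]['shape']).astype(input_details[0]['dtype'])
interpreter.set_tensor(input_details[0]['index'], dummy)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]['index']))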

In case you are interested in running inference with C++, you can compile TFLite 2.4.1 on your Jetson device like I did on the Xavier NX:

$ sudo apt-get install cmake curl
$ wget -O tensorflow.zip https://github.com/tensorflow/tensorflow/archive/v2.4.1.zip
$ unzip tensorflow.zip
$ mv tensorflow-2.4.1 tensorflow
$ cd tensorflow
$ ./tensorflow/lite/tools/make/download_dependencies.sh
$ ./tensorflow/lite/tools/make/build_aarch64_lib.sh

After that you will also have to install the TF Lite FlatBuffers library like this:

$ cd ./tensorflow/tensorflow/lite/tools/make/downloads/flatbuffers
$ mkdir build && cd build
$ cmake ..
$ make -j
$ sudo make install
$ sudo ldconfig

After that you will find the static library at tensorflow/tensorflow/lite/tools/make/gen/linux_aarch64/libtensorflow-lite.a

You can build your inference application against it like this (include and linker paths may need adjusting for your setup):

g++ main.cpp -L tensorflow/tensorflow/lite/tools/make/gen/linux_aarch64 -ltensorflow-lite -ledgetpu -lpthread -ldl -o main

You will also need to install libedgetpu.so as shown on Coral.ai.

Best, Alexander
