
Tensorflow.lite model produces wrong (different) results in Android app?

I've made an image classification model and converted it to tflite format. Then I verified the tflite model in Python using tf.lite.Interpreter: it produces the same results for my test image as the original model. Here's a colab link to verify.
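For context, the Python-side check was along these lines (a minimal sketch rather than the exact colab code; the model file name is a placeholder and a float32 input tensor is assumed):

    import numpy as np
    import tensorflow as tf
    from PIL import Image

    # Load the converted model and allocate its tensors.
    interpreter = tf.lite.Interpreter(model_path="model2.tflite")  # placeholder file name
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    # Resize the test image to the input shape the model expects (1 x H x W x 3).
    _, height, width, _ = input_details[0]["shape"]
    img = Image.open("test_image.JPG").resize((width, height))
    x = np.expand_dims(np.array(img, dtype=np.float32), axis=0)  # raw 0-255 values, no extra normalization

    # Run inference and read back the class probabilities.
    interpreter.set_tensor(input_details[0]["index"], x)
    interpreter.invoke()
    probs = interpreter.get_tensor(output_details[0]["index"])[0]
    print(np.argsort(probs)[::-1][:9])  # top-9 class indices, same as the original model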

Then I embedded it into a sample Android app, using Android Studio ML Model Binding and the exact example code from Android Studio. Here's the main activity code; you can also use this link to navigate to the full Android project.

val assetManager = this.assets
val istr = assetManager.open("test_image.JPG") // The same test image used in Python
val b = BitmapFactory.decodeStream(istr)

val model = Model2.newInstance(this) // Model wrapper generated by Android Studio ML Model Binding

// Creates the input tensor from the bitmap.
val image = TensorImage.fromBitmap(b)

// Runs model inference and gets the labeled probabilities.
val outputs = model.process(image)
val probability = outputs.probabilityAsCategoryList
probability.sortByDescending { it.score }
val top9 = probability.take(9)

this.findViewById<TextView>(R.id.results_text).text = top9.toString()

And then I'm getting completely different results on Android for the same model and the same input image.

Here are results matching my initial model in Python:

[Screenshot: correct results in Python]

Here are the wrong results I'm getting in the Android app:

[Screenshot: wrong results on Android]

Links to the model and the test image are there in both examples, but I'll post them into the question once again:

tflite model

test image

I guess it has something to do with the input/output formats of the model. Or the image is interpreted differently in Python and in Android. Or the metadata I added to the model is somehow wrong. Anyways, I've tried everything to localize the issue and now I'm stuck.
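For what it's worth, the normalization metadata that the generated Android wrapper applies can be inspected directly from the .tflite file (a sketch assuming the tflite_support package; the model file name is a placeholder):

    from tflite_support import metadata

    # Dump the metadata embedded in the model, including the NormalizationOptions
    # (mean/std) that the ML Model Binding wrapper applies before inference.
    displayer = metadata.MetadataDisplayer.with_model_file("model2.tflite")
    print(displayer.get_metadata_json())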

How do I fix my model or the Android code so it produces the same results as my Python code?

I've managed to find and fix the issue: my model from this tutorial included a built-in image normalization layer. Image normalization is when you transform standard 0-255 image color values to 0.0-1.0 float values suitable for machine learning.
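Concretely, the model rescales pixels itself with a layer along these lines (a sketch; in older TF versions the same layer lives under tf.keras.layers.experimental.preprocessing):

    import tensorflow as tf

    # The model divides 0-255 pixel values by 255 internally,
    # so it expects the raw, unnormalized image as input.
    rescale = tf.keras.layers.Rescaling(1.0 / 255)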

But the metadata I used for the tflite model included two parameters for external normalization, mean and std, where each value is transformed as normalized_value = (value - mean) / std. Since my model handles its own normalization, I need to turn off external normalization by setting mean = 0 and std = 1. This way I get normalized_value = value.

So, setting the tflite metadata parameters to these:

    image_min=0,
    image_max=255.0,
    mean=[0.0],
    std=[1.0]

fixed the double normalization issue, and my model now produces correct results in the Android app.
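For reference, with the tflite_support metadata writer, "no external normalization" looks roughly like this (a sketch, not my exact metadata script; the file names are placeholders):

    from tflite_support.metadata_writers import image_classifier
    from tflite_support.metadata_writers import writer_utils

    # mean=0 and std=1 make the Android wrapper pass raw 0-255
    # pixel values straight through to the model.
    writer = image_classifier.MetadataWriter.create_for_inference(
        writer_utils.load_file("model2.tflite"),
        input_norm_mean=[0.0],
        input_norm_std=[1.0],
        label_file_paths=["labels.txt"])
    writer_utils.save_file(writer.populate(), "model2_with_metadata.tflite")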
