Tensorflow.lite model produces wrong (different) results in Android app?
I've made an image classification model and converted it to the tflite format. Then I verified the tflite model in Python using tf.lite.Interpreter: it produces the same results for my test image as the original model. Here's a colab link to verify.
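The Python-side check can be sketched like this (a minimal sketch, assuming placeholder file names `model.tflite` and `test_image.JPG`; this is not the exact colab code):

```python
import numpy as np
import tensorflow as tf

def run_tflite(model_path: str, image_path: str, top_k: int = 9):
    """Run a tflite classifier on one image and return the top-k class indices."""
    interpreter = tf.lite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]

    # Decode and resize the image to the model's expected input shape.
    height, width = inp["shape"][1], inp["shape"][2]
    img = tf.io.decode_jpeg(tf.io.read_file(image_path), channels=3)
    img = tf.image.resize(img, (height, width))
    img = tf.cast(img, inp["dtype"])[tf.newaxis, ...]

    interpreter.set_tensor(inp["index"], img.numpy())
    interpreter.invoke()
    scores = interpreter.get_tensor(out["index"])[0]
    return np.argsort(scores)[::-1][:top_k]
```

If the indices returned here match the original model's predictions, the conversion itself is fine and any mismatch must come from the Android side.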
Then I embedded it into a sample Android app, using Android Studio ML Model Binding and the exact example code generated by Android Studio. Here's the main activity code; you can also use this link to navigate to the full Android project.
val assetManager = this.assets
val istr = assetManager.open("test_image.JPG") // The same test image used in Python
val b = BitmapFactory.decodeStream(istr)
val model = Model2.newInstance(this) // Model wrapper generated by Android Studio

// Creates input for inference.
val image = TensorImage.fromBitmap(b)

// Runs model inference and gets the result.
val outputs = model.process(image)
val probability = outputs.probabilityAsCategoryList

// Sort categories by confidence and display the top 9.
probability.sortByDescending { it.score }
val top9 = probability.take(9)
this.findViewById<TextView>(R.id.results_text).text = top9.toString()
And then I'm getting completely different results on Android for the same model and the same input image.
Here are the results matching my initial model in Python:
Here are the wrong results I'm getting in the Android app:
Links to the model and the test image are in both examples, but I'll post them into the question once again:
I guess it has something to do with the input/output formats of the model. Or the image is interpreted differently in Python and on Android. Or the metadata I added to the model is somehow wrong. Anyway, I've tried everything to localize the issue and now I'm stuck.
How do I fix my model or my Android code so that it produces the same results as my Python code?
I've managed to find and fix the issue: my model from this tutorial included a built-in image normalization layer. Image normalization transforms standard 0-255 image color values into 0.0-1.0 float values suitable for machine learning.
But the metadata I used for the tflite model included two parameters for external normalization: mean and std, with the formula for each value being: normalized_value = (value - mean) / std. Since my model handles its own normalization, I need to turn off external normalization by setting mean = 0 and std = 1. This way I get normalized_value = value.
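The double normalization is easy to see in a minimal sketch (plain Python, with the typical mean = std = 127.5 values picked only for illustration):

```python
def external_normalize(value, mean, std):
    # Normalization applied outside the model, driven by the tflite metadata.
    return (value - mean) / std

def model_internal_normalize(value):
    # The model's own built-in layer: maps 0-255 to 0.0-1.0.
    return value / 255.0

pixel = 255.0

# With non-trivial metadata (e.g. mean=127.5, std=127.5) the pixel is
# normalized twice, so the model sees a wrong value:
twice = model_internal_normalize(external_normalize(pixel, 127.5, 127.5))
print(twice)  # 0.0039..., but the model expects 1.0 for a white pixel

# With mean=0 and std=1 the external step is the identity:
once = model_internal_normalize(external_normalize(pixel, 0.0, 1.0))
print(once)  # 1.0
```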
So, setting the tflite metadata parameters to these:
image_min=0,
image_max=255.0,
mean=[0.0],
std=[1.0]
fixed the double normalization issue, and my model now produces correct results in the Android app.