

Get Python-generated data with an Android Studio application

I have a Python script that runs a linear regression model:

import pandas
import numpy
from sklearn import linear_model
from sklearn.model_selection import train_test_split

#Get Data from data set
data = pandas.read_csv("dataset.csv", sep=";")
data = data[["Red", "Green", "Blue", "T"]]

#Define Variables
x = numpy.array(data.drop(columns=["T"]))  # Features: Red, Green, Blue
y = numpy.array(data["T"])  # Correct values


#Set training sets
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2)
linear = linear_model.LinearRegression()
linear.fit(x_train, y_train)

#Sample values to predict (one row per prediction: Red, Green, Blue)
a = [[145, 131, 125], [0, 0, 0]]

#Predictions
p = linear.predict(a)

The linear.predict() function returns exactly the value I need. But I need to be able to access this function and the model from an Android Studio application.

Any suggestions?

Assuming your model is a Keras model named model, you can export it to TensorFlow Lite like this:

import tensorflow as tf
from tensorflow.keras import models

# Save the Keras model to an .h5 file, then convert it to TensorFlow Lite.
keras_file = 'cf.h5'
models.save_model(model, keras_file)
converter = tf.compat.v1.lite.TFLiteConverter.from_keras_model_file(keras_file)
tfmodel = converter.convert()
open("degree.tflite", "wb").write(tfmodel)

In Android Studio, once you have placed the exported model file degree.tflite in your project's assets folder, you can run a prediction on sample input features by calling the doInference method shown below.

Here is a full tutorial on this.

import android.content.res.AssetFileDescriptor;

import org.tensorflow.lite.Interpreter;

import java.io.FileInputStream;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

// Memory-map the model file stored in the app's assets folder.
private MappedByteBuffer loadModelFile() {
    try {
        AssetFileDescriptor fileDescriptor = this.getAssets().openFd("degree.tflite");
        FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
        FileChannel fileChannel = inputStream.getChannel();
        long startOffset = fileDescriptor.getStartOffset();
        long declaredLength = fileDescriptor.getDeclaredLength();
        return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
    } catch (IOException e) {
        e.printStackTrace();
    }
    return null;
}

// Run a single prediction through the TFLite interpreter.
private float[] doInference(float[] input) {
    // Output buffer shaped to match the model's output tensor (a single value here).
    float[][] output = new float[1][1];
    try {
        Interpreter tflite = new Interpreter(loadModelFile());
        // The model expects a batch dimension, so wrap the feature vector.
        tflite.run(new float[][]{input}, output);
        tflite.close();
    } catch (Exception ex) {
        ex.printStackTrace();
    }
    return output[0];
}
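For reference, a hypothetical call site for doInference, assuming the model was exported with the three colour values as its only inputs; the sample values and log tag are placeholders:

import android.util.Log;

// Hypothetical call site, e.g. inside onCreate() or a button click listener.
float[] rgb = {145f, 131f, 125f};        // Red, Green, Blue
float[] prediction = doInference(rgb);
Log.d("TFLite", "Predicted T = " + prediction[0]);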
