
How to do prediction with Sklearn Model inside Spark?

I have trained a model in Python using sklearn. How can we load the same model in Spark and generate predictions on a Spark RDD?

Well,

I will show an example of linear regression in sklearn and how to use the trained model to predict the elements of a Spark RDD.

First, train the model with sklearn:

from sklearn import linear_model

# Create the linear regression object
regr = linear_model.LinearRegression()

# Train the model using the training sets
regr.fit(diabetes_X_train, diabetes_y_train)
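
The variables diabetes_X_train and diabetes_y_train are not defined in the snippet above; here is a minimal sketch of how they could be prepared, assuming the standard sklearn diabetes dataset with a single feature, as in the sklearn documentation example:

import numpy as np
from sklearn import datasets

# Load the diabetes dataset and keep a single feature
diabetes_X, diabetes_y = datasets.load_diabetes(return_X_y=True)
diabetes_X = diabetes_X[:, np.newaxis, 2]

# Hold out the last 20 samples for testing
diabetes_X_train = diabetes_X[:-20]
diabetes_y_train = diabetes_y[:-20]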

At this point we only have the fitted model; next we need to predict each element of an RDD.

In this case your RDD should be an RDD of X values, like this:

rdd = sc.parallelize([1, 2, 3, 4])
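
If your model was trained on more than one feature, each RDD element should be a full feature vector rather than a single number. A minimal sketch with made-up feature values:

# Each element is one sample's feature vector (hypothetical values)
rdd = sc.parallelize([[0.05, 0.02], [0.01, -0.04], [0.03, 0.07]])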

First you need to broadcast your sklearn model so it is available on the executors:

regr_bc = sc.broadcast(regr)

Then you can use it to predict your data like this:

rdd.map(lambda x: (x, regr_bc.value.predict([[x]])[0])).collect()

So the first element of each resulting tuple is your X and the second element is the predicted Y. collect() will return something like this:

[(1, 2), (2, 4), (3, 6), ...]
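
Calling predict one element at a time works, but sklearn's predict is vectorised, so on a large RDD it is usually faster to predict a whole partition at once. Here is a sketch using mapPartitions with the same broadcast variable; predict_partition is a helper name introduced here, and it assumes the single-feature model from above:

def predict_partition(rows):
    # Materialise the partition and predict it in one vectorised call
    rows = list(rows)
    if not rows:
        return []
    preds = regr_bc.value.predict([[x] for x in rows])
    return zip(rows, preds)

rdd.mapPartitions(predict_partition).collect()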

