
Loading MLlib models outside Spark

I'm training a model in Spark with MLlib and saving it:

import org.apache.spark.mllib.classification.SVMWithSGD

val model = SVMWithSGD.train(training, numIterations)

model.save(sc, "~/model")

but I'm having trouble loading it from a Java app without Spark to make real-time predictions.

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.mllib.classification.SVMModel;

SparkConf sconf = new SparkConf().setAppName("Application").setMaster("local");
SparkContext sc = new SparkContext(sconf);
SVMModel model = SVMModel.load(sc, "/model");

I'm getting:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkConf
    at ModelUser$.main(ModelUser.scala:11)
    at ModelUser.main(ModelUser.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkConf

Is there a way to load the model in a normal Java app?

Have a look at PMML model export here.
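For the RDD-based MLlib API used in the question, models such as SVMModel implement PMMLExportable, so a trained model can be written straight out as a PMML document. A minimal sketch in Java, assuming `model` is the trained SVMModel from the question (the output path is illustrative):

import org.apache.spark.mllib.classification.SVMModel;

public class PmmlExport {
    // SVMModel implements PMMLExportable in the RDD-based MLlib API,
    // so the trained model can be serialized as a PMML document.
    static void exportModel(SVMModel model) {
        model.toPMML("/tmp/model.pmml"); // illustrative local path
    }
}

The resulting PMML file can then be scored outside Spark with the JPMML runtime, as sketched after the next answer.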

PMML model export in Spark is not maintained anymore, and only the old RDD API supports it. I've been using jpmml-sparkml to solve the problem; the JPMML project also provides a Java runtime (jpmml-evaluator) for standalone model execution.
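To sketch the standalone side: with the exported PMML file and jpmml-evaluator on the classpath, a plain Java app can score inputs with no Spark dependency at all, which also makes the NoClassDefFoundError above moot. This follows the jpmml-evaluator 1.5.x style of API, where fields are addressed by FieldName; the file path and the placeholder feature value are assumptions for illustration:

import java.io.File;
import java.util.LinkedHashMap;
import java.util.Map;

import org.dmg.pmml.FieldName;
import org.jpmml.evaluator.Evaluator;
import org.jpmml.evaluator.FieldValue;
import org.jpmml.evaluator.InputField;
import org.jpmml.evaluator.LoadingModelEvaluatorBuilder;

public class StandaloneScorer {
    public static void main(String[] args) throws Exception {
        // Load the exported PMML document; no SparkContext is needed here.
        Evaluator evaluator = new LoadingModelEvaluatorBuilder()
                .load(new File("/tmp/model.pmml"))
                .build();
        evaluator.verify(); // self-check against embedded verification data, if present

        // Build one feature vector; field names and types come from the PMML itself.
        Map<FieldName, FieldValue> arguments = new LinkedHashMap<>();
        for (InputField inputField : evaluator.getInputFields()) {
            // 0.0 is a placeholder; supply real feature values here.
            arguments.put(inputField.getName(), inputField.prepare(0.0));
        }

        // Score the input and inspect the raw result map.
        Map<FieldName, ?> results = evaluator.evaluate(arguments);
        System.out.println(results);
    }
}

Because the evaluator reads the feature schema from the PMML document itself, the scoring app stays completely decoupled from Spark and from how the model was trained.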
