How do I import a singleton object from a Scala package in Java?
I am trying to use the ARIMA object (Scala), imported from a package, in my Java program. Although compilation succeeds, meaning the ARIMA class is recognized at compile time, a NoClassDefFoundError is thrown for the ARIMA object at runtime. The ARIMAModel class imports without any problem, since it is a class.
Is there any way to use the Scala object from my Java program?
Here is the source code for the object in the Scala package.
File: .../com/cloudera/sparkts/models/ARIMA.scala
package com.cloudera.sparkts.models

object ARIMA {
  def autoFit(ts: Vector, maxP: Int = 5, maxD: Int = 2, maxQ: Int = 5): ARIMAModel = {
    ...
  }
}

class ARIMAModel(...) {
  ...
}
Here is my Java code.
File: src/main/java/SingleSeriesARIMA.java
import com.cloudera.sparkts.models.ARIMA;
import com.cloudera.sparkts.models.ARIMAModel;

public class SingleSeriesARIMA {
    public static void main(String[] args) {
        ...
        ARIMAModel arimaModel = ARIMA.autoFit(tsVector, 1, 0, 1);
        ...
    }
}
Here is the error.
Exception in thread "main" java.lang.NoClassDefFoundError: com/cloudera/sparkts/models/ARIMA
at SingleSeriesARIMA.main(SingleSeriesARIMA.java:43)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.cloudera.sparkts.models.ARIMA
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more
I am using Scala version 2.11.8 and Java 1.8.
You need to supply the dependency containing the ARIMA object to the Spark cluster using the --jars option, as below:

spark-submit --jars <path>/<to>/sparkts-0.4.1.jar --class SingleSeriesARIMA target/simple-project-1.0.jar

This passes the extra dependency along with the application jar, so that it is available at Spark runtime.

To call the ARIMA object from Java, use:

ARIMA$.MODULE$.autoFit(tsVector, 1, 0, 1);
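For context, here is a minimal sketch of why the ARIMA$.MODULE$ form exists. The Scala compiler turns every top-level object into a JVM class named <Object>$ that holds its singleton instance in a static MODULE$ field; that field is what Java code reaches for. The Greeter object below is a hypothetical stand-in for ARIMA, not part of spark-ts:

```scala
// Hypothetical stand-in for the ARIMA object; names are illustrative only.
object Greeter {
  // Default arguments (like maxP = 5 in autoFit) exist only at the Scala
  // level; a Java caller cannot use them and must pass every argument.
  def greet(name: String, punct: String = "!"): String =
    s"Hello, $name$punct"
}

// On the JVM this compiles to a class `Greeter$` whose static `MODULE$`
// field holds the singleton, so a Java caller would write:
//   Greeter$.MODULE$.greet("world", "!");
// exactly as the answer above calls ARIMA$.MODULE$.autoFit(tsVector, 1, 0, 1).
```

This also explains the original error: the compile-time classpath contained the spark-ts classes, but the ARIMA$ class was not on the cluster's runtime classpath until the jar was shipped with --jars.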