Class org.apache.spark.sql.types.SQLUserDefinedType not found - continuing with a stub
I have a basic Spark MLlib program as follows.
import org.apache.spark.mllib.clustering.KMeans
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.mllib.linalg.Vectors
class Sample {
val conf = new SparkConf().setAppName("helloApp").setMaster("local")
val sc = new SparkContext(conf)
val data = sc.textFile("data/mllib/kmeans_data.txt")
val parsedData = data.map(s => Vectors.dense(s.split(' ').map(_.toDouble))).cache()
// Cluster the data into two classes using KMeans
val numClusters = 2
val numIterations = 20
val clusters = KMeans.train(parsedData, numClusters, numIterations)
// Export to PMML
println("PMML Model:\n" + clusters.toPMML)
}
I have manually added spark-core, spark-mllib, and spark-sql to the project classpath through IntelliJ, all at version 1.5.0.
I am getting the below error when I run the program. Any idea what's wrong?
Error:scalac: error while loading Vector, Missing dependency 'bad symbolic reference. A signature in Vector.class refers to term types in package org.apache.spark.sql which is not available. It may be completely missing from the current classpath, or the version on the classpath might be incompatible with the version used when compiling Vector.class.', required by /home/fazlann/Downloads/spark-mllib_2.10-1.5.0.jar(org/apache/spark/mllib/linalg/Vector.class)
DesirePRG, I have met the same problem as yours. The solution is to import a jar that bundles Spark and Hadoop together, such as spark-assembly-1.4.1-hadoop2.4.0.jar; with that on the classpath, it works properly.
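An alternative to adding jars by hand is to declare the Spark modules through a build tool, which keeps their versions aligned so that spark-mllib's compile-time references to `org.apache.spark.sql` resolve. A minimal `build.sbt` sketch, assuming sbt is used and taking the version numbers from the question (the artifact names are the standard Spark module IDs):

```scala
// build.sbt -- pin every Spark module to the same version so the
// classes spark-mllib was compiled against (e.g. org.apache.spark.sql)
// are present and compatible on the classpath.
scalaVersion := "2.10.5"

val sparkVersion = "1.5.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % sparkVersion,
  "org.apache.spark" %% "spark-sql"   % sparkVersion,
  "org.apache.spark" %% "spark-mllib" % sparkVersion
)
```

Letting the build tool resolve transitive dependencies avoids the "bad symbolic reference" error, which typically means one module on the classpath was compiled against a different version of another.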