
Error when executing Apache Spark ML pipeline

We are using Apache Spark 1.6, Scala 2.10.5, and sbt 0.13.9.

While executing a simple pipeline:

import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.feature.{HashingTF, Tokenizer}

def buildPipeline(): Pipeline = {
    // Split the raw "Summary" text into words
    val tokenizer = new Tokenizer()
    tokenizer.setInputCol("Summary")
    tokenizer.setOutputCol("LemmatizedWords")

    // Hash the words into a fixed-size term-frequency feature vector
    val hashingTF = new HashingTF()
    hashingTF.setInputCol(tokenizer.getOutputCol)
    hashingTF.setOutputCol("RawFeatures")

    val pipeline = new Pipeline()
    pipeline.setStages(Array(tokenizer, hashingTF))
    pipeline
}
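For context on the HashingTF stage above: it hashes each word into one of a fixed number of buckets and counts occurrences. A minimal plain-Scala sketch of that idea (a hypothetical helper for illustration only, not Spark's actual implementation, which uses a different hash function and returns an mllib Vector):

```scala
// Sketch of the feature-hashing idea behind HashingTF (hypothetical helper,
// not Spark's implementation): hash each token into one of `numFeatures`
// buckets and accumulate term counts as a sparse index -> count map.
def hashingTermFrequencies(tokens: Seq[String], numFeatures: Int = 1 << 20): Map[Int, Double] =
  tokens.foldLeft(Map.empty[Int, Double]) { (acc, token) =>
    // Force a non-negative bucket index (hashCode can be negative)
    val idx = ((token.hashCode % numFeatures) + numFeatures) % numFeatures
    acc.updated(idx, acc.getOrElse(idx, 0.0) + 1.0)
  }
```

Collisions are possible (two distinct words can land in the same bucket), which is the usual trade-off for keeping the feature space a fixed size.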

When executing the ML pipeline's fit method, we get the error below. Any comments about what might be going on would be helpful.

**java.lang.RuntimeException: error reading Scala signature of org.apache.spark.mllib.linalg.Vector: value linalg is not a package**

[error] org.apache.spark.ml.feature.HashingTF$$typecreator1$1.apply(HashingTF.scala:66)
[error] org.apache.spark.sql.catalyst.ScalaReflection$class.localTypeOf(ScalaReflection.scala:642)

[error] org.apache.spark.sql.catalyst.ScalaReflection$.localTypeOf(ScalaReflection.scala:30)
[error] org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:630)
[error] org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:30)
[error] org.apache.spark.sql.functions$.udf(functions.scala:2576)
[error] org.apache.spark.ml.feature.HashingTF.transform(HashingTF.scala:66)
[error] org.apache.spark.ml.PipelineModel$$anonfun$transform$1.apply(Pipeline.scala:297)
[error] org.apache.spark.ml.PipelineModel$$anonfun$transform$1.apply(Pipeline.scala:297)
[error] org.apache.spark.ml.PipelineModel.transform(Pipeline.scala:297)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)

build.sbt

scalaVersion in ThisBuild := "2.10.5"
scalacOptions := Seq("-unchecked", "-deprecation", "-encoding", "utf8")  

val sparkV = "1.6.0"
val sprayV = "1.3.2"
val specs2V = "2.3.11"
val slf4jV = "1.7.5"
val grizzledslf4jV = "1.0.2"
val akkaV = "2.3.14"

libraryDependencies in ThisBuild ++= { Seq(
  ("org.apache.spark" %% "spark-mllib" % sparkV) % Provided,  
  ("org.apache.spark" %% "spark-core" % sparkV) % Provided, 
  "com.typesafe.akka" %% "akka-actor" % akkaV,
  "io.spray" %% "spray-can" % sprayV,
  "io.spray" %% "spray-routing" % sprayV,
  "io.spray" %% "spray-json" % sprayV, 
  "io.spray" %% "spray-testkit" % "1.3.1" % "test", 
  "org.specs2" %% "specs2-core" % specs2V % "test",
  "org.specs2" %% "specs2-mock" % specs2V % "test",
  "org.specs2" %% "specs2-junit" % specs2V % "test",
  "org.slf4j" % "slf4j-api" % slf4jV,
  "org.clapper" %% "grizzled-slf4j" % grizzledslf4jV
) }

You should try using

org.apache.spark.ml.linalg.Vector and

org.apache.spark.ml.linalg.Vectors

instead of what you are using now, that is,

org.apache.spark.mllib.linalg.Vectors

Hope this solves your problem.
