
spark1.6.2 with scala2.10.6 No TypeTag available

I'm trying to run the KMeans example from here.

This is my code:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{DataFrame, SQLContext}
import org.apache.spark.ml.clustering.KMeans
import org.apache.spark.mllib.linalg.Vectors

def main(args: Array[String]): Unit = {
  val conf = new SparkConf().setAppName(this.getClass.getName).setMaster("local[10]")
  val sc = new SparkContext(conf)
  val sqlContext = new SQLContext(sc)

  // Creates a DataFrame
  val dataset: DataFrame = sqlContext.createDataFrame(Seq(
    (1, Vectors.dense(0.0, 0.0, 0.0)),
    (2, Vectors.dense(0.1, 0.1, 0.1)),
    (3, Vectors.dense(0.2, 0.2, 0.2)),
    (4, Vectors.dense(9.0, 9.0, 9.0)),
    (5, Vectors.dense(9.1, 9.1, 9.1)),
    (6, Vectors.dense(9.2, 9.2, 9.2))
  )).toDF("id", "features")

  // Trains a k-means model
  val kmeans = new KMeans()
    .setK(2)
    .setFeaturesCol("features")
    .setPredictionCol("prediction")
  val model = kmeans.fit(dataset)

  // Shows the result
  println("Final Centers: ")
  model.clusterCenters.foreach(println)
}

The error follows:

Information: 2016/9/19 3:36 PM - Compilation completed with 1 error and 0 warnings in 2s 454ms
D:\IdeaProjects\de\src\main\scala\com.te\KMeansExample.scala
    Error: line (18) No TypeTag available for (Int, org.apache.spark.mllib.linalg.Vector)
    val dataset: DataFrame = sqlContext.createDataFrame(Seq(

Some details:

1. When I run this with Spark 1.6.2 and Scala 2.10.6, compilation fails with the error above. But when I change the Scala version to 2.11.0, it runs OK.

2. I run this code in Hue, which submits the job to my cluster via Livy; the cluster is built with Spark 1.6.2 and Scala 2.10.6.

Can anybody help me? Thanks.


I am not very sure about the cause of this problem, but I think it is because Scala reflection in older versions of Scala was not able to work out the TypeTag of a not-yet-inferred function parameter.

In this case,

val dataset: DataFrame = sqlContext.createDataFrame(Seq(
  (1, Vectors.dense(0.0, 0.0, 0.0)),
  (2, Vectors.dense(0.1, 0.1, 0.1)),
  (3, Vectors.dense(0.2, 0.2, 0.2)),
  (4, Vectors.dense(9.0, 9.0, 9.0)),
  (5, Vectors.dense(9.1, 9.1, 9.1)),
  (6, Vectors.dense(9.2, 9.2, 9.2))
)).toDF("id", "features")

The parameter Seq((1, Vectors.dense(0.0, 0.0, 0.0)), ...) is being seen by Scala for the first time, so its type has not yet been inferred, and Scala reflection therefore cannot work out the associated TypeTag.

So my guess is that if you just move that expression out to a val, letting Scala infer its type first, it will work:

val vectorSeq = Seq(
  (1, Vectors.dense(0.0, 0.0, 0.0)),
  (2, Vectors.dense(0.1, 0.1, 0.1)),
  (3, Vectors.dense(0.2, 0.2, 0.2)),
  (4, Vectors.dense(9.0, 9.0, 9.0)),
  (5, Vectors.dense(9.1, 9.1, 9.1)),
  (6, Vectors.dense(9.2, 9.2, 9.2))
)

val dataset: DataFrame = sqlContext.createDataFrame(vectorSeq).toDF("id", "features")
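The TypeTag mechanics behind createDataFrame can be seen in plain Scala, without Spark at all. The describe helper below is hypothetical; it just stands in for any method that, like createDataFrame, takes an implicit TypeTag for the element type of the Seq it receives. Once the Seq is bound to a val, the compiler knows the element type and can materialize the tag at the call site:

```scala
import scala.reflect.runtime.universe._

object TypeTagDemo {
  // Hypothetical helper: like createDataFrame, it needs an implicit
  // TypeTag[A] so it can inspect the element type at runtime.
  def describe[A: TypeTag](data: Seq[A]): String = typeOf[A].toString

  def main(args: Array[String]): Unit = {
    // Binding the Seq to a val first lets the compiler fully infer its
    // element type, so TypeTag[(Int, Double)] can be materialized here.
    val rows = Seq((1, 0.0), (2, 0.1))
    println(describe(rows))
  }
}
```

In newer Scala versions the compiler can also derive the tag directly from the inline argument, which matches the observation above that the original code compiles under Scala 2.11.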
