
Spark Scala exception: overloaded method value foreach

  // Imports needed by the snippet (not shown in the original post):
  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.functions.{col, expr, lit, when}
  import org.apache.spark.sql.types.{StringType, StructField, StructType}

  def main(args: Array[String]): Unit = {

    val conf = new SparkConf().setAppName("Spardsl").setMaster("local")
    val sc = new SparkContext(conf)
    sc.setLogLevel("ERROR")
    val sparksession= SparkSession.builder().getOrCreate()//spark session initialization
    val struct =
      StructType(
        StructField("txnno", StringType, true) ::
          StructField("txndate", StringType, false) ::
          StructField("custno", StringType, true)::
          StructField("amount", StringType, true)::
          StructField("category", StringType, false)::
          StructField("product", StringType, true)::
          StructField("city", StringType, true)::
          StructField("state", StringType, true)::
          StructField("spendby", StringType, false):: Nil)

    val txndf = sparksession.read.format("csv").schema(struct).load("file:///D:/bigdata_tasks/txns.csv")
    println("=============Normal txndf=======================")
    txndf.show()
    println("===================with column========================")
    val withcolumnMonth=txndf.withColumn("col_check",lit("Y")).withColumn("col_month",expr("split(txndate,'-')[0]"))
    withcolumnMonth.show()
    println("===================with column========================")
    val spenbyCol= when(col("spendby")==="credit",0)
      .when(col("spendby")==="cash",1)
      .otherwise(3)

    val withcolumnspeendBy=txndf.withColumn("col_check",lit("Y"))
      .withColumn("col_month",expr("split(txndate,'-')[0]"))
      .withColumn("col_spend",spenbyCol)
    //withcolumnspeendBy.show()
    withcolumnspeendBy.foreach(println)
  }

After running it in IntelliJ, it gives the following exception:

 Error:(43, 24) overloaded method value foreach with alternatives:
   (func: org.apache.spark.api.java.function.ForeachFunction[org.apache.spark.sql.Row])Unit <and>
   (f: org.apache.spark.sql.Row => Unit)Unit
  cannot be applied to (Unit)
 withcolumnspeendBy.foreach(println)

What is the possible cause of this exception?

There are several ways to do this:

withcolumnspeendBy.collect.foreach(println)

withcolumnspeendBy.rdd.foreach(println)

withcolumnspeendBy.foreach(println(_))

each of which gives different results: collect.foreach(println) pulls every row to the driver and prints there, while rdd.foreach(println) and foreach(println(_)) run on the executors, so on a real cluster their output lands in the executor logs rather than the driver console (in local mode all three print to the same console).

println(_) is needed to pass the correct function type to foreach, which requires a (Row => Unit). Because Dataset.foreach is overloaded (it also accepts a Java ForeachFunction[Row]), the compiler has no single expected type against which to eta-expand the bare name println. println itself is overloaded with a zero-argument form, so the bare reference is interpreted as the call println(), whose result is Unit; that is why the error says foreach "cannot be applied to (Unit)". Writing println(_) (or row => println(row)) spells out the Row => Unit lambda explicitly and resolves the ambiguity.
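The overload/eta-expansion interaction can be reproduced without Spark. The sketch below uses a hypothetical FakeDataset and ForeachFunction (illustrative stand-ins, not Spark's actual classes) to mimic Dataset's two foreach overloads and show why an explicit lambda compiles where the bare method name does not:

```scala
object ForeachOverloadDemo {
  // SAM trait standing in for org.apache.spark.api.java.function.ForeachFunction
  trait ForeachFunction[T] { def call(t: T): Unit }

  // Stand-in for Dataset with the same pair of foreach overloads
  class FakeDataset[T](data: Seq[T]) {
    def foreach(f: T => Unit): Unit = data.foreach(f)
    def foreach(func: ForeachFunction[T]): Unit = data.foreach(func.call)
  }

  def run(): Seq[String] = {
    val out = scala.collection.mutable.ArrayBuffer.empty[String]
    val ds = new FakeDataset(Seq("a", "b", "c"))

    // ds.foreach(println)  // does NOT compile: with two foreach overloads there
    //                      // is no single expected function type, and the bare,
    //                      // overloaded `println` collapses to the zero-argument
    //                      // call println(), which is Unit.

    ds.foreach(println(_))                     // compiles: explicit T => Unit lambda
    ds.foreach((t: String) => { out += t; () }) // equivalent, fully explicit
    out.toSeq
  }

  def main(args: Array[String]): Unit = println(run())
}
```

The explicit lambda works because overload resolution prefers the alternative that needs no SAM conversion, so the (T => Unit) overload is chosen unambiguously.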

