
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/sql/types/StructType while it is imported

I am using the Spark SQL dependency only for its data structures, which are imported, but without passing a Spark context into the Scala main method. Why do I get this exception?

Caused by: java.lang.NoClassDefFoundError: org/apache/spark/sql/types/StructType 

Code:

package io.sekai.industry.streaming

import org.apache.spark.sql.types
import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}

class NonSparkSensorsGeneration {

  val schema = StructType(
    StructField("id", LongType, nullable = false) :: //always skip this if using the operator
      StructField("Energy Data", StringType, nullable = false, metadata = new types.MetadataBuilder().putString("mqtt", "ESP_02/Energy Data").putString("label", "EnergyData").build()) ::
      Nil)

  def readFromCSVFile(shortPath: String): Unit = {

    println("Preparing to load the csv file from jar resource " + shortPath)
    val input = getClass.getResourceAsStream(shortPath)
    println("Loaded file from resource " + shortPath)
    println("Input stream from resource " + shortPath + " details: " + input.toString)
    val source = scala.io.Source.fromInputStream(input)
    val data = source.getLines.map(_.split("\t")).toArray
    source.close
    println(data.getClass)
  }


  def main(args: Array[String]): Unit = {
    readFromCSVFile("generation")
  }
}

You may have forgotten to enable the Add dependencies with "provided" scope to classpath option for the dependency in your IDE (the flag I am referring to is from IntelliJ IDEA).
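
For context on why that flag matters: Spark projects commonly declare spark-sql with provided scope so the packaged jar relies on the Spark runtime supplied by the cluster. The sketch below (assuming an sbt build; the Scala and Spark version numbers are illustrative, not taken from the question) shows such a declaration and why it leaves StructType off the classpath when the main method is run straight from the IDE.

// build.sbt (minimal sketch; versions are assumptions, not from the original post)
ThisBuild / scalaVersion := "2.12.18"

// "Provided" keeps spark-sql out of the runtime classpath of the assembled jar,
// which is what a spark-submit deployment expects. It also means a plain IDE run
// will not see org.apache.spark.sql.types.StructType and fails with the
// NoClassDefFoundError above, unless the IDE is told to include provided-scope
// dependencies (the IntelliJ IDEA flag mentioned in the answer).
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.2" % Provided

Alternatively, you can drop the Provided qualifier while developing locally and restore it before packaging for the cluster.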
