
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/sql/types/StructType while it is imported

I am using the Spark SQL dependency and importing its data structures, but I am not passing a SparkContext into Scala's main method. Why do I get this exception?

Caused by: java.lang.NoClassDefFoundError: org/apache/spark/sql/types/StructType 

Code:

package io.sekai.industry.streaming

import org.apache.spark.sql.types
import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}

class NonSparkSensorsGeneration {

  val schema = StructType(
    StructField("id", LongType, nullable = false) :: //always skip this if using the operator
      StructField("Energy Data", StringType, nullable = false, metadata = new types.MetadataBuilder().putString("mqtt", "ESP_02/Energy Data").putString("label", "EnergyData").build()) ::
      Nil)

  def readFromCSVFile(shortPath: String): Unit = {

    println("Preparing to load the csv file from jar resource " + shortPath)
    val input = getClass.getResourceAsStream(shortPath)
    println("Loaded file from resource " + shortPath)
    println("Input stream from resource " + shortPath + " details: " + input.toString)
    val source = scala.io.Source.fromInputStream(input)
    val data = source.getLines.map(_.split("\t")).toArray
    source.close()
    println(data.getClass)
  }


  // Note: for the JVM to find this entry point, main must live in an `object`, not a `class`
  def main(args: Array[String]): Unit = {
    readFromCSVFile("generation")
  }
}

You probably forgot to enable the Add dependencies with "provided" scope to classpath flag in your IDE (the flag I am referring to is from IntelliJ IDEA's run configuration).
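For context, a build that marks Spark as "provided" might look like the sbt sketch below (the version numbers are assumptions, not taken from the question). With this scope, spark-sql is available at compile time but excluded from the runtime classpath, which is exactly why running main from the IDE throws NoClassDefFoundError unless the IDE adds provided dependencies back:

```scala
// build.sbt (sketch; Spark and Scala versions are assumptions)
scalaVersion := "2.12.18"

libraryDependencies ++= Seq(
  // "provided" = present for compilation, stripped from the runtime classpath;
  // the cluster (or spark-submit) is expected to supply these jars instead
  "org.apache.spark" %% "spark-sql" % "3.3.2" % "provided"
)
```

When the application is launched locally from the IDE rather than via spark-submit, nothing supplies those jars, so either enable the IntelliJ IDEA flag mentioned above or switch the dependency scope to the default (compile) for local runs.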


