
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/sql/types/StructType even though it is imported

I'm using the Spark SQL dependency only for its data-structure classes: I import it, but I never pass a SparkContext into the Scala main method. Why am I getting this exception?

Caused by: java.lang.NoClassDefFoundError: org/apache/spark/sql/types/StructType 

The code:

package io.sekai.industry.streaming

import org.apache.spark.sql.types
import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}

// main must live in an object (not a class) to be runnable as an entry point
object NonSparkSensorsGeneration {

  // Initializing this field forces the Spark SQL classes to load as soon as
  // the object is constructed, even though main never uses Spark directly
  val schema = StructType(
    StructField("id", LongType, nullable = false) :: // always skip this if using the operator
      StructField("Energy Data", StringType, nullable = false, metadata = new types.MetadataBuilder().putString("mqtt", "ESP_02/Energy Data").putString("label", "EnergyData").build()) ::
      Nil)

  def readFromCSVFile(shortPath: String): Unit = {
    println("Preparing to load the csv file from jar resource " + shortPath)
    val input = getClass.getResourceAsStream(shortPath)
    println("Loaded file from resource " + shortPath)
    println("Input stream from resource " + shortPath + " details: " + input.toString)
    val source = scala.io.Source.fromInputStream(input)
    // read the resource line by line and split each line on tabs
    val data = source.getLines.map(_.split("\t")).toArray
    source.close()
    println(data.getClass)
  }

  def main(args: Array[String]): Unit = {
    readFromCSVFile("generation")
  }
}

You probably forgot to enable the "Add dependencies with 'provided' scope to classpath" option in your run configuration (that flag is from IntelliJ IDEA). Spark artifacts are usually declared with provided scope because the cluster supplies them at runtime; when you run the program locally from the IDE, those jars are left off the classpath unless that option is enabled. Note that even though main never touches Spark, the schema field references StructType and is initialized the moment the class is loaded, which is why the NoClassDefFoundError is thrown anyway.
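
For reference, here is a minimal sketch of the build setup that typically produces this situation (assuming sbt; the Spark version shown is illustrative, not taken from the question):

// build.sbt (sketch) -- Spark dependencies marked "provided" are expected
// to be supplied by the cluster at runtime, so the IDE excludes them from
// the local run classpath unless the flag mentioned above is enabled.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.3.0" % "provided", // version is illustrative
  "org.apache.spark" %% "spark-sql"  % "3.3.0" % "provided"
)

Alternatively, for purely local runs you can drop the provided scope (or keep a separate run configuration with the IDE flag enabled) so that spark-sql is present on the classpath.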
