Not able to import Spark Implicits in ScalaTest

I am writing test cases for Spark using ScalaTest.

import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfterAll, FlatSpec}

class ClassNameSpec extends FlatSpec with BeforeAndAfterAll {
  var spark: SparkSession = _
  var className: ClassName = _

  override def beforeAll(): Unit = {
    spark = SparkSession.builder().master("local").appName("class-name-test").getOrCreate()
    className = new ClassName(spark)
  }

  it should "return data" in {
    import spark.implicits._
    val result = className.getData(input)

    assert(result.count() == 3)
  }

  override def afterAll(): Unit = {
    spark.stop()
  }
}

When I try to compile the test suite, it gives me the following error:

stable identifier required, but ClassNameSpec.this.spark.implicits found.
[error]     import spark.implicits._
[error]                  ^
[error] one error found
[error] (test:compileIncremental) Compilation failed

I am not able to understand why I cannot import spark.implicits._ in a test suite.

Any help is appreciated!

To do an import you need a "stable identifier", as the error message says. A stable identifier is a path made only of vals and objects, i.e. something the compiler can prove will always refer to the same object. Since you defined spark as a var, which can be reassigned at any time, Scala cannot import its members.
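A minimal sketch of the rule itself (the Container and Demo names are made up for illustration):

class Container {
  def greet(): Unit = println("hi")
}

object Demo {
  val stable   = new Container // val: a stable identifier
  var unstable = new Container // var: can be reassigned, so not stable

  import stable.greet      // compiles
  // import unstable.greet // error: stable identifier required
  greet()
}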

To solve this, you can simply do something like:

val spark2 = spark // a val is a stable identifier, so its members can be imported
import spark2.implicits._
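In the spec above, that fix goes inside the test body (className, getData, and input are carried over from the question and assumed to exist elsewhere):

it should "return data" in {
  val spark2 = spark // stable alias for the var
  import spark2.implicits._

  val result = className.getData(input)
  assert(result.count() == 3)
}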

Or, better, change the original var to a val. Since spark is assigned in beforeAll, a plain val will not work here; a lazy val does, because it is initialized on first access, and it also lets you drop the beforeAll assignment entirely:

lazy val spark: SparkSession = SparkSession.builder().master("local").appName("class-name-test").getOrCreate()
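Putting it together, a sketch of the revised spec (ClassName, getData, and input are carried over from the question and assumed to be defined elsewhere; note that FlatSpec also requires a subject clause before it can be used):

import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfterAll, FlatSpec}

class ClassNameSpec extends FlatSpec with BeforeAndAfterAll {
  // A lazy val is a stable identifier and is initialized on first
  // access, so no beforeAll assignment is needed.
  lazy val spark: SparkSession =
    SparkSession.builder().master("local").appName("class-name-test").getOrCreate()
  lazy val className = new ClassName(spark)

  behavior of "ClassName" // FlatSpec needs a subject before `it`

  it should "return data" in {
    import spark.implicits._ // compiles now: spark is a (lazy) val

    val result = className.getData(input)
    assert(result.count() == 3)
  }

  override def afterAll(): Unit = {
    spark.stop()
  }
}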
