
Spark unit test compilation error when using Spark testing base

I am getting the following error:

Error:scalac: bad symbolic reference. A signature in DataFrameSuiteBaseLike.class refers to term hive
in package org.apache.spark.sql which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling  
DataFrameSuiteBaseLike.class

You should add the Spark Hive library to your dependencies. For instance, in SBT:

libraryDependencies += "org.apache.spark" % "spark-hive_2.10" % "1.6.1"

This is mentioned in the project's issue tracker: https://github.com/holdenk/spark-testing-base/issues/93
