
Getting dependency error for SparkSession and SQLContext

I am getting dependency errors for SQLContext and SparkSession in my Spark program:

val sqlContext = new SQLContext(sc)
val spark = SparkSession.builder()

SQLContext error:

Symbol 'type org.apache.spark.Logging' is missing from the classpath. This symbol is required by 'class org.apache.spark.sql.SQLContext'. Make sure that type Logging is in your classpath and check for conflicting dependencies with -Ylog-classpath. A full rebuild may help if 'SQLContext.class' was compiled against an incompatible version of org.apache.spark.

SparkSession error:

not found: value SparkSession

Below are the Spark dependencies in my pom.xml:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.6.0-cdh5.15.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>2.0.0-cloudera1-SNAPSHOT</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-catalyst_2.10</artifactId>
    <version>1.6.0-cdh5.15.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-test-tags_2.10</artifactId>
    <version>1.6.0-cdh5.15.1</version>
</dependency>

You cannot mix Spark 2 and Spark 1.6 dependencies in the same project. org.apache.spark.Logging is no longer available in Spark 2, which is why the compiler cannot find it.
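To keep all Spark artifacts on one version and prevent this kind of mismatch from creeping back in, a common Maven idiom is to factor the version out into a property. A minimal sketch, reusing the CDH version already in the question's pom.xml:

```xml
<properties>
    <spark.version>1.6.0-cdh5.15.1</spark.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
```

With a single property, bumping Spark later means changing one line instead of hunting down every `<version>` tag.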

Change

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>2.0.0-cloudera1-SNAPSHOT</version>
</dependency>

to

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0-cdh5.15.1</version>
</dependency>
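Note that once everything is aligned on Spark 1.6, the `not found: value SparkSession` error is expected behavior rather than a dependency problem: SparkSession was only introduced in Spark 2.0, so on 1.6 the entry point for DataFrames is SQLContext. A minimal Spark 1.6-style sketch (the app name is a placeholder):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Spark 1.6 entry points: SparkContext for RDDs, SQLContext for DataFrames.
val conf = new SparkConf().setAppName("MyApp")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)
```

Conversely, if you actually want `SparkSession.builder()`, all of the Spark artifacts need to be moved up to a 2.x version together, not just spark-core.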
