
Maven dependency for Spark's StreamingQuery

I have a compile-time error:

Cannot resolve symbol 'spark'

For the following code:

import org.apache.spark.sql.streaming.StreamingQuery;

I use IntelliJ.

pom.xml:

...
<dependencies>
    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-core</artifactId>
        <version>2.5</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.4.0</version>
    </dependency>
</dependencies>

Edit:

I've noticed that Maven complains about the spark-sql dependency:

[screenshot of the Maven error on the spark-sql dependency]

I can confirm that the interface org.apache.spark.sql.streaming.StreamingQuery exists in spark-sql_2.11:2.4.0 with the source code available here: https://github.com/apache/spark/blob/v2.4.0/sql/core/src/main/scala/org/apache/spark/sql/streaming/StreamingQuery.scala
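For reference, here is a minimal sketch that exercises that import and compiles against spark-sql_2.11:2.4.0 (the "rate" source and "console" sink used below ship with spark-sql; the class name and option values are illustrative). If this builds from the command line, the dependency itself is intact:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;
import org.apache.spark.sql.streaming.StreamingQueryException;

public class StreamingQuerySmokeTest {

    public static void main(String[] args) throws StreamingQueryException {
        SparkSession spark = SparkSession.builder()
                .appName("StreamingQuerySmokeTest")
                .master("local[*]")
                .getOrCreate();

        // The built-in "rate" source continuously emits (timestamp, value) rows.
        Dataset<Row> stream = spark.readStream()
                .format("rate")
                .option("rowsPerSecond", "1")
                .load();

        // start() returns the StreamingQuery handle that the failing import refers to.
        StreamingQuery query = stream.writeStream()
                .format("console")
                .outputMode("append")
                .start();

        query.awaitTermination();
    }
}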

I assume the compilation error comes from the import itself rather than from some unresolved type (you haven't attached the source of the file that fails to compile, so I can only speculate here).

So, in terms of resolution:

  1. Try running mvn clean compile from the command line, without IntelliJ. If it succeeds, check the IntelliJ integration (maybe a re-import will fix the issue).

  2. If it doesn't, check the pom.xml: maybe you've added the dependency to a <dependencyManagement> section by mistake. Dependencies declared there only pin versions; they are not placed on the classpath unless also listed under <dependencies>.

  3. Remove the Spark jars from the local Maven repository and retry; occasionally a jar gets downloaded in a corrupted state (see the commands after this list).
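For step 3, a minimal sketch from the command line, assuming the default local repository location (~/.m2/repository):

# Delete the cached artifact so Maven is forced to fetch a fresh copy
rm -rf ~/.m2/repository/org/apache/spark/spark-sql_2.11/2.4.0
# The next build re-downloads it
mvn clean compile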

Update 1: Since we now know that mvn clean compile runs successfully from the command line, this must be an issue with how the project is opened in IntelliJ.

So I suggest doing the following:

  1. Close the current project: File --> Close Project
  2. File --> Open... Now find the project folder and (this is important) pick pom.xml. IntelliJ will ask whether you want to delete the previously found project files - respond "yes" and it will rebuild its internal dependency data structures. Then try to run the program and report whether the issue persists.

"Spark Framework" at http://sparkjava.com/ has nothing to do with Apache Spark . Remove the com.sparkjava dependency (assuming it's not needed elsewhere in your project) and replace it with

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
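Note that spark-sql_2.11 already depends on spark-core_2.11 transitively, so strictly speaking the explicit spark-core_2.11 entry is optional; declaring it just makes the version pinning visible in your own pom.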
