
java.lang.AbstractMethodError in Java Spark and cassandra connection

I am working with Spark 1.6.0 and Cassandra 3.1.1, and I tried to connect to the Cassandra database using Java Spark. There is no error while building, but I get the following error when I run the application:

Exception in thread "main" java.lang.AbstractMethodError
at org.apache.spark.Logging$class.log(Logging.scala:51)
at com.datastax.spark.connector.cql.CassandraConnector$.log(CassandraConnector.scala:144)
at org.apache.spark.Logging$class.logDebug(Logging.scala:62)
at com.datastax.spark.connector.cql.CassandraConnector$.logDebug(CassandraConnector.scala:144)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:154)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$4.apply(CassandraConnector.scala:151)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$4.apply(CassandraConnector.scala:151)
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:36)
at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:61)
at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:72)
at com.test.cassandra.spark.Main.generateData(Main.java:30)
at com.test.cassandra.spark.Main.run(Main.java:21)
at com.test.cassandra.spark.Main.main(Main.java:163)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

My code:

import com.datastax.driver.core.Session;
import com.datastax.spark.connector.cql.CassandraConnector;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import java.io.Serializable;

public class Main implements Serializable {
    private transient SparkConf sconf;
    private static final String keySpaceName = "java_api";
    private static final String primaryTableName = "test_cassandra";

    private Main(SparkConf conf) {
        this.sconf = conf;
    }

    private void run() {
        JavaSparkContext sc = new JavaSparkContext(sconf);
        generateData(sc);
        sc.stop();
    }

    private void generateData(JavaSparkContext sc) {
        CassandraConnector connector = CassandraConnector.apply(sc.getConf());

        try (Session session = connector.openSession()) {
            System.out.println("connected to cassandra");
            session.execute("DROP KEYSPACE IF EXISTS java_api");
            session.execute("CREATE KEYSPACE java_api WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}");
            session.execute("CREATE TABLE java_api.sales (id UUID PRIMARY KEY, product INT, price DECIMAL)");
            session.execute("CREATE TABLE java_api.summaries (product INT PRIMARY KEY, summary DECIMAL)");
            System.out.println("connected");
        }
    }

    public static void main(String[] args) {
        if (args.length != 2) {
            System.err.println("Syntax: com.datastax.spark.demo.Main <Spark Master URL> <Cassandra contact point>");
            System.exit(1);
        }
        SparkConf conf = new SparkConf()
                .set("spark.cassandra.connection.host", "localhost")
                .set("spark.cassandra.connection.native.port", "9042");
        conf.setAppName("Java API demo");
        conf.setMaster(args[0]);
        //conf.set("spark.cassandra.connection.host", "127.0.0.1");
        Main app = new Main(conf);
        app.run();
    }
}

My pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.test</groupId>
    <artifactId>cassandra-spark</artifactId>
    <version>1.0</version>
    <packaging>jar</packaging>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <maven.compiler.source>1.7</maven.compiler.source>
        <maven.compiler.target>1.7</maven.compiler.target>
    </properties>
    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
        <!--Spark Cassandra Connector -->
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.10</artifactId>
            <version>1.5.0-M3</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector-java_2.10</artifactId>
            <version>1.5.0-M3</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.cassandra</groupId>
            <artifactId>cassandra-driver-core</artifactId>
            <version>3.0.0-rc1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.6.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>1.6.0</version>
        </dependency>
    </dependencies>
</project>
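One likely culprit in the pom above is version skew: spark-cassandra-connector 1.5.0-M3 targets Spark 1.5, while the project runs Spark 1.6.0, and pinning cassandra-driver-core 3.0.0-rc1 separately can override the driver version the connector was compiled against. A sketch of an aligned dependency set (assuming a connector 1.6.x release, which targets Spark 1.6, and dropping the explicit driver so the connector pulls in its own matching version):

```xml
<!-- Hypothetical aligned versions: a 1.6.x connector matches Spark 1.6. -->
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.6.5</version>
</dependency>
<!-- No explicit cassandra-driver-core: let the connector bring in the
     java driver version it was built and tested against. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0</version>
</dependency>
```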

This may come from the fact that

some class has incompatibly changed since the currently executing method was last compiled.

This may come from the Java version, for example.

See the response to this question: Spark streaming StreamingContext.start() - Error starting receiver 0
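An AbstractMethodError generally means the JVM loaded a different version of a class than the one the calling code was compiled against. A quick way to narrow this down is to print which jar a suspect class was actually loaded from (a minimal sketch; checking org.apache.spark.Logging is an illustrative choice, not something from the question):

```java
// Sketch: report which jar (or classloader) a class was loaded from,
// useful for spotting version conflicts behind AbstractMethodError.
public class WhichJar {

    // Returns the code source location of a class, or a marker for
    // classes defined by the bootstrap classloader (e.g. java.lang.*),
    // whose ProtectionDomain has no CodeSource.
    public static String locate(Class<?> cls) {
        java.security.CodeSource src = cls.getProtectionDomain().getCodeSource();
        return src == null ? "bootstrap classloader" : src.getLocation().toString();
    }

    public static void main(String[] args) {
        // Core JDK classes come from the bootstrap classloader:
        System.out.println(locate(String.class));
        // On the Spark application's classpath one would check, e.g.:
        // System.out.println(locate(Class.forName("org.apache.spark.Logging")));
    }
}
```

If two jars on the classpath both provide the class, the one printed here is the one actually winning, which tells you which dependency to exclude or upgrade.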

It seems this issue is caused by a conflict between the logging used by Spark and by the Cassandra connector. I was getting this error while using the dependency below:

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.6.2"

I used the Cassandra connector below to resolve the issue:

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.6.5"
