
Error while trying to connect to Cassandra database using Spark Streaming

I'm working on a project that uses Spark Streaming, Apache Kafka, and Cassandra. I use the streaming-kafka integration. In Kafka I have a producer that sends data with this configuration:

props.put("metadata.broker.list", KafkaProperties.ZOOKEEPER);
props.put("bootstrap.servers", KafkaProperties.SERVER);
props.put("client.id", "DemoProducer");

where ZOOKEEPER = localhost:2181 and SERVER = localhost:9092.

Once I send data I can receive it with Spark, and I can consume it too. My Spark configuration is:

SparkConf sparkConf = new SparkConf().setAppName("org.kakfa.spark.ConsumerData").setMaster("local[4]");
sparkConf.set("spark.cassandra.connection.host", "localhost");
JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, new Duration(2000));

After that I try to store this data in the Cassandra database. But when I try to open a session with this:

CassandraConnector connector = CassandraConnector.apply(jssc.sparkContext().getConf());
Session session = connector.openSession();

I get the following error:

Exception in thread "main" com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: localhost/127.0.0.1:9042 (com.datastax.driver.core.exceptions.InvalidQueryException: unconfigured table schema_keyspaces))
at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:220)
at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:78)
at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1231)
at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:334)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:182)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:161)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:161)
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:36)
at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:61)
at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:70)
at org.kakfa.spark.ConsumerData.main(ConsumerData.java:80)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)

As for Cassandra, I'm using the default configuration:

start_native_transport: true
native_transport_port: 9042
- seeds: "127.0.0.1"
cluster_name: 'Test Cluster'
rpc_address: localhost
rpc_port: 9160
start_rpc: true

I can connect to Cassandra from the command line using cqlsh localhost, getting the following message:

Connected to Test Cluster at 127.0.0.1:9042.
[cqlsh 5.0.1 | Cassandra 3.0.5 | CQL spec 3.4.0 | Native protocol v4]
Use HELP for help.
cqlsh>

I used nodetool status too, which shows this:

http://pastebin.com/ZQ5YyDyB

To run Cassandra I invoke bin/cassandra -f.

What I'm trying to run is this:

try (Session session = connector.openSession()) {
    System.out.println("dentro del try");
    session.execute("DROP KEYSPACE IF EXISTS test");
    System.out.println("dentro del try - 1");
    session.execute("CREATE KEYSPACE test WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}");
    System.out.println("dentro del try - 2");
    session.execute("CREATE TABLE test.users (id TEXT PRIMARY KEY, name TEXT)");
    System.out.println("dentro del try - 3");
}

My pom.xml file looks like this:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>1.6.1</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka_2.10</artifactId>
        <version>1.6.1</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.6.1</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-java_2.10</artifactId>
        <version>1.6.0-M1</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.10</artifactId>
        <version>1.6.0-M2</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.10</artifactId>
        <version>1.1.0-alpha2</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-java_2.10</artifactId>
        <version>1.1.0-alpha2</version>
    </dependency>

    <dependency>
        <groupId>org.json</groupId>
        <artifactId>json</artifactId>
        <version>20160212</version>
    </dependency>
</dependencies>

I have no idea why I can't connect to Cassandra from Spark. Is my configuration bad, or what am I doing wrong?

Thank you!

com.datastax.driver.core.exceptions.InvalidQueryException: unconfigured table schema_keyspaces

That error indicates an old driver talking to a new Cassandra version: drivers that old read schema metadata from the system.schema_keyspaces table, which was removed in Cassandra 3.0 (schema metadata now lives in the system_schema keyspace). Looking at the POM file, the spark-cassandra-connector dependency is declared twice: once at version 1.6.0-M2 (good) and once at 1.1.0-alpha2 (old).
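You can see the underlying mismatch directly in cqlsh (a quick illustration, assuming a Cassandra 3.x node such as the 3.0.5 shown above):

```sql
-- Pre-3.0 drivers read schema metadata from tables like this one,
-- which no longer exists on Cassandra 3.x, hence "unconfigured table":
SELECT keyspace_name FROM system.schema_keyspaces;

-- On Cassandra 3.x the same information lives in the system_schema keyspace:
SELECT keyspace_name FROM system_schema.keyspaces;
```

The old connector bundles a driver that issues the first query during connection setup, which is why the failure happens before any of your own statements run.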

Remove the references to the old 1.1.0-alpha2 dependencies from your pom.xml:

<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.1.0-alpha2</version>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.1.0-alpha2</version>
</dependency>
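After removing them, the connector section of the pom.xml keeps only the newer entries already present in the original file, i.e. something like:

```xml
<!-- keep only the newer connector entries (versions from the original POM) -->
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.6.0-M2</version>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.6.0-M1</version>
</dependency>
```

With only one connector version on the classpath, Maven can no longer resolve the old 1.1.0 driver, and the session should open against Cassandra 3.0.5.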
