
Query Cassandra table through Spark

I am trying to get values from a Cassandra 2.0.17 table through spark-1.6.0 and scala-2.11.7 with the following steps:

  1. Started Cassandra -- service cassandra start
  2. Started Spark -- sbin/start-all.sh
  3. Started the Spark Scala shell -- bin/spark-shell --jars spark-cassandra-connector_2.10-1.5.0-M1.jar

Then I executed these commands in the Scala shell:

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext._

// stop the shell's default context so a new one can be created with a custom conf
sc.stop()

// point the connector at the local Cassandra node
val conf = new SparkConf(true).set("spark.cassandra.connection.host", "127.0.0.1")

val sc = new SparkContext("local[2]", "test", conf)

import com.datastax.spark.connector._

Everything works fine up to this point, but when I execute

val rdd = sc.cassandraTable("tutorialspoint", "emp")

it gives me the error below:

error: bad symbolic reference. A signature in CassandraTableScanRDD.class refers to term driver
in package com.datastax which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling CassandraTableScanRDD.class.
error: bad symbolic reference. A signature in CassandraTableScanRDD.class refers to term core
in value com.datastax.driver which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling CassandraTableScanRDD.class.
error: bad symbolic reference. A signature in CassandraTableScanRDD.class refers to term core
in value com.datastax.driver which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling CassandraTableScanRDD.class.
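
These "bad symbolic reference" messages say the compiler cannot find the Cassandra Java driver classes (the com.datastax.driver.core package) on the shell's classpath. As a quick sanity check (a hypothetical probe, not part of my original steps), one can try loading a driver class directly in the shell:

// throws ClassNotFoundException if the Cassandra Java driver jar is not on the classpath
Class.forName("com.datastax.driver.core.Cluster")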

I have added the jars from the Cassandra lib directory to Spark and referenced them. I am using Java version 1.8.0_72.

Am I missing something?

The connector you are using is incompatible with your Scala and Spark versions. You are using scala-2.11.7, but this connector build (spark-cassandra-connector_2.10) is for Scala 2.10. Also, this connector version supports Spark 1.5.x, not Spark 1.6.0.
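
For example, assuming the Spark 1.6.0 binaries were built against Scala 2.10 (the default for the prebuilt downloads), a matching connector can be pulled from Maven Central with --packages instead of a hand-copied jar. This is a sketch of one compatible combination, not a verified setup; the connector's compatibility matrix should also be checked against Cassandra 2.0.17:

bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.10:1.6.0 --conf spark.cassandra.connection.host=127.0.0.1

With the host set on the command line there is no need to stop and recreate the SparkContext; the table can be read directly:

import com.datastax.spark.connector._

val rdd = sc.cassandraTable("tutorialspoint", "emp")
rdd.first  // fetch one row to confirm the connection works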
