
NoHostAvailableException while running Spark with DSE

I am using DataStax Enterprise (DSE) 5.1 for Cassandra on my local machine. I started Cassandra with:

dse cassandra -k

Cassandra booted fine. Next I wanted to open the Spark shell with:

dse spark

However, it fails with the following error:

2017-08-21 12:11:25 [main] ERROR o.a.s.d.DseSparkSubmitBootstrapper - Failed to start or submit Spark application because of com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried) - see details in the log file(s): /home/rsahukar/.spark-shell.log
com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried)
    at com.datastax.driver.core.exceptions.NoHostAvailableException.copy(NoHostAvailableException.java:75) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.exceptions.NoHostAvailableException.copy(NoHostAvailableException.java:28) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.DriverThrowables.propagateCause(DriverThrowables.java:28) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:236) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:59) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:42) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.dse.DefaultDseSession.execute(DefaultDseSession.java:232) ~[dse-java-driver-core-1.2.2.jar:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_131]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_131]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_131]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_131]
    at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:40) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at com.sun.proxy.$Proxy6.execute(Unknown Source) ~[na:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_131]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_131]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_131]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_131]
    at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:40) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at com.sun.proxy.$Proxy7.execute(Unknown Source) ~[na:na]
    at com.datastax.bdp.util.rpc.RpcUtil.call(RpcUtil.java:42) ~[dse-core-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$$anonfun$fetch$1.apply(SparkNodeConfiguration.scala:54) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$$anonfun$fetch$1.apply(SparkNodeConfiguration.scala:52) ~[dse-spark-5.1.2.jar:5.1.2]
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:112) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:111) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:145) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:111) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:52) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.apply(SparkNodeConfiguration.scala:44) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator$$anonfun$8.apply(SparkConfigurator.scala:85) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator$$anonfun$8.apply(SparkConfigurator.scala:85) ~[dse-spark-5.1.2.jar:5.1.2]
    at scala.util.Try$.apply(Try.scala:192) ~[scala-library-2.11.11.jar:na]
    at com.datastax.bdp.util.Lazy.internal$lzycompute(Lazy.scala:26) ~[dse-spark-5.1.2.jar:5.1.2]
    at com.datastax.bdp.util.Lazy.internal(Lazy.scala:25) ~[dse-spark-5.1.2.jar:5.1.2]
    at com.datastax.bdp.util.Lazy.get(Lazy.scala:31) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator.dseDriverProps$lzycompute(SparkConfigurator.scala:152) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator.dseDriverProps(SparkConfigurator.scala:151) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries$lzycompute(SparkConfigurator.scala:124) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries(SparkConfigurator.scala:124) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs$lzycompute(DseSparkArgsPreprocessor.scala:79) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs(DseSparkArgsPreprocessor.scala:68) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:106) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala) [dse-spark-5.1.2.jar:5.1.2]
Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried)
    at com.datastax.driver.core.RequestHandler.reportNoMoreHosts(RequestHandler.java:204) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.RequestHandler.access$1000(RequestHandler.java:40) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.RequestHandler$SpeculativeExecution.findNextHostAndQuery(RequestHandler.java:268) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.RequestHandler.startNewExecution(RequestHandler.java:108) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.RequestHandler.sendRequest(RequestHandler.java:88) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.SessionManager.executeAsync(SessionManager.java:124) ~[dse-java-driver-core-1.2.2.jar:na]
    ... 43 common frames omitted
2017-08-21 12:11:25 [Thread-1] ERROR o.a.s.d.DseSparkSubmitBootstrapper - Failed to cancel delegation token

Below is the dsetool ring output:

$ dsetool ring
Address          DC                   Rack         Workload             Graph  Status  State    Load             Owns                 Token                                        Health [0,1] 
127.0.0.1        Analytics            rack1        Analytics(SM)        no     Up      Normal   189.19 KiB       ?                    5643405743002698980                          0.50         

Can someone help me?

I finally found my mistake. I was running Cassandra in local mode, and this was my Spark conf file (spark-defaults.conf) before the change:

....
spark.cassandra.connection.local_dc     localhost
spark.cassandra.connection.host         localhost
....

Note the spark.cassandra.connection.local_dc value. Since I was running in local mode, I assumed its value should be localhost as well. However, it should be the data center (DC) name that dsetool ring reports.
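
If you are not sure what the DC name is, it can also be read directly from Cassandra via the standard system.local table, for example (assuming cqlsh can reach the local node without authentication):

$ cqlsh -e "SELECT data_center FROM system.local;"

 data_center
-------------
   Analytics

(1 rows)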

As the dsetool ring output above shows, the DC name is Analytics, so that is the value that had to go into the Spark conf file. Below is the configuration after the change:

spark.cassandra.connection.local_dc     Analytics
spark.cassandra.connection.host         localhost
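
With this in place, dse spark starts fine. As a quick sanity check, something like the following should work from inside the Spark shell (a minimal sketch: spark is the SparkSession predefined by dse spark, and org.apache.spark.sql.cassandra is the data source name of the bundled spark-cassandra-connector):

// Read this node's metadata row from the standard system.local table.
val localRow = spark.read.
  format("org.apache.spark.sql.cassandra").
  options(Map("keyspace" -> "system", "table" -> "local")).
  load()

// Should print Analytics, matching the dsetool ring output above.
localRow.select("data_center").show()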
