

Spark - Actor not found for: ActorSelection

I just cloned the master repository of Spark from GitHub. I am running it on OS X 10.9 with Spark 1.4.1 and Scala 2.10.4.

I tried to run the SparkPi example program from IntelliJ IDEA but got the error: akka.actor.ActorNotFound: Actor not found for: ActorSelection[Anchor(akka.tcp://sparkMaster@myhost:7077/)

I checked a similar post on the mailing list but found no solution.

The complete stack trace is below. Any help would be really appreciated.

2015-07-28 22:16:45,888 INFO  [main] spark.SparkContext (Logging.scala:logInfo(59)) - Running Spark version 1.5.0-SNAPSHOT 
2015-07-28 22:16:47,125 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
2015-07-28 22:16:47,753 INFO  [main] spark.SecurityManager (Logging.scala:logInfo(59)) - Changing view acls to: mac 
2015-07-28 22:16:47,755 INFO  [main] spark.SecurityManager (Logging.scala:logInfo(59)) - Changing modify acls to: mac 
2015-07-28 22:16:47,756 INFO  [main] spark.SecurityManager (Logging.scala:logInfo(59)) - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(mac); users with modify permissions: Set(mac) 
2015-07-28 22:16:49,454 INFO  [sparkDriver-akka.actor.default-dispatcher-2] slf4j.Slf4jLogger (Slf4jLogger.scala:applyOrElse(80)) - Slf4jLogger started 
2015-07-28 22:16:49,695 INFO  [sparkDriver-akka.actor.default-dispatcher-2] Remoting (Slf4jLogger.scala:apply$mcV$sp(74)) - Starting remoting 
2015-07-28 22:16:50,167 INFO  [sparkDriver-akka.actor.default-dispatcher-2] Remoting (Slf4jLogger.scala:apply$mcV$sp(74)) - Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.2.105:49981] 
2015-07-28 22:16:50,215 INFO  [main] util.Utils (Logging.scala:logInfo(59)) - Successfully started service 'sparkDriver' on port 49981. 
2015-07-28 22:16:50,372 INFO  [main] spark.SparkEnv (Logging.scala:logInfo(59)) - Registering MapOutputTracker 
2015-07-28 22:16:50,596 INFO  [main] spark.SparkEnv (Logging.scala:logInfo(59)) - Registering BlockManagerMaster 
2015-07-28 22:16:50,948 INFO  [main] storage.DiskBlockManager (Logging.scala:logInfo(59)) - Created local directory at /private/var/folders/8k/jfw576r50m97rlk5qpj1n4l80000gn/T/blockmgr-309db4d1-d129-43e5-a90e-12cf51ad491f 
2015-07-28 22:16:51,198 INFO  [main] storage.MemoryStore (Logging.scala:logInfo(59)) - MemoryStore started with capacity 491.7 MB 
2015-07-28 22:16:51,707 INFO  [main] spark.HttpFileServer (Logging.scala:logInfo(59)) - HTTP File server directory is /private/var/folders/8k/jfw576r50m97rlk5qpj1n4l80000gn/T/spark-f28e24e7-b798-4365-8209-409d8b27ad2f/httpd-ce32c41d-b618-49e9-bec1-f409454f3679 
2015-07-28 22:16:51,777 INFO  [main] spark.HttpServer (Logging.scala:logInfo(59)) - Starting HTTP Server 
2015-07-28 22:16:52,091 INFO  [main] server.Server (Server.java:doStart(272)) - jetty-8.1.14.v20131031 
2015-07-28 22:16:52,116 INFO  [main] server.AbstractConnector (AbstractConnector.java:doStart(338)) - Started SocketConnector@0.0.0.0:49982 
2015-07-28 22:16:52,116 INFO  [main] util.Utils (Logging.scala:logInfo(59)) - Successfully started service 'HTTP file server' on port 49982. 
2015-07-28 22:16:52,249 INFO  [main] spark.SparkEnv (Logging.scala:logInfo(59)) - Registering OutputCommitCoordinator 
2015-07-28 22:16:54,253 INFO  [main] server.Server (Server.java:doStart(272)) - jetty-8.1.14.v20131031 
2015-07-28 22:16:54,315 INFO  [main] server.AbstractConnector (AbstractConnector.java:doStart(338)) - Started SelectChannelConnector@0.0.0.0:4040 
2015-07-28 22:16:54,317 INFO  [main] util.Utils (Logging.scala:logInfo(59)) - Successfully started service 'SparkUI' on port 4040. 
2015-07-28 22:16:54,386 INFO  [main] ui.SparkUI (Logging.scala:logInfo(59)) - Started SparkUI at http://192.168.2.105:4040
2015-07-28 22:16:54,924 WARN  [main] metrics.MetricsSystem (Logging.scala:logWarning(71)) - Using default name DAGScheduler for source because spark.app.id is not set. 
2015-07-28 22:16:55,132 INFO  [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logInfo(59)) - Connecting to master spark://myhost:7077... 
2015-07-28 22:16:55,392 WARN  [sparkDriver-akka.actor.default-dispatcher-14] client.AppClient$ClientEndpoint (Logging.scala:logWarning(71)) - Could not connect to myhost:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@myhost:7077] 
2015-07-28 22:16:55,412 WARN  [sparkDriver-akka.actor.default-dispatcher-14] remote.ReliableDeliverySupervisor (Slf4jLogger.scala:apply$mcV$sp(71)) - Association with remote system [akka.tcp://sparkMaster@myhost:7077] has failed, address is now gated for [5000] ms. Reason: [Association failed with [akka.tcp://sparkMaster@myhost:7077]] Caused by: [myhost: unknown error] 
2015-07-28 22:16:55,447 WARN  [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logWarning(92)) - Failed to connect to master myhost:7077 
akka.actor.ActorNotFound: Actor not found for: ActorSelection[Anchor(akka.tcp://sparkMaster@myhost:7077/), Path(/user/Master)] 
        at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65) 
        at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63) 
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32) 
        at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55) 
        at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:73) 
        at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74) 
        at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:120) 
        at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73) 
        at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40) 
        at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248) 
        at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:266) 
        at akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:533) 
        at akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:569) 
        at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:559) 
        at akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:87) 
        at akka.remote.EndpointWriter.postStop(Endpoint.scala:557) 
        at akka.actor.Actor$class.aroundPostStop(Actor.scala:477) 
        at akka.remote.EndpointActor.aroundPostStop(Endpoint.scala:411) 
        at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:210) 
        at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:172) 
        at akka.actor.ActorCell.terminate(ActorCell.scala:369) 
        at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:462) 
        at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478) 
        at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263) 
        at akka.dispatch.Mailbox.run(Mailbox.scala:219) 
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397) 
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) 
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) 
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) 
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) 
2015-07-28 22:17:15,459 INFO  [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logInfo(59)) - Connecting to master spark://myhost:7077... 
2015-07-28 22:17:15,463 WARN  [sparkDriver-akka.actor.default-dispatcher-14] client.AppClient$ClientEndpoint (Logging.scala:logWarning(71)) - Could not connect to myhost:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@myhost:7077] 
2015-07-28 22:17:15,464 WARN  [sparkDriver-akka.actor.default-dispatcher-2] remote.ReliableDeliverySupervisor (Slf4jLogger.scala:apply$mcV$sp(71)) - Association with remote system [akka.tcp://sparkMaster@myhost:7077] has failed, address is now gated for [5000] ms. Reason: [Association failed with [akka.tcp://sparkMaster@myhost:7077]] Caused by: [myhost: unknown error] 
2015-07-28 22:17:15,464 WARN  [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logWarning(92)) - Failed to connect to master myhost:7077 
akka.actor.ActorNotFound: Actor not found for: ActorSelection[Anchor(akka.tcp://sparkMaster@myhost:7077/), Path(/user/Master)] 
        at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65) 
        at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63) 
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32) 
        at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55) 
        at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:73) 
        at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74) 
        at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:120) 
        at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73) 
        at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40) 
        at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248) 
        at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:266) 
        at akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:533) 
        at akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:569) 
        at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:559) 
        at akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:87) 
        at akka.remote.EndpointWriter.postStop(Endpoint.scala:557) 
        at akka.actor.Actor$class.aroundPostStop(Actor.scala:477) 
        at akka.remote.EndpointActor.aroundPostStop(Endpoint.scala:411) 
        at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:210) 
        at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:172) 
        at akka.actor.ActorCell.terminate(ActorCell.scala:369) 
        at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:462) 
        at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478) 
        at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263) 
        at akka.dispatch.Mailbox.run(Mailbox.scala:219) 
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397) 
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) 
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) 
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) 
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) 
2015-07-28 22:17:35,136 INFO  [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logInfo(59)) - Connecting to master spark://myhost:7077... 
2015-07-28 22:17:35,141 WARN  [sparkDriver-akka.actor.default-dispatcher-13] client.AppClient$ClientEndpoint (Logging.scala:logWarning(71)) - Could not connect to myhost:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@myhost:7077] 
2015-07-28 22:17:35,142 WARN  [sparkDriver-akka.actor.default-dispatcher-13] remote.ReliableDeliverySupervisor (Slf4jLogger.scala:apply$mcV$sp(71)) - Association with remote system [akka.tcp://sparkMaster@myhost:7077] has failed, address is now gated for [5000] ms. Reason: [Association failed with [akka.tcp://sparkMaster@myhost:7077]] Caused by: [myhost: unknown error] 
2015-07-28 22:17:35,142 WARN  [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logWarning(92)) - Failed to connect to master myhost:7077 
akka.actor.ActorNotFound: Actor not found for: ActorSelection[Anchor(akka.tcp://sparkMaster@myhost:7077/), Path(/user/Master)] 
        at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65) 
        at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63) 
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32) 
        at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55) 
        at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:73) 
        at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74) 
        at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:120) 
        at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73) 
        at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40) 
        at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248) 
        at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:266) 
        at akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:533) 
        at akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:569) 
        at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:559) 
        at akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:87) 
        at akka.remote.EndpointWriter.postStop(Endpoint.scala:557) 
        at akka.actor.Actor$class.aroundPostStop(Actor.scala:477) 
        at akka.remote.EndpointActor.aroundPostStop(Endpoint.scala:411) 
        at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:210) 
        at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:172) 
        at akka.actor.ActorCell.terminate(ActorCell.scala:369) 
        at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:462) 
        at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478) 
        at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263) 
        at akka.dispatch.Mailbox.run(Mailbox.scala:219) 
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397) 
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) 
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) 
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) 
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) 
2015-07-28 22:17:35,462 INFO  [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logInfo(59)) - Connecting to master spark://myhost:7077... 
2015-07-28 22:17:35,464 WARN  [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logWarning(92)) - Failed to connect to master myhost:7077 
akka.actor.ActorNotFound: Actor not found for: ActorSelection[Anchor(akka.tcp://sparkMaster@myhost:7077/), Path(/user/Master)] 
        at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65) 
        at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63) 
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32) 
        at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55) 
        at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:73) 
        at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74) 
        at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:120) 
        at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73) 
        at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40) 
        at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248) 
        at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:266) 
        at akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:533) 
        at akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:569) 
        at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:559) 
        at akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:87) 
        at akka.remote.ReliableDeliverySupervisor$$anonfun$gated$1.applyOrElse(Endpoint.scala:335) 
        at akka.actor.Actor$class.aroundReceive(Actor.scala:467) 
        at akka.remote.ReliableDeliverySupervisor.aroundReceive(Endpoint.scala:188) 
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516) 
        at akka.actor.ActorCell.invoke(ActorCell.scala:487) 
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238) 
        at akka.dispatch.Mailbox.run(Mailbox.scala:220) 
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397) 
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) 
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) 
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) 
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) 
2015-07-28 22:17:55,135 INFO  [appclient-register-master-threadpool-0] client.AppClient$ClientEndpoint (Logging.scala:logInfo(59)) - Connecting to master spark://myhost:7077... 
2015-07-28 22:17:55,140 WARN  [sparkDriver-akka.actor.default-dispatcher-19] client.AppClient$ClientEndpoint (Logging.scala:logWarning(71)) - Could not connect to myhost:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@myhost:7077] 
2015-07-28 22:17:55,140 WARN  [sparkDriver-akka.actor.default-dispatcher-3] remote.ReliableDeliverySupervisor (Slf4jLogger.scala:apply$mcV$sp(71)) - Association with remote system [akka.tcp://sparkMaster@myhost:7077] has failed, address is now gated for [5000] ms. Reason: [Association failed with [akka.tcp://sparkMaster@myhost:7077]] Caused by: [myhost: unknown error] 
2015-07-28 22:17:55,178 ERROR [appclient-registration-retry-thread] util.SparkUncaughtExceptionHandler (Logging.scala:logError(96)) - Uncaught exception in thread Thread[appclient-registration-retry-thread,5,main] 
java.util.concurrent.RejectedExecutionException: Task java.util.concurrent.FutureTask@3db0c61c rejected from java.util.concurrent.ThreadPoolExecutor@33773fda[Running, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 4] 
        at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2047) 
        at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:823) 
        at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1369) 
        at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:112) 
        at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1.apply(AppClient.scala:96) 
        at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1.apply(AppClient.scala:95) 
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244) 
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244) 
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33) 
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108) 
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:244) 
        at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108) 
        at org.apache.spark.deploy.client.AppClient$ClientEndpoint.tryRegisterAllMasters(AppClient.scala:95) 
        at org.apache.spark.deploy.client.AppClient$ClientEndpoint.org$apache$spark$deploy$client$AppClient$ClientEndpoint$$registerWithMaster(AppClient.scala:121) 
        at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anon$2$$anonfun$run$1.apply$mcV$sp(AppClient.scala:132) 
        at org.apache.spark.util.Utils$.tryOrExit(Utils.scala:1218) 
        at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anon$2.run(AppClient.scala:124) 
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) 
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) 
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) 
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
        at java.lang.Thread.run(Thread.java:745) 
2015-07-28 22:17:55,224 INFO  [Thread-0] storage.DiskBlockManager (Logging.scala:logInfo(59)) - Shutdown hook called 
2015-07-28 22:17:55,241 INFO  [Thread-0] util.Utils (Logging.scala:logInfo(59)) - Shutdown hook called 
2015-07-28 22:17:55,243 INFO  [Thread-0] util.Utils (Logging.scala:logInfo(59)) - Deleting directory /private/var/folders/8k/jfw576r50m97rlk5qpj1n4l80000gn/T/spark-f28e24e7-b798-4365-8209-409d8b27ad2f/userFiles-5ccb1927-1499-4deb-b4b2-92a24d8ab7a3 

The problem was that I was trying to start the example app in standalone cluster mode by passing

-Dspark.master=spark://myhost:7077

as an argument to the JVM. I launched the example app locally using

-Dspark.master=local

and it worked.
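For completeness, the same effect can be had without the VM option by setting the master directly on the SparkConf when running from the IDE. This is only a minimal sketch; the object name LocalPi is made up, and the Pi computation just mirrors what the bundled SparkPi example does:

import org.apache.spark.{SparkConf, SparkContext}

object LocalPi {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs an in-process master with all available cores;
    // plain "local" is equivalent to the -Dspark.master=local flag above.
    val conf = new SparkConf()
      .setAppName("Spark Pi (local)")
      .setMaster("local[*]")
    val sc = new SparkContext(conf)

    val n = 100000
    val inside = sc.parallelize(1 to n).map { _ =>
      val x = math.random * 2 - 1
      val y = math.random * 2 - 1
      if (x * x + y * y <= 1) 1 else 0
    }.reduce(_ + _)
    println(s"Pi is roughly ${4.0 * inside / n}")

    sc.stop()
  }
}

An explicit setMaster on the SparkConf overrides the spark.master system property, so this avoids the failed registration against spark://myhost:7077 regardless of what VM options IntelliJ passes.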

I know this is an old question, but just in case, for users who come here after installing the Spark chart on a Kubernetes cluster:

  • after chart installation, open the Spark UI on localhost:8080
  • figure out the Spark master URL, for example: Spark Master at spark://newbie-cricket-master:7077

  • then, on the master, run /bin/spark-shell --master spark://newbie-cricket-master:7077 (a quick sanity check to type in the shell is shown below)
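Once spark-shell comes up, a couple of lines at the scala> prompt confirm that the driver actually registered with that master. Here sc is the SparkContext the shell creates for you, and the master URL is just the example release name from above:

sc.master                          // should print spark://newbie-cricket-master:7077
sc.parallelize(1 to 1000).count()  // returns 1000 once executors have been granted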
