
spark-submit not using YARN

I have set up a 5-node hadoop cluster with YARN, and Spark is set up on all 5 nodes as well. I am using spark-1.5.0-cdh5.5.0.

When I run

spark-shell --master yarn --num-executors 3

this launches a shell as expected and acquires resources from the RM through YARN. So I assume Spark is picking up the hadoop conf files as expected. However, when I do a spark-submit with

spark-submit word_count.py --master yarn-cluster --num-executors 3

it tries to connect to the Spark standalone master, which I believe should not be needed when running on YARN. The error is below:

    16/11/08 00:18:31 INFO util.Utils: Successfully started service 'HTTP file server' on port 47990.
16/11/08 00:18:31 INFO spark.SparkEnv: Registering OutputCommitCoordinator
16/11/08 00:18:41 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/11/08 00:18:41 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
16/11/08 00:18:41 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
16/11/08 00:18:41 INFO ui.SparkUI: Started SparkUI at http://10.0.0.4:4040
16/11/08 00:18:41 INFO util.Utils: Copying /home/rshaik26/word_count.py to /tmp/spark-0a5348f8-5ba8-4906-89af-7499054b554e/userFiles-287b5d13-123a-4bd6-9fe3-489af2a502a1/word_count.py
16/11/08 00:18:41 INFO spark.SparkContext: Added file file:/home/rshaik26/word_count.py at http://10.0.0.4:47990/files/word_count.py with timestamp 1478544521986
16/11/08 00:18:42 WARN metrics.MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
16/11/08 00:18:42 INFO client.AppClient$ClientEndpoint: Connecting to master spark://ubuntuhdp2:7077...
16/11/08 00:18:42 WARN client.AppClient$ClientEndpoint: Could not connect to ubuntuhdp2:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
16/11/08 00:18:42 ERROR akka.ErrorMonitor: AssociationError [akka.tcp://sparkDriver@10.0.0.4:53411] -> [akka.tcp://sparkMaster@ubuntuhdp2:7077]: Error [Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: ubuntuhdp2/10.0.0.4:7077
]
akka.event.Logging$Error$NoCause$
16/11/08 00:18:42 WARN client.AppClient$ClientEndpoint: Could not connect to ubuntuhdp2:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
16/11/08 00:18:42 ERROR akka.ErrorMonitor: AssociationError [akka.tcp://sparkDriver@10.0.0.4:53411] -> [akka.tcp://sparkMaster@ubuntuhdp2:7077]: Error [Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: ubuntuhdp2/10.0.0.4:7077
]
akka.event.Logging$Error$NoCause$
16/11/08 00:18:42 WARN client.AppClient$ClientEndpoint: Could not connect to ubuntuhdp2:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
16/11/08 00:18:42 ERROR akka.ErrorMonitor: AssociationError [akka.tcp://sparkDriver@10.0.0.4:53411] -> [akka.tcp://sparkMaster@ubuntuhdp2:7077]: Error [Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: ubuntuhdp2/10.0.0.4:7077
]
akka.event.Logging$Error$NoCause$
16/11/08 00:18:42 WARN client.AppClient$ClientEndpoint: Could not connect to ubuntuhdp2:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
16/11/08 00:18:42 ERROR akka.ErrorMonitor: AssociationError [akka.tcp://sparkDriver@10.0.0.4:53411] -> [akka.tcp://sparkMaster@ubuntuhdp2:7077]: Error [Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: ubuntuhdp2/10.0.0.4:7077
]
akka.event.Logging$Error$NoCause$
16/11/08 00:18:42 WARN client.AppClient$ClientEndpoint: Failed to connect to master ubuntuhdp2:7077
akka.actor.ActorNotFound: Actor not found for: ActorSelection[Actor[akka.tcp://sparkMaster@ubuntuhdp2:7077/]/user/Master]
    at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:66)
    at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:64)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
    at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
    at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
    at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:110)
    at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
    at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:269)
    at akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:512)
    at akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:545)
    at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:535)
    at akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:91)
    at akka.actor.ActorRef.tell(ActorRef.scala:125)
    at akka.dispatch.Mailboxes$$anon$1$$anon$2.enqueue(Mailboxes.scala:44)
    at akka.dispatch.QueueBasedMessageQueue$class.cleanUp(Mailbox.scala:438)
    at akka.dispatch.UnboundedDequeBasedMailbox$MessageQueue.cleanUp(Mailbox.scala:650)
    at akka.dispatch.Mailbox.cleanUp(Mailbox.scala:309)
    at akka.dispatch.MessageDispatcher.unregister(AbstractDispatcher.scala:204)
    at akka.dispatch.MessageDispatcher.detach(AbstractDispatcher.scala:140)
    at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:203)
    at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:163)
    at akka.actor.ActorCell.terminate(ActorCell.scala:338)
    at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:431)
    at akka.actor.ActorCell.systemInvoke(ActorCell.scala:447)
    at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:262)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:240)
    at akka.dispatch.Mailbox.run(Mailbox.scala:219)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
16/11/08 00:19:02 INFO client.AppClient$ClientEndpoint: Connecting to master spark://ubuntuhdp2:7077...
16/11/08 00:19:02 WARN client.AppClient$ClientEndpoint: Could not connect to ubuntuhdp2:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
16/11/08 00:19:02 ERROR akka.ErrorMonitor: AssociationError [akka.tcp://sparkDriver@10.0.0.4:53411] -> [akka.tcp://sparkMaster@ubuntuhdp2:7077]: Error [Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: ubuntuhdp2/10.0.0.4:7077
]
akka.event.Logging$Error$NoCause$
16/11/08 00:19:02 WARN client.AppClient$ClientEndpoint: Could not connect to ubuntuhdp2:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
16/11/08 00:19:02 ERROR akka.ErrorMonitor: AssociationError [akka.tcp://sparkDriver@10.0.0.4:53411] -> [akka.tcp://sparkMaster@ubuntuhdp2:7077]: Error [Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: ubuntuhdp2/10.0.0.4:7077
]
akka.event.Logging$Error$NoCause$
16/11/08 00:19:02 WARN client.AppClient$ClientEndpoint: Could not connect to ubuntuhdp2:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
16/11/08 00:19:02 ERROR akka.ErrorMonitor: AssociationError [akka.tcp://sparkDriver@10.0.0.4:53411] -> [akka.tcp://sparkMaster@ubuntuhdp2:7077]: Error [Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: ubuntuhdp2/10.0.0.4:7077
]
akka.event.Logging$Error$NoCause$
16/11/08 00:19:02 WARN client.AppClient$ClientEndpoint: Could not connect to ubuntuhdp2:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
16/11/08 00:19:02 ERROR akka.ErrorMonitor: AssociationError [akka.tcp://sparkDriver@10.0.0.4:53411] -> [akka.tcp://sparkMaster@ubuntuhdp2:7077]: Error [Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: ubuntuhdp2/10.0.0.4:7077
]
akka.event.Logging$Error$NoCause$
16/11/08 00:19:02 WARN client.AppClient$ClientEndpoint: Failed to connect to master ubuntuhdp2:7077
akka.actor.ActorNotFound: Actor not found for: ActorSelection[Actor[akka.tcp://sparkMaster@ubuntuhdp2:7077/]/user/Master]
    at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:66)
    at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:64)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
    at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
    at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
    at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:110)
    at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
    at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:269)
    at akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:512)
    at akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:545)
    at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:535)
    at akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:91)
    at akka.actor.ActorRef.tell(ActorRef.scala:125)
    at akka.dispatch.Mailboxes$$anon$1$$anon$2.enqueue(Mailboxes.scala:44)
    at akka.dispatch.QueueBasedMessageQueue$class.cleanUp(Mailbox.scala:438)
    at akka.dispatch.UnboundedDequeBasedMailbox$MessageQueue.cleanUp(Mailbox.scala:650)
    at akka.dispatch.Mailbox.cleanUp(Mailbox.scala:309)
    at akka.dispatch.MessageDispatcher.unregister(AbstractDispatcher.scala:204)
    at akka.dispatch.MessageDispatcher.detach(AbstractDispatcher.scala:140)
    at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:203)
    at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:163)
    at akka.actor.ActorCell.terminate(ActorCell.scala:338)
    at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:431)
    at akka.actor.ActorCell.systemInvoke(ActorCell.scala:447)
    at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:262)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:240)
    at akka.dispatch.Mailbox.run(Mailbox.scala:219)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

Please help me out if there is anything wrong with my configuration.

Here is spark-env.sh:

SPARK_JAVA_OPTS=-Dspark.driver.port=53411
HADOOP_CONF_DIR=/usr/lib/hadoop-2.6.0-cdh5.5.0/etc/hadoop/
SPARK_MASTER_IP=ubuntuhdp2
SPARK_DIST_CLASSPATH=$(hadoop classpath):/usr/lib/hadoop-2.6.0-cdh5.5.0/share/hadoop/tools/lib/*

spark-defaults.conf:

spark.master                     spark://ubuntuhdp2:7077
# spark.eventLog.enabled           true
# spark.eventLog.dir               hdfs://namenode:8021/directory
  spark.serializer                 org.apache.spark.serializer.KryoSerializer
# spark.driver.memory              5g
# spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"

I believe that:

spark-submit word_count.py --master yarn-cluster --num-executors 3

tells spark at launch: "I want to submit word_count.py, and my [application-arguments] are --master yarn-cluster --num-executors 3". Everything placed after the application file is treated as an application argument rather than a spark-submit option, so it falls back to the default master (spark://ubuntuhdp2:7077 from spark-defaults.conf).

Try the following:

 spark-submit --master yarn-cluster --num-executors 3 word_count.py 

Yes, this submits the Spark job to YARN, and when you specify yarn-cluster, your application runs on the cluster (on the YARN resource nodes).
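Since spark-defaults.conf pins spark.master to spark://ubuntuhdp2:7077, an alternative (just a sketch, assuming you want YARN to be the default even when --master is omitted) is to change that default:

# sketch for spark-defaults.conf, assuming YARN should be the default master
# spark.master                   spark://ubuntuhdp2:7077
spark.master                     yarn-client

With that change a plain spark-submit word_count.py would also go through YARN, although passing --master explicitly before the application file remains the clearer option.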

I tried this on my system. The clue was this log statement:

numExecutors has been set to the default value: 2

maxExecutors has been set to the default value: 2

Even when I increased the number of executors, I got the same result, because the options placed after the script were being ignored.

So the fix is simple:

spark-submit --master yarn-cluster --num-executors 3 word_count.py 
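For reference, here is a minimal word_count.py sketch (hypothetical, since the original script is not shown) that can be submitted with the command above; the HDFS input and output paths are placeholders:

# word_count.py -- minimal sketch (hypothetical; the original script is not shown)
from operator import add
from pyspark import SparkContext

if __name__ == "__main__":
    # No master is hard-coded here, so the one passed via --master
    # (or the default from spark-defaults.conf) is used.
    sc = SparkContext(appName="WordCount")

    # Placeholder HDFS paths; replace with real locations.
    lines = sc.textFile("hdfs:///user/rshaik26/input.txt")
    counts = (lines.flatMap(lambda line: line.split())
                   .map(lambda word: (word, 1))
                   .reduceByKey(add))
    counts.saveAsTextFile("hdfs:///user/rshaik26/word_count_output")

    sc.stop()

Note that the script itself does not set a master, so the --master flag placed before the application file (or the default from spark-defaults.conf) decides where it runs.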


