
SparkContext Error - File not found /tmp/spark-events does not exist

I am running a Python Spark application via an API call. Submitting the application fails, so I SSH into the worker to investigate.

My Python application exists at

/root/spark/work/driver-id/wordcount.py

The error can be found in

/root/spark/work/driver-id/stderr

It shows the following error:

Traceback (most recent call last):
  File "/root/wordcount.py", line 34, in <module>
    main()
  File "/root/wordcount.py", line 18, in main
    sc = SparkContext(conf=conf)
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 115, in __init__
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 172, in _do_init
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 235, in _initialize_context
  File "/root/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 1064, in __call__
  File "/root/spark/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.io.FileNotFoundException: File file:/tmp/spark-events does not exist.
  at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:402)
  at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:255)
  at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:100)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:549)
  at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
  at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
  at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
  at py4j.Gateway.invoke(Gateway.java:214)
  at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
  at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
  at py4j.GatewayConnection.run(GatewayConnection.java:209)
  at java.lang.Thread.run(Thread.java:745)

It indicates that /tmp/spark-events does not exist, which is true. However, in wordcount.py:

from pyspark import SparkContext, SparkConf

... few more lines ...

def main():
    conf = SparkConf().setAppName("MyApp").setMaster("spark://ec2-54-209-108-127.compute-1.amazonaws.com:7077")
    sc = SparkContext(conf=conf)
    sc.stop()

if __name__ == "__main__":
    main()
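
For context, the traceback shows the failure comes from EventLoggingListener.start, so event logging is enabled and pointing at a missing directory. The event-log location can also be set explicitly from the driver's SparkConf; a minimal sketch, assuming event logging should stay enabled and that the directory has already been created on the master:

from pyspark import SparkContext, SparkConf

def main():
    # Explicitly configure event logging; file:///tmp/spark-events is an
    # assumption here and must already exist on the master node.
    conf = (SparkConf()
            .setAppName("MyApp")
            .setMaster("spark://ec2-54-209-108-127.compute-1.amazonaws.com:7077")
            .set("spark.eventLog.enabled", "true")
            .set("spark.eventLog.dir", "file:///tmp/spark-events"))
    sc = SparkContext(conf=conf)
    sc.stop()

if __name__ == "__main__":
    main()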

/tmp/spark-events is the location where Spark stores the event logs. Just create this directory on the master machine and you're set.

$ mkdir /tmp/spark-events
$ sudo /root/spark-ec2/copy-dir /tmp/spark-events/
RSYNC'ing /tmp/spark-events to slaves...
ec2-54-175-163-32.compute-1.amazonaws.com

While trying to set up my Spark history server on my local machine, I hit the same 'File file:/tmp/spark-events does not exist.' error. I had customized my log directory to a non-default path. To resolve this, I needed to do two things.

  1. Edit $SPARK_HOME/conf/spark-defaults.conf and add these two lines (see the example below): spark.history.fs.logDirectory /mycustomdir and spark.eventLog.enabled true
  2. Create a link from /tmp/spark-events to /mycustomdir:
    ln -fs /tmp/spark-events /mycustomdir
Ideally, step 1 would have solved my issue entirely, but I still needed to create the link, so I suspect there might have been one other setting I missed. Anyhow, once I did this, I was able to run my history server and see new jobs logged in my web UI.
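
For reference, the two properties from step 1 go on separate lines in $SPARK_HOME/conf/spark-defaults.conf; assuming /mycustomdir as the custom log directory, they would look like:

spark.history.fs.logDirectory   /mycustomdir
spark.eventLog.enabled          true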

Use spark.eventLog.dir for the client/driver program

spark.eventLog.dir=/usr/local/spark/history

and use spark.history.fs.logDirectory for the history server

spark.history.fs.logDirectory=/usr/local/spark/history

as mentioned in: How to enable spark-history server for standalone cluster non hdfs mode
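
Assuming /usr/local/spark/history is used for both properties on a standalone cluster, a minimal sequence would be to create the directory and then start the bundled history server script:

mkdir -p /usr/local/spark/history
$SPARK_HOME/sbin/start-history-server.sh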

At least as of Spark version 2.2.1:

I just created /tmp/spark-events on the {master} node and then distributed it to the other nodes on the cluster for this to work.

mkdir /tmp/spark-events
rsync -a /tmp/spark-events {slaves}:/tmp/spark-events

My spark-default.conf:

spark.history.ui.port=18080
spark.eventLog.enabled=true
spark.history.fs.logDirectory=hdfs:///home/elon/spark/events
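
Since this points at HDFS, the event-log directory has to exist there before applications or the history server start; assuming the same path as above, it can be created with:

hdfs dfs -mkdir -p hdfs:///home/elon/spark/events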

When I edit the two files spark-default.conf and spark_env.sh as follows, the history server starts.

spark-default.conf:

spark.eventLog.enabled           true
spark.history.ui.port=18080
spark.history.fs.logDirectory={host}:{port}/directory

spark_env.sh:

export SPARK_HISTORY_OPTS="
-Dspark.history.ui.port=18080
-Dspark.history.fs.logDirectory={host}:{port}/directory
-Dspark.history.retainedApplications=30"
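
Here {host}:{port}/directory is a placeholder for the shared event-log location; filled in with a hypothetical HDFS namenode, the same export might read:

export SPARK_HISTORY_OPTS="
-Dspark.history.ui.port=18080
-Dspark.history.fs.logDirectory=hdfs://namenode:8020/directory
-Dspark.history.retainedApplications=30"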
