
How to increase the logging output for a spark-submit job on Bluemix?

I have submitted a Python job to Bluemix Spark as a Service and it failed. Unfortunately, the logging output is too sparse and gives no hint as to why the job failed.

How can I increase the log level of the output?

Output from Spark as a Service:

==== Failed Status output =====================================================

Getting status
HTTP/1.1 200 OK
Server: nginx/1.8.0
Date: Thu, 12 May 2016 19:09:30 GMT
Content-Type: application/json;charset=utf-8
Content-Length: 850
Connection: keep-alive

{
  "action" : "SubmissionStatusResponse",
  "driverState" : "ERROR",
  "message" : "Exception from the cluster:
org.apache.spark.SparkUserAppException: User application exited with 255
    org.apache.spark.deploy.PythonRunner$.main(PythonRunner.scala:88)
    org.apache.spark.deploy.PythonRunner.main(PythonRunner.scala)
    sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
    sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55)
    java.lang.reflect.Method.invoke(Method.java:507)
    org.apache.spark.deploy.ego.EGOClusterDriverWrapper$$anon$3.run(EGOClusterDriverWrapper.scala:430)",
  "serverSparkVersion" : "1.6.0",
  "submissionId" : "xxxxxx",
  "success" : true
}
===============================================================================

I have run the same job successfully on a BigInsights cluster, and when running there I also get much more detailed output.

Download the stderr-%timestamp% and stdout-%timestamp% files from the cluster into the local directory from which you ran spark-submit.sh. You will usually find the cause of the job failure in those two files.
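Those two files capture whatever the driver process writes to standard output and standard error. As a minimal sketch of how a job can make its failure cause explicit in them, assuming a plain PySpark entry point (run_job and the app name are illustrative, not from the original post):

import sys
import traceback

from pyspark import SparkContext


def run_job(sc):
    # Hypothetical placeholder for the real job logic.
    rdd = sc.parallelize(range(10))
    print("count: %d" % rdd.count())  # goes to stdout-%timestamp%


if __name__ == "__main__":
    sc = SparkContext(appName="debug-logging-example")  # illustrative app name
    try:
        run_job(sc)
    except Exception:
        # Print the full traceback to stderr so the cause of the failure
        # shows up in the stderr-%timestamp% file downloaded by spark-submit.sh.
        traceback.print_exc(file=sys.stderr)
        sys.exit(1)
    finally:
        sc.stop()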

Reference: http://spark.apache.org/docs/latest/spark-standalone.html#monitoring-and-logging
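To get more detail into those files in the first place, one option (a sketch, not specific to Bluemix) is to raise the log level from inside the application with SparkContext.setLogLevel, which is available in Spark 1.6 (the version reported in the status output above):

from pyspark import SparkContext

sc = SparkContext(appName="verbose-logging-example")  # illustrative app name

# Valid levels are ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN.
# DEBUG produces considerably more log4j output on the driver; it is written
# to stderr and therefore ends up in the downloaded stderr-%timestamp% file.
sc.setLogLevel("DEBUG")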
