
Running oozie in local mode gives error

I am trying to run an Oozie job using the XML below. However, the action fails with the error:

Main class [org.apache.oozie.action.hadoop.SparkMain], exit code [101]

On analysis of the logs I observed that the error was caused by java.lang.ClassNotFoundException: Mainclass. However, Mainclass exists in the jar at the HDFS location specified in the XML below. Here is my code:

<action name="action1" cred="hive_credentials">
                <spark xmlns="uri:oozie:spark-action:0.2">
                        <job-tracker>${jobTracker}</job-tracker>
                        <name-node>${nameNode}</name-node>
                        <master>local[*]</master>
                        <name>name</name>
                        <class>Mainclass</class>
                        <jar>${jar1}</jar>
                        <spark-opts>
                                --files hive-site.xml --conf spark.yarn.security.tokens.hive.enabled=false
                        </spark-opts>
                        <arg>arg1</arg>
                        <file>${nameNode}/test/${wf:user()}/hive-site.xml</file>
                </spark>
                <ok to="end" />
                <error to="kill_job" />
        </action>

What could be the issue?

I resolved the issue by:

1) Creating a "lib" folder directly next to the workflow XML

2) Copying the Mainclass jar into the lib folder

3) Specifying only the jar name in the <jar> tag, not the fully qualified HDFS path
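The steps above can be sketched as the following layout and corrected action snippet (the application directory name and jar file name here are illustrative, not from the original post):

```xml
<!-- HDFS layout after the fix:
       /user/me/oozie-app/workflow.xml
       /user/me/oozie-app/lib/mainclass.jar   (jar containing Mainclass)
     Oozie automatically adds every jar in the workflow's lib/ directory
     to the action's classpath, so the class can be resolved at launch. -->
<spark xmlns="uri:oozie:spark-action:0.2">
    <!-- ... job-tracker, name-node, master, name as before ... -->
    <class>Mainclass</class>
    <jar>mainclass.jar</jar>  <!-- jar name only, not a full hdfs:// path -->
    <!-- ... spark-opts, arg, file as before ... -->
</spark>
```

With the jar in lib/, the earlier ClassNotFoundException no longer occurs because the launcher's classpath now includes it directly rather than relying on the path given in the <jar> element.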
