
Hadoop Streaming Job not working in Oozie

I am trying to write a simple map-only Hadoop streaming job that reads data from HDFS and pushes it to Vertica.

I have written a shell script as below:

./vsql -c "copy $TABLE from stdin delimiter E'\t' direct null '\\N';" -U $DBUSER -w $DBPWD -h $DBHOST -p $DBPORT
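As an aside, the shell quoting in that one-liner is worth checking: `$TABLE` is expanded by the shell, while the `E'\t'` delimiter must reach Vertica literally. A small sketch (`clicks` is a hypothetical table name) shows exactly what string `vsql` receives:

```shell
# Show the COPY statement the shell hands to vsql: $TABLE is expanded,
# \\N collapses to \N inside double quotes, and E'\t' is passed through
# untouched for Vertica to interpret as a tab delimiter.
TABLE=clicks    # hypothetical table name
copy_sql="copy $TABLE from stdin delimiter E'\t' direct null '\\N';"
echo "$copy_sql"
# prints: copy clicks from stdin delimiter E'\t' direct null '\N';
```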

I have created the Oozie workflow as:

 <action name="loadToVertica">
        <map-reduce>
                            <job-tracker>${jobTracker}</job-tracker>
                            <name-node>${nameNode}</name-node>
                            <prepare>
                                    <delete path="${nameNode}/user/$USER/output/${exportDataDate}"/>
                            </prepare>
                            <streaming>
                                    <mapper>shell export.sh</mapper>
                            </streaming>
                            <configuration>
                                    <property>
                                            <name>oozie.libpath</name>
                                            <value>${wfsBasePath}/libs</value>
                                    </property>
                                    <property>
                                            <name>mapred.input.dir</name>
                                            <value>${nameNode}/user/$USER/${exportDataDate}</value>
                                    </property>
                                    <property>
                                            <name>mapred.output.dir</name>
                                            <value>${nameNode}/user/$USER/output/${exportDataDate}</value>
                                    </property>
                                    <property>
                                            <name>mapred.reduce.tasks</name>
                                            <value>0</value>
                                    </property>
                            </configuration>
                            <file>${wfsBasePath}/libs/${STREAMING_JAR_PATH}#${STREAMING_JAR_PATH}</file>
                            <file>${wfsBasePath}/libs/oozie-sharelib-streaming-4.2.0.2.5.3.0-37.jar#oozie-sharelib-streaming-4.2.0.2.5.3.0-37.jar</file>
                            <file>${wfsBasePath}/scripts/export.sh#export.sh</file>
                            <file>${wfsBasePath}/config/vsql#vsql</file>
                    </map-reduce>
        <ok to="end"/>
        <error to="end"/>
        </action>
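A quick way to sanity-check a streaming mapper outside Oozie is to imitate what Hadoop streaming does: run the mapper command and feed it an input split on stdin. The sketch below uses a trivial stand-in mapper (an `awk` field count) in place of `sh export.sh`, which would need a live Vertica; the sample data is made up:

```shell
#!/bin/sh -e
# Imitate Hadoop streaming locally: the framework simply executes the
# mapper command and pipes the input split to its stdin. A stand-in
# mapper (awk counting tab-separated fields) replaces "sh export.sh".
printf '1\talice\n2\tbob\n' > /tmp/sample_split.tsv   # fake input split
fields=$(head -n1 /tmp/sample_split.tsv | awk -F'\t' '{print NF}')
echo "fields per row: $fields"
# prints: fields per row: 2
```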

When I run this, the status of the job is Failed/Killed, without any error message.

Adding -e after #!/bin/sh helped me trace what the actual error was.

After adding the -e option to the script, an error code appeared in the logs.

With this change, the first line looks like:

#!/bin/sh -e
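The behavior difference is easy to reproduce: without `-e`, a shell script exits with the status of its *last* command, so a failed `vsql` call mid-script can still leave the mapper exiting 0 and the streaming job with nothing useful in its logs. A minimal demonstration, using `false` as a stand-in for the failing command:

```shell
#!/bin/sh
# flaky.sh simulates a mapper whose middle command (e.g. vsql) fails.
cat > /tmp/flaky.sh <<'EOF'
false        # stand-in for a failing vsql call
echo "still running after the failure"
EOF

# Without -e the script continues past the failure and exits 0.
sh /tmp/flaky.sh >/dev/null && no_e=0 || no_e=$?
# With -e the script aborts at the failed command with a nonzero status,
# which is what finally surfaced an error code in the job logs.
sh -e /tmp/flaky.sh >/dev/null && with_e=0 || with_e=$?

echo "exit without -e: $no_e"   # prints: exit without -e: 0
echo "exit with -e: $with_e"    # nonzero
```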
