Sqoop - Hive import using Oozie failed
I am trying to execute a Sqoop import from Oracle to Hive, but the job fails with the following error:
WARN [main] conf.HiveConf (HiveConf.java:initialize(2472)) - HiveConf of name hive.auto.convert.sortmerge.join.noconditionaltask does not exist
Intercepting System.exit(1)
<<< Invocation of Main class completed <<<
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
Oozie Launcher failed, finishing Hadoop job gracefully
My hive-site.xml contains the metastore setting:

<property>
  <name>hive.metastore.uris</name>
  <value>thrift://sv2lxgsed01.xxxx.com:9083</value>
</property>
My workflow.xml is as follows:
<workflow-app name="WorkflowWithSqoopAction" xmlns="uri:oozie:workflow:0.1">
  <start to="sqoopAction"/>
  <action name="sqoopAction">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <command>import --connect jdbc:oracle:thin:@//sv2axcrmdbdi301.xxx.com:1521/DI3CRM --username xxxxxxx --password xxxxxx --table SIEBEL.S_ORG_EXT --hive-table eg.EQX_EG_CRM_S_ORG_EXT --hive-import -m 1</command>
      <file>/user/oozie/oozieProject/workflowSqoopAction/hive-site.xml</file>
    </sqoop>
    <ok to="end"/>
    <error to="killJob"/>
  </action>
  <kill name="killJob">
    <message>"Killed job due to error: ${wf:errorMessage(wf:lastErrorNode())}"</message>
  </kill>
  <end name="end" />
</workflow-app>
I can also see the data being loaded into HDFS.
You need to do two things:

1) Copy hive-site.xml into the Oozie workflow directory on HDFS.
2) In your Sqoop action, tell Oozie to use that hive-site.xml.
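The two steps can be sketched as follows. This is a minimal sketch, assuming the workflow directory shown in the question; adjust the paths and connection details to your cluster. The `<file>` element ships hive-site.xml with the launcher job so the Sqoop Hive import can find the metastore URI.

```xml
<!-- Step 1 (run on an edge node, illustrative path):
     hdfs dfs -put hive-site.xml /user/oozie/oozieProject/workflowSqoopAction/ -->

<!-- Step 2: reference the uploaded hive-site.xml from the Sqoop action -->
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
  <job-tracker>${jobTracker}</job-tracker>
  <name-node>${nameNode}</name-node>
  <command>import --connect jdbc:oracle:thin:@//host:1521/DB --username user --password pass --table SIEBEL.S_ORG_EXT --hive-table eg.EQX_EG_CRM_S_ORG_EXT --hive-import -m 1</command>
  <!-- Distribute hive-site.xml to the action's working directory -->
  <file>/user/oozie/oozieProject/workflowSqoopAction/hive-site.xml</file>
</sqoop>
```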