
Hadoop Oozie Workflow not getting Coordinator properties

I have a simple Oozie coordinator and workflow. I'm trying to pass the coordinator's dataIn property to the workflow as described here: https://oozie.apache.org/docs/3.2.0-incubating/CoordinatorFunctionalSpec.html#a6.7.1._coord:dataInString_name_EL_Function

For some reason, the value is empty in the workflow's properties, so the EL variable ${inputDir} in the following example cannot be resolved.

The actual error is: variable [inputDir] cannot be resolved

Config

coordinator.xml

<?xml version="1.0" encoding="UTF-8"?>
<coordinator-app xmlns="uri:oozie:coordinator:0.4" name="awesome" frequency="${coord:days(1)}" start="2014-10-06T00:01Z" end="2050-01-01T00:01Z" timezone="UTC">
  <controls>
    <!-- Wait 23 hours before giving up -->
    <timeout>1380</timeout>
    <concurrency>1</concurrency>
    <execution>LIFO</execution>
  </controls>
  <datasets>
    <dataset name="itsready" frequency="${coord:days(1)}" initial-instance="2014-10-06T08:00Z" timezone="America/Los_Angeles">
      <uri-template>${s3DataPath}/${YEAR}-${MONTH}-${DAY}</uri-template>
      <!-- with the done-flag set to none, this will look for the folder's existence -->
      <done-flag></done-flag>
    </dataset>
    <!-- output dataset -->
    <dataset name="itsdone" frequency="${coord:days(1)}" initial-instance="2014-10-06T08:00Z" timezone="America/Los_Angeles">
      <uri-template>${dataPath}/awesome/sql-schema-tab-delim-load/${YEAR}-${MONTH}-${DAY}/loaded</uri-template>
    </dataset>
  </datasets>
  <input-events>
    <data-in name="input" dataset="itsready">
      <instance>${coord:current(0)}</instance>
    </data-in>
  </input-events>
  <output-events>
    <data-out name="output" dataset="itsdone">
      <instance>${coord:current(0)}</instance>
    </data-out>
  </output-events>
  <action>
    <workflow>
      <app-path>${workflowApplicationPath}</app-path>
      <configuration>
        <property>
          <name>inputDir</name>
          <value>${coord:dataIn('input')}</value>
        </property>
      </configuration>
    </workflow>
  </action>
</coordinator-app>

workflow.xml

<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns="uri:oozie:workflow:0.4" name="awesome-wf">
  <start to="shell-import"/>
  <action name="shell-import">
    <shell xmlns="uri:oozie:shell-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <exec>${importFile}</exec>
      <env-var>INPUT_DIR=${inputDir}</env-var>
      <file>${importFile}#${importFile}</file>
    </shell>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>it failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="end"/>
</workflow-app>

job.properties

hadoopMaster=myawesome.server.com
nameNode=hdfs://${hadoopMaster}:8020
jobTracker=${hadoopMaster}:8050
tzOffset=-8
oozie.use.system.libpath=true
oozie.libpath=/user/oozie/share/lib
appPath=${nameNode}/apps
dataPath=${appPath}/data
s3DataPath=s3n://an/awesome/s3/data/path

oozie.wf.action.notification.url=https://zapier.com/mysecreturl
workflowApplicationPath=${appPath}/awesome

#uncomment both of these lines to test the workflow
#inputDir=s3://awesome/path/2014-10-06
#oozie.wf.application.path=${workflowApplicationPath}

oozie.coord.application.path=${workflowApplicationPath}


importFile=import.sh

Here's the gist: https://gist.github.com/nathantsoi/dc8caac7109a57c99399#file-awesome-oozie-config-md

Finally had a chance to revisit this. It's working now, but it could be for a variety of reasons. For posterity, here's what I changed:

  • removed the empty done-flag
  • used coord:dataOut instead of coord:dataIn
  • added another data-out event, giving each one a unique name

It would take some debugging to pin down the exact cause.
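For reference, here is a sketch of what the revised coordinator action block might look like, based on one reading of the bullets above (using coord:dataOut against the existing "output" data-out event; this is illustrative, not the poster's confirmed diff):

```xml
<!-- sketch: pass the output dataset URI instead of the input one;
     assumes the "output" data-out event from the original coordinator.xml -->
<action>
  <workflow>
    <app-path>${workflowApplicationPath}</app-path>
    <configuration>
      <property>
        <name>inputDir</name>
        <value>${coord:dataOut('output')}</value>
      </property>
    </configuration>
  </workflow>
</action>
```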

This is pretty easy to answer: your coordinator's start time is 2014-10-06T00:01Z, while the initial instance of your dataset is 2014-10-06T08:00Z, so ${coord:current(0)} cannot resolve to a valid dataset instance for the first run of the coordinator.
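One way to address this (an illustrative fragment, not a confirmed fix) is to move the coordinator's start time up to match the dataset's initial instance, so the first materialization has an instance to resolve:

```xml
<!-- align the coordinator start with the dataset's initial-instance,
     so coord:current(0) resolves on the very first run -->
<coordinator-app xmlns="uri:oozie:coordinator:0.4" name="awesome"
                 frequency="${coord:days(1)}"
                 start="2014-10-06T08:00Z" end="2050-01-01T00:01Z"
                 timezone="UTC">
```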

It seems that you are running only the workflow, not the coordinator.

If you want the coordinator to fill in those parameters, you need to run the coordinator, which will run the workflow once the data is ready.
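As a sanity check, submitting the coordinator rather than the bare workflow looks roughly like this (the server URL is a placeholder; job.properties must point at oozie.coord.application.path, with the oozie.wf.application.path line commented out, as in the question):

```shell
# submit and start the coordinator; Oozie will materialize a workflow
# run once each day's input dataset directory exists
oozie job -oozie http://myawesome.server.com:11000/oozie \
          -config job.properties -run

# inspect the coordinator to confirm it materialized actions
# and resolved ${inputDir} from the dataset instance
oozie job -oozie http://myawesome.server.com:11000/oozie \
          -info <coordinator-job-id>
```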

There are two possibilities:

  1. Instead of ${YEAR}-${MONTH}-${DAY} in the URI template ${s3DataPath}/${YEAR}-${MONTH}-${DAY}, try hard-coding the value with a complete HDFS path (like hdfs://namenode:8020/user/data/s3DataPath/2012-10-10) and then check whether the EL functions are substituting the date format correctly. If not, check the formatter and define it correctly.

  2. It could be that the input uses the same value ${coord:current(0)}. Try changing it to ${coord:current(1)}.

Maybe this will help.
