
Using HBase put command in airflow bashOperator

I'm trying to insert some data into an HBase table with an Airflow BashOperator task. My first attempt calls the hbase shell and then tries to run a put:

logg_data_to_hbase = BashOperator(
    task_id='data_to_hbase',
    dag=test_dag,
    bash_command="hbase shell && put 'tablename', 'rowname','columnvalue', 1000")

The task fails with ERROR - Bash command failed:

[2022-01-06 11:01:17,077] {bash_operator.py:100} INFO - Temporary script location: /tmp/airflowtmpcKRT8C/data_to_hbaseY7y25j

[2022-01-06 11:01:17,077] {bash_operator.py:110} INFO - Running command: hbase shell && put 'tablename', 'rowname','columnvalue', 1000

[2022-01-06 11:01:17,091] {bash_operator.py:119} INFO - Output:

[2022-01-06 11:01:28,659] {bash_operator.py:123} INFO - /tmp/airflowtmpcKRT8C/data_to_hbaseY7y25j: line 1: put: command not found

[2022-01-06 11:01:28,660] {bash_operator.py:127} INFO - Command exited with return code 127

[2022-01-06 11:01:28,672] {models.py:1788} ERROR - Bash command failed

Traceback (most recent call last):
  File "/opt/python-2.7.16-AF-1.10.2-XXX/lib/python2.7/site-packages/airflow/models.py", line 1652, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/opt/python-2.7.16-AF-1.10.2-XXX/lib/python2.7/site-packages/airflow/operators/bash_operator.py", line 131, in execute
    raise AirflowException("Bash command failed")
AirflowException: Bash command failed

What do I need to change in order to execute the put-command?

The bash_command needs to be changed. put is an HBase shell command, not a program on the system PATH, so when bash reaches it after `hbase shell &&` it reports put: command not found (the `&&` runs the second command in bash only after the first exits, rather than inside the HBase shell). Echo the put statement and pipe it into hbase shell instead:

bash_command = "echo \"put 'tablename', 'rowname','columnvalue', '1000'\" | hbase shell"

This way the put statement is fed to hbase shell on its standard input via the pipe (|), so it is interpreted by the HBase shell rather than by bash.
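As a minimal sketch, the corrected command string can be assembled in the DAG file before being passed as bash_command to the BashOperator. The hbase_put_command helper below is hypothetical (not part of Airflow or HBase); the table, row, and column names come from the question:

```python
def hbase_put_command(table, row, column, value):
    # Build the HBase shell statement, quoting each argument as the
    # shell expects, then echo it into hbase shell's standard input so
    # it is interpreted by the HBase shell rather than by bash.
    put = "put '{}', '{}', '{}', '{}'".format(table, row, column, value)
    return 'echo "{}" | hbase shell'.format(put)

cmd = hbase_put_command('tablename', 'rowname', 'columnvalue', 1000)
print(cmd)
# echo "put 'tablename', 'rowname', 'columnvalue', '1000'" | hbase shell
```

The resulting string is exactly what the answer above passes as bash_command; building it with a helper just avoids hand-escaping the nested quotes.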
