
Airflow Docker Operator unable to find .sock file on local machine

I want to run a docker container containing a python script on a schedule with Airflow. I'm running into an issue when running my DockerOperator task through the Airflow CLI locally.

--------------------------------------------------------------------------------
Starting attempt 1 of 4
--------------------------------------------------------------------------------

[2018-10-31 15:20:10,760] {models.py:1569} INFO - Executing <Task(DockerOperator): amplitude_to_s3_docker> on 2018-10-02T00:00:00+00:00
[2018-10-31 15:20:10,761] {base_task_runner.py:124} INFO - Running: ['bash', '-c', 'airflow run get_amplitude_docker_dag amplitude_to_s3_docker 2018-10-02T00:00:00+00:00 --job_id 19 --raw -sd DAGS_FOLDER/amplitude_to_s3_docker_dag.py --cfg_path /var/folders/ys/83xq3b3d1qv3zfx3dtkkp9tc0000gn/T/tmp_lu9mgzz']
[2018-10-31 15:20:12,501] {base_task_runner.py:107} INFO - Job 19: Subtask amplitude_to_s3_docker [2018-10-31 15:20:12,501] {__init__.py:51} INFO - Using executor SequentialExecutor
[2018-10-31 15:20:13,465] {base_task_runner.py:107} INFO - Job 19: Subtask amplitude_to_s3_docker [2018-10-31 15:20:13,464] {models.py:258} INFO - Filling up the DagBag from /Users/thisuser/Projects/GitRepos/DataWarehouse/dags/amplitude_to_s3_docker_dag.py
[2018-10-31 15:20:13,581] {base_task_runner.py:107} INFO - Job 19: Subtask amplitude_to_s3_docker [2018-10-31 15:20:13,581] {example_kubernetes_operator.py:54} WARNING - Could not import KubernetesPodOperator: No module named 'kubernetes'
[2018-10-31 15:20:13,582] {base_task_runner.py:107} INFO - Job 19: Subtask amplitude_to_s3_docker [2018-10-31 15:20:13,582] {example_kubernetes_operator.py:55} WARNING - Install kubernetes dependencies with:     pip install airflow['kubernetes']
[2018-10-31 15:20:13,770] {base_task_runner.py:107} INFO - Job 19: Subtask amplitude_to_s3_docker [2018-10-31 15:20:13,770] {cli.py:492} INFO - Running <TaskInstance: get_amplitude_docker_dag.amplitude_to_s3_docker 2018-10-02T00:00:00+00:00 [running]> on host 254.1.168.192.in-addr.arpa
[2018-10-31 15:20:13,804] {docker_operator.py:169} INFO - Starting docker container from image amplitude
[2018-10-31 15:20:13,974] {models.py:1736} ERROR - create_container() got an unexpected keyword argument 'cpu_shares'
Traceback (most recent call last):
  File "/Users/thisuser/anaconda/lib/python3.5/site-packages/airflow/models.py", line 1633, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/Users/thisuser/anaconda/lib/python3.5/site-packages/airflow/operators/docker_operator.py", line 210, in execute
    working_dir=self.working_dir
TypeError: create_container() got an unexpected keyword argument 'cpu_shares'

The script runs fine outside of Airflow, using this command:

docker run amplitude get_amplitude.py 2018-10-02 2018-10-02

Here is my DAG and task file:

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.docker_operator import DockerOperator
from datetime import datetime, timedelta


default_args = {
    "owner": "airflow",
    "depends_on_past": False,
    "start_date": datetime(2018, 10, 30),
    "email": ["me@myemail.com"],
    "email_on_failure": True,
    "email_on_retry": False,
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
}

dag = DAG("get_amplitude_docker_dag", default_args=default_args, schedule_interval=timedelta(minutes=10))

templated_command = """
    get_amplitude.py {{ ds }} {{ ds }}
"""

t1 = DockerOperator(
    task_id='amplitude_to_s3_docker',
    command=templated_command,
    image='amplitude',
    dag=dag
)
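
For reference, the DockerOperator also exposes docker_url and api_version parameters. Spelling them out explicitly should look roughly like the sketch below; docker_url here is the documented default, and api_version='auto' is my assumption for letting the client negotiate the API version:

t1 = DockerOperator(
    task_id='amplitude_to_s3_docker',
    command=templated_command,
    image='amplitude',
    docker_url='unix://var/run/docker.sock',  # default Docker daemon socket
    api_version='auto',  # assumption: let the docker client negotiate the API version
    dag=dag
)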

After initializing the local Airflow DB and bringing up the webserver and scheduler, I run my DAG task with:

airflow run get_amplitude_docker_dag amplitude_to_s3_docker 2018-10-02
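
(For a quick one-off check, I believe airflow test also works here and runs the task without recording state in the database:)

airflow test get_amplitude_docker_dag amplitude_to_s3_docker 2018-10-02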

Additionally, the task runs fine through Airflow if I configure it as a BashOperator:

templated_command = """
   docker run amplitude get_amplitude.py {{ ds }} {{ ds }} 
"""


t1 = BashOperator(
    task_id="amplitude_to_s3",
    bash_command=templated_command,
    params={},
    dag=dag,
)

I've read that there can be issues mounting the Docker daemon, but my .sock file is located exactly where the default docker_url parameter points: /var/run/docker.sock.
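
(A quick sanity check from the shell that the socket actually exists and is readable:)

ls -l /var/run/docker.sock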

Can anyone help me configure this job?

The actual error is TypeError: create_container() got an unexpected keyword argument 'cpu_shares', which means the installed create_container function does not accept cpu_shares as an argument.

I got the same error with version 3.5.1 of the docker Python library; downgrading to version 2.7.0 (which appears to be the latest version whose create_container still accepts the cpu_shares parameter) fixed the problem.

Try running this to downgrade the docker library:

sudo pip3 install docker==2.7.0
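
To confirm which version Airflow will pick up afterwards, you can print the library version from the same Python environment (assuming python3 is the interpreter Airflow runs under):

python3 -c "import docker; print(docker.__version__)"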
