Tag: airflow-operator

I’m wondering where a task based on DockerOperator runs when Airflow is deployed with docker-compose. I mean, I have Airflow services deployed with docker-compose (from the official documentation): webserver, scheduler, worker, broker (Redis), and db (Postgres). I use the CeleryExecutor as well. So, I have a simple DAG with one task based ..

Read more
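
With the CeleryExecutor, the DockerOperator task itself executes on the Celery worker, but the container it launches runs as a sibling container on whatever Docker daemon the worker can reach, not nested inside the worker container. A common setup is to mount the host's Docker socket into the worker service; the fragment below is a sketch (the `airflow-worker` service name follows the official compose file, and an override file is one of several ways to apply it):

```yaml
# docker-compose.override.yml (sketch): give the airflow-worker service
# access to the host's Docker daemon, so containers started by
# DockerOperator appear as siblings on the host.
services:
  airflow-worker:
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
```

With this in place, `docker ps` on the host shows the operator's container alongside the compose services while the task runs.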

I have a DockerOperator in Airflow as: DockerOperator(api_version='auto', force_pull=True, command='/usr/local/test/test.txt student /usr/local/test/result.txt', image='test_image', mem_limit='5g', volumes=['/test/:/usr/local/test/'], task_id='test', docker_conn_id='docker_conn', dag=dag) Questions: I have the image test_image with two tags, one 0.1 and the other latest. Which one would be used by this operator? I suppose it would be latest, not 0.1. What would be its ..

Read more
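
The supposition in the excerpt is right: Docker treats an image reference without an explicit tag or digest as implicitly tagged `:latest`, so `image='test_image'` pulls and runs `test_image:latest`, not `test_image:0.1`. The stdlib-only helper below (a hypothetical illustration, not part of Airflow or docker-py) mimics that resolution rule:

```python
def resolve_image_tag(image: str) -> str:
    """Return the image reference Docker would actually use.

    A reference without an explicit tag or digest is implicitly
    tagged ':latest'.
    """
    # A digest reference ("name@sha256:...") is already fully pinned.
    if "@" in image:
        return image
    # A tag, if present, follows a ':' in the last path component
    # (so a registry port like "host:5000/img" is not mistaken for a tag).
    last_component = image.rsplit("/", 1)[-1]
    if ":" in last_component:
        return image
    return image + ":latest"
```

To run the `0.1` build instead, pin it explicitly: `image='test_image:0.1'`.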

I have a simple DAG with one task which needs to be dockerized. Based on the run of that script, which has variable1 inside it, I need to provide it to a callback function created in the DAG file. How could this be solved? with DAG( dag_id=DagName, default_args=default_args, schedule_interval='12 * * * *', on_failure_callback=callback_function(variable1), ) as dag: first_task ..

Read more
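
One common pattern (a sketch, not the poster's code): have the dockerized script print variable1 as its final output line, run the operator with xcom_push=True (do_xcom_push in Airflow 2), which pushes the container's last log line to XCom, and have the callback pull it from there via the task context. Note also that on_failure_callback expects a callable that receives the context dict; on_failure_callback=callback_function(variable1) calls the function at DAG-parse time instead. The stdlib-only helper below only illustrates the "last log line" extraction the operator performs:

```python
def last_stdout_line(stdout: str) -> str:
    """Return the last non-empty line of a container's stdout.

    DockerOperator with xcom_push=True pushes the final log line to
    XCom, where a callback can read it with ti.xcom_pull. This helper
    mimics only that extraction step.
    """
    lines = [line for line in stdout.splitlines() if line.strip()]
    return lines[-1] if lines else ""
```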

I am getting the error below while running Python code on Airflow. Airflow is deployed on AWS EC2 and was working fine until now; I am not sure why this error occurred. It looks like the API is not returning output on time while other tasks are running in parallel at the same time. Any guidance would be helpful. [2020-11-09 01:04:19,301] ..

Read more

I’m trying to use the DockerOperator to automate the execution of some scripts using Airflow. Airflow version: apache-airflow==1.10.12. What I want to do is to "copy" all my project’s files (with folders and files) to the container using this code. The following file, ml-intermediate.py, is in this directory ~/airflow/dags/ml-intermediate.py: """ Template to convert a ..

Read more
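
DockerOperator does not copy files into the container; it bind-mounts host directories via its `volumes` list of `host:container[:mode]` strings, and the host path must exist on the machine running the Docker daemon, not inside the scheduler container. A stdlib-only sketch of building such a spec (`volume_spec` is a hypothetical helper, not part of Airflow):

```python
from pathlib import Path

def volume_spec(host_dir: str, container_dir: str, mode: str = "rw") -> str:
    """Build the 'host:container:mode' bind string that DockerOperator's
    `volumes` parameter expects, expanding '~' on the host side."""
    host = Path(host_dir).expanduser()
    return f"{host}:{container_dir}:{mode}"
```

Usage would look like `volumes=[volume_spec("/home/me/project", "/app")]`, giving the container the whole project tree read-write under /app.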

I have a DAG in Airflow that uses the KubernetesPodOperator, and I am trying to get some files that are generated by the container running in the pod back to the Airflow host. For development, my host is a Docker container running an Airflow image with a docker-desktop K8s cluster, and for production I am ..

Read more
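
For small results, KubernetesPodOperator can hand data back without touching the Airflow host's filesystem at all: with do_xcom_push=True it runs a sidecar that reads JSON the main container writes to /airflow/xcom/return.json and pushes it to XCom. Larger files are usually shipped out via a shared volume or object storage instead. A minimal stdlib sketch of the in-pod side:

```python
import json
from pathlib import Path

def write_xcom_result(result, xcom_path="/airflow/xcom/return.json"):
    """Write `result` as JSON where the KubernetesPodOperator xcom
    sidecar expects it (do_xcom_push=True). `result` must be
    JSON-serializable. Returns the path written."""
    path = Path(xcom_path)
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(result))
    return path
```

Downstream tasks (or callbacks) then read the value with a normal xcom_pull on the pod task's id.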

I’m trying to use the DockerOperator in an Airflow pipeline. This is the code I’m using: from airflow import DAG from airflow.operators.bash_operator import BashOperator from datetime import datetime, timedelta from airflow.operators.docker_operator import DockerOperator default_args = { 'owner': 'airflow', 'description': 'Use of the DockerOperator', 'depend_on_past': False, 'start_date': datetime(2018, 1, 3), 'email_on_failure' ..

Read more
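
The excerpt's default_args dict cuts off at 'email_on_failure'; below is a hedged sketch of what such a dict typically looks like in DockerOperator tutorials, with illustrative values rather than the poster's actual truncated code. One detail worth flagging: the recognized Airflow key is 'depends_on_past', so the excerpt's 'depend_on_past' is silently ignored.

```python
from datetime import datetime, timedelta

# Typical default_args for a DockerOperator pipeline (illustrative
# values). Note the correct key 'depends_on_past'; the misspelling
# 'depend_on_past' would be ignored without any warning.
default_args = {
    "owner": "airflow",
    "description": "Use of the DockerOperator",
    "depends_on_past": False,
    "start_date": datetime(2018, 1, 3),
    "email_on_failure": False,
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}
```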