Category: airflow

I created a custom image with the following Dockerfile:

FROM apache/airflow:2.1.1-python3.8
USER root
RUN apt-get update && apt-get -y install gcc gnupg2 && curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add - && curl https://packages.microsoft.com/config/debian/10/prod.list > /etc/apt/sources.list.d/mssql-release.list
RUN apt-get update && ACCEPT_EULA=Y apt-get -y install msodbcsql17 && ACCEPT_EULA=Y apt-get -y install mssql-tools
RUN echo 'export PATH="$PATH:/opt/mssql-tools/bin"' >> ~/.bashrc
..


I am new to Airflow. I am trying to run a container from Airflow, but I am getting a timeout error:

[2021-07-21 07:02:06,176] {docker.py:231} INFO - Starting docker container from image python:3.9.2-slim
[2021-07-21 07:03:06,171] {taskinstance.py:1501} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 426, in _make_request
    six.raise_from(e, None)
  File "<string>", ..


I am getting an error when using DockerOperator from Airflow:

raise AirflowException(
airflow.exceptions.AirflowException: Invalid arguments were passed to DockerOperator (task_id: etl_in_ch). Invalid arguments were:
**kwargs: {'volumes': ['./CH_ETL/src:/usr/src/copy_data', './pyproject.toml:pyproject.toml']}

My code:

from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.docker_operator import DockerOperator
from airflow.operators.dummy_operator import DummyOperator

default_args = {
    'owner': 'airflow',
    'description': ..


I'm running Airflow on a Google VM server using Docker, and I have a bunch of tasks I want to run. I have my scripts inside the dag folder, and even though I'm using an absolute path to read a CSV file (os.path.dirname(os.path.realpath(__file__))) I get:

FileNotFoundError: [Errno 2] No such file or directory: '/opt/airflow/dags/bearish_abandonedbaby/updates3.csv'. ..
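For reference, a stdlib-only sketch of the usual pattern for resolving a file that lives next to the DAG (the underscores in `__file__` are frequently eaten when code is pasted into a page, which is probably what happened to `realpath(file)` above). Note the path will still fail inside Docker unless the CSV is actually present in whichever container executes the task, typically via the same ./dags volume mounted on webserver, scheduler and worker:

```python
import os

# Resolve the CSV relative to this DAG file, not the process's working dir.
# 'updates3.csv' is the filename from the error; the layout is assumed.
dag_dir = os.path.dirname(os.path.realpath(__file__))
csv_path = os.path.join(dag_dir, "updates3.csv")

# If this prints False inside the worker container, the dags volume
# (and the CSV with it) is probably not mounted into that container.
print(csv_path, os.path.exists(csv_path))
```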


I'm trying to set up an Airflow instance using docker-compose as described in the official docs, and I'm stuck at the airflow-init part. It looks like there is no connectivity between the containers, but I don't know how to fix it. Currently, I see this in my shell:

~/dwn $ docker-compose up airflow-init
51ad8448b197_dwn_redis_1 is up-to-date
70409dec742c_dwn_postgres_1 is up-to-date
..


I'm trying to set up an Airflow instance using docker-compose as described in the official docs, and I'm stuck at the airflow-init part. It looks like there is no connectivity between the containers, but I don't know how to fix it. I use literally the same docker-compose.yaml as described in the docs; it can be downloaded here: https://airflow.apache.org/docs/apache-airflow/stable/docker-compose.yaml Currently, I ..


I was running Airflow using the official docker-compose file, which I took down in order to add custom credentials, using the command:

docker-compose -f docker-compose.yaml down --volumes --rmi all

Now, after making the necessary changes, my UI is not showing DAGs, even though the dags directory has been mapped in the webserver, scheduler and worker containers. Even my containers are running ..
