Category : airflow

I’m using the docker-compose.yml below. For some of my DAGs I need to install additional requirements. Is it possible to specify additional Python packages to install while using the official docker-compose file?

    version: '3'
    x-airflow-common:
      &airflow-common
      image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:master-python3.8}
      environment:
        &airflow-common-env
        AIRFLOW__CORE__EXECUTOR: CeleryExecutor
        AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
        AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
        AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
        AIRFLOW__CORE__FERNET_KEY: ''
        AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
        AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
        ..

Read more
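One common way to get extra packages into the official image is to extend it with a small Dockerfile and point the compose file at the resulting image. This is a sketch, not the official answer; the requirements.txt file name is an assumption:

```dockerfile
# Sketch: extend the official image and bake extra Python packages in.
# requirements.txt (assumed) lists the DAGs' extra dependencies.
FROM apache/airflow:master-python3.8
COPY requirements.txt /requirements.txt
RUN pip install --no-cache-dir -r /requirements.txt
```

Then either build and tag this image and set AIRFLOW_IMAGE_NAME to that tag, or swap the image: line in the x-airflow-common anchor for a build: section pointing at this Dockerfile.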

I am using the puckel airflow image; I upgraded the Airflow version to include Google support and installed apache-airflow-providers-google 2.2.0 as a dependency when I built the container, but when I go to Airflow on my localhost it still says missing module ‘google’. How can I solve this? Source: Docker..

Read more
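When a provider seems installed but Airflow still reports a missing module, a first step is to check inside the running container whether the module is importable at all (the image the webserver actually runs may not be the one that was rebuilt). A minimal stdlib-only check, with the Airflow module name shown only in a comment since Airflow may not be installed where this runs:

```python
import importlib.util

def module_available(name: str) -> bool:
    """Return True if `name` can be found on sys.path, without importing it."""
    return importlib.util.find_spec(name) is not None

# Inside the container you would check e.g. module_available("airflow.providers.google");
# here we demo with a stdlib module only.
print(module_available("json"))
```

If the check fails inside the container, the pip install happened in a different image (or a different Python) than the one the service runs.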

I have set up Airflow with docker-compose as described here: https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html And there is an Airflow task that has to execute a docker command, like:

    BashOperator(
        task_id='my',
        bash_command="""
        docker run …………..
        """,
        dag=dag,
    )

This means the Docker package is required in the Airflow docker image, but it is not there. So, I tried to build my own airflow ..

Read more
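A frequent pattern for this (a sketch, not from the linked guide) is to install the Docker CLI in a custom image and mount the host’s Docker socket into the compose services, so that `docker run` inside a task talks to the host daemon. The service name below mirrors the official file but is an assumption:

```yaml
# docker-compose override sketch: expose the host Docker daemon to the worker.
services:
  airflow-worker:
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
```

Alternatively, the DockerOperator from apache-airflow-providers-docker talks to the daemon through the same socket without shelling out to the CLI at all.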

I have been following this official documentation https://airflow.apache.org/docs/apache-airflow/stable/howto/connection.html to try to set up connections in Airflow. I want to access the URI of the connections in my Python scripts so that I can work with my databases. Running airflow connections get sqlite_default in bash gives me details of the connection. I want to do a similar ..

Read more
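Besides the CLI, Airflow also resolves connections from environment variables of the form AIRFLOW_CONN_&lt;CONN_ID_IN_UPPERCASE&gt;, whose value is the connection URI, and those can be read from any Python script with the stdlib. A minimal sketch; the sqlite URI below is made up for illustration:

```python
import os
from typing import Optional

def get_conn_uri(conn_id: str) -> Optional[str]:
    """Look up a connection URI the way Airflow reads env-var connections."""
    return os.environ.get(f"AIRFLOW_CONN_{conn_id.upper()}")

# Illustrative only: register a connection via the environment, then read it back.
os.environ["AIRFLOW_CONN_SQLITE_DEFAULT"] = "sqlite:///tmp/sqlite_default.db"
print(get_conn_uri("sqlite_default"))
```

Inside a running Airflow, BaseHook.get_connection(conn_id).get_uri() performs the full lookup, including connections stored in the metadata database rather than the environment.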

https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html I am following the above installation guide. At the step where it says: echo -e "AIRFLOW_UID=$(id -u)\nAIRFLOW_GID=0" > .env I think it’s supposed to create a .env file, but it isn’t creating anything for me. If I use the ls command, .env doesn’t show up. If I try to move on to the next ..

Read more
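Two things are worth checking here. First, .env is a dotfile, so plain ls hides it; ls -a will show it even when it was created fine. Second, echo -e is not portable (some shells print the -e literally or ignore the escapes), and printf sidesteps that. A sketch of the same step:

```shell
# printf handles \n portably, unlike `echo -e` in some shells.
printf 'AIRFLOW_UID=%s\nAIRFLOW_GID=0\n' "$(id -u)" > .env
# .env is a dotfile, so plain `ls` hides it; inspect it directly:
cat .env
```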

I’m trying out Apache Airflow with docker-compose, using the base container apache/airflow:2.0.1. I’m following this tutorial: https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html# How do you run a simple query to get data from a SQL Server database? At this stage, I’d just like to see if it’s possible. I’ve tried to extend the image:

    FROM apache/airflow:2.0.1
    RUN pip install ..

Read more
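One route, sketched here under the assumption that the Microsoft SQL Server provider is acceptable, is to extend the image with that provider and a driver:

```dockerfile
# Sketch: add SQL Server support to the base image.
FROM apache/airflow:2.0.1
RUN pip install --no-cache-dir apache-airflow-providers-microsoft-mssql pymssql
```

With that installed, the hooks and operators under airflow.providers.microsoft.mssql can run queries against a SQL Server connection defined in Airflow.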

I am using Airflow 2.0. I am trying to connect to Redshift, using a docker container on Mac. Here is my dag.py:

    from airflow import DAG
    from datetime import datetime, timedelta
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator, BranchPythonOperator
    from airflow.providers.jdbc.hooks.jdbc import JdbcHook

    # Creating JDBC connection using Conn ID
    JdbcConn = JdbcHook(jdbc_conn_id='Redshift_conn')

    def getconnection():
        JdbcConn.get_connection('Redshift')
        print("connected")

    default_args ..

Read more
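One thing that stands out in the snippet is the id mismatch: the hook is built with jdbc_conn_id='Redshift_conn', but get_connection is then called with 'Redshift'. A stdlib-only stand-in (the class and URI below are hypothetical, just to illustrate the lookup) shows why mismatched ids produce a "connection not defined" style error:

```python
# Hypothetical stand-in for JdbcHook: get_connection() must receive the same
# conn id the hook (and the Airflow UI) was configured with.
class FakeJdbcHook:
    _registry = {"Redshift_conn": "jdbc:redshift://example.host:5439/dev"}  # made-up URI

    def __init__(self, jdbc_conn_id: str):
        self.jdbc_conn_id = jdbc_conn_id

    def get_connection(self, conn_id: str) -> str:
        if conn_id not in self._registry:
            raise KeyError(f"Connection {conn_id!r} is not defined")
        return self._registry[conn_id]

hook = FakeJdbcHook(jdbc_conn_id="Redshift_conn")
print(hook.get_connection(hook.jdbc_conn_id))  # look up with the hook's own id
```

For Redshift specifically, PostgresHook over a Postgres-type connection is a common JDBC-free alternative, since Redshift speaks the PostgreSQL wire protocol.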