How‑To Guides
This section provides practical step‑by‑step instructions to accomplish common tasks in Apache Airflow. Choose a topic from the sidebar or explore the highlights below.
Creating your first DAG
Learn how to define a simple DAG that prints “Hello World”.
```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="hello_world",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # "schedule" replaces the deprecated "schedule_interval"
    catchup=False,
) as dag:
    BashOperator(
        task_id="print_hello",
        bash_command="echo 'Hello World!'",
    )
```
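With `catchup=False`, enabling the DAG schedules only the most recent completed interval; with catchup enabled, Airflow would backfill one run per day since `start_date`. A simplified sketch of that interval arithmetic in plain Python (not Airflow's actual timetable code):

```python
from datetime import datetime, timedelta

def daily_intervals(start, now):
    """Yield the logical date of each completed @daily interval."""
    day = timedelta(days=1)
    current = start
    # An interval is complete once its end (current + day) has passed.
    while current + day <= now:
        yield current
        current += day

runs = list(daily_intervals(datetime(2024, 1, 1), datetime(2024, 1, 4)))
# With catchup enabled, all completed intervals would run (Jan 1-3);
# with catchup=False, only the latest one does.
latest_only = runs[-1:]
```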
Using PythonOperator
Execute Python callables directly inside your tasks.
```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def greet():
    print("Greetings from Airflow!")

with DAG(
    "python_operator_example",
    start_date=datetime(2024, 1, 1),
    schedule=None,  # run only when triggered manually
    catchup=False,
) as dag:
    PythonOperator(
        task_id="greet_task",
        python_callable=greet,
    )
```
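Because `python_callable` is an ordinary Python function, you can unit-test it directly without running Airflow at all. A small sketch (plain Python, no Airflow imports) that captures what `greet` prints:

```python
import io
from contextlib import redirect_stdout

def greet():
    print("Greetings from Airflow!")

# Call the task's callable directly and capture its stdout.
buffer = io.StringIO()
with redirect_stdout(buffer):
    greet()

output = buffer.getvalue()
```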
Setting up Connections
Configure external services (e.g., Postgres, S3) via the UI or CLI.
```bash
# CLI example: the URI scheme ("postgres") determines the connection type
airflow connections add 'my_postgres' \
    --conn-uri 'postgres://user:password@host:5432/dbname'
```
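If the password contains URI-reserved characters such as `@`, `/`, or `:`, it must be percent-encoded or the URI will be misparsed. A quick sketch using only the Python standard library to build a safe `--conn-uri` value (the credentials below are placeholders):

```python
from urllib.parse import quote

user = "user"
password = "p@ss/word"  # placeholder password containing URI-unsafe characters
host, port, db = "host", 5432, "dbname"

# Percent-encode the userinfo so the URI parses unambiguously.
conn_uri = (
    f"postgres://{quote(user, safe='')}:{quote(password, safe='')}"
    f"@{host}:{port}/{db}"
)
# conn_uri -> "postgres://user:p%40ss%2Fword@host:5432/dbname"
```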
Deploying Airflow with Docker Compose
Quickly spin up a local development environment.
```bash
# Download the official docker-compose.yaml for the desired Airflow version
export AIRFLOW_VERSION=2.9.0
curl -LfO "https://airflow.apache.org/docs/apache-airflow/${AIRFLOW_VERSION}/docker-compose.yaml"

# Create the directories the compose file mounts and set the host user id
mkdir -p ./dags ./logs ./plugins ./config
echo "AIRFLOW_UID=$(id -u)" > .env

# Initialize the database, then start all services
docker compose up airflow-init
docker compose up
```