Deploying Apache Airflow with Docker
This guide demonstrates how to quickly get an Apache Airflow deployment up and running using Docker and Docker Compose. The configuration follows the official Docker installation guide and adds a few tweaks for observability and security; treat it as a solid starting point, and harden the default credentials and exposed settings before calling it production-ready.
Prerequisites
- Docker Engine ≥ 20.10
- Docker Compose ≥ 2.0
- At least 4 GB RAM allocated to Docker
- Basic knowledge of docker and docker compose
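You can confirm the installed versions before proceeding:

docker --version
docker compose version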
Docker Compose Setup
Create a directory for your deployment and place the following docker-compose.yaml file inside it.
# docker-compose.yaml
version: "3.8"

services:
  postgres:
    image: postgres:15-alpine
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    volumes:
      - postgres-db:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U airflow"]
      interval: 10s
      timeout: 5s
      retries: 5

  redis:
    image: redis:7-alpine
    command: ["redis-server", "--appendonly", "yes"]
    volumes:
      - redis-data:/data

  airflow-webserver:
    image: apache/airflow:2.9.0-python3.11
    restart: always
    depends_on:
      postgres:
        condition: service_healthy
      redis:
        condition: service_started
    # The anchor lets the scheduler (below) reuse this environment block
    environment: &airflow_webserver_environment
      # Core
      AIRFLOW__CORE__EXECUTOR: CeleryExecutor
      # Airflow 2.3+ reads the DB connection from the [database] section
      AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
      AIRFLOW__CORE__FERNET_KEY: ''  # set a real key in production (see below)
      AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
      # Celery
      AIRFLOW__CELERY__BROKER_URL: redis://redis:6379/0
      AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
      # Webserver (handy for debugging; use 'non-sensitive-only' or 'false' in production)
      AIRFLOW__WEBSERVER__EXPOSE_CONFIG: 'true'
    ports:
      - "8080:8080"
    volumes:
      - ./dags:/opt/airflow/dags
      - ./logs:/opt/airflow/logs
      - ./plugins:/opt/airflow/plugins
    command: webserver

  airflow-scheduler:
    image: apache/airflow:2.9.0-python3.11
    restart: always
    depends_on:
      airflow-webserver:
        condition: service_started
    environment: *airflow_webserver_environment
    volumes:
      - ./dags:/opt/airflow/dags
      - ./logs:/opt/airflow/logs
      - ./plugins:/opt/airflow/plugins
    command: scheduler

volumes:
  postgres-db:
  redis-data:
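Note that the CeleryExecutor only queues work; at least one Celery worker service is required to actually execute tasks. A minimal sketch of such a service, added under services: and reusing the environment anchor defined above:

  airflow-worker:
    image: apache/airflow:2.9.0-python3.11
    restart: always
    depends_on:
      airflow-scheduler:
        condition: service_started
    environment: *airflow_webserver_environment
    volumes:
      - ./dags:/opt/airflow/dags
      - ./logs:/opt/airflow/logs
      - ./plugins:/opt/airflow/plugins
    command: celery worker

You can scale workers horizontally later with docker compose up -d --scale airflow-worker=3.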
Save the file and run:
docker compose up -d
The web UI will be available at http://localhost:8080. This compose file does not create a default user, so initialize the database and create an admin account as shown in the next section; never keep weak defaults such as airflow / airflow around in production.
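The compose file above leaves AIRFLOW__CORE__FERNET_KEY empty, which means connection passwords are stored unencrypted in the metadata database. For anything beyond local testing, generate a key with this one-liner (it needs the cryptography package, which ships in the Airflow image) and set it in the environment block:

python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"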
Running Airflow Commands
To execute any Airflow CLI command inside a container, use docker compose exec. For example, after initializing the database (see below), create an admin user:
docker compose exec airflow-webserver airflow users create \
--username admin \
--firstname Admin \
--lastname User \
--role Admin \
--email admin@example.com
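The same pattern works for any other CLI command; for example, to list the DAGs the scheduler has parsed:

docker compose exec airflow-webserver airflow dags list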
Initialize the database before running any other command (first-time only; on Airflow 2.7+, airflow db migrate is the preferred, non-deprecated equivalent):
docker compose exec airflow-webserver airflow db init
If the webserver container is still restarting because the database is empty, run the command in a one-off container instead: docker compose run --rm airflow-webserver airflow db init
Customizing Images
If you need extra Python packages, create a custom Dockerfile that extends the official image:
# Dockerfile
FROM apache/airflow:2.9.0-python3.11

# System packages must be installed as root; clean up apt lists to keep the image small
USER root
RUN apt-get update \
    && apt-get install -y --no-install-recommends git \
    && rm -rf /var/lib/apt/lists/*
USER airflow

# Install Python dependencies as the airflow user
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
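A hypothetical requirements.txt for this image might look like the following; pin versions compatible with your Airflow release (ideally using the official Airflow constraints files):

# requirements.txt (example only; pin versions for your setup)
apache-airflow-providers-amazon
pandas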
Then reference the image in docker-compose.yaml:
  airflow-webserver:
    build: .
    # other settings unchanged
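After adding the build directive, rebuild the image and recreate the containers:

docker compose build
docker compose up -d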
FAQ
- How do I enable persistent logs? Mount a host directory (as shown) or configure a remote log backend (e.g., S3, GCS).
- Can I use a different executor? Yes, replace CeleryExecutor with LocalExecutor or KubernetesExecutor and adjust the services accordingly; see the sketch below.
- Where are the environment variables documented? See the Configuration Reference.
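For example, switching to LocalExecutor means tasks run inside the scheduler container, so Redis, the Celery settings, and any worker service can be dropped. A minimal sketch of the changed environment block:

    environment: &airflow_webserver_environment
      AIRFLOW__CORE__EXECUTOR: LocalExecutor
      AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
      # Remove AIRFLOW__CELERY__BROKER_URL and AIRFLOW__CELERY__RESULT_BACKEND,
      # and delete the redis service from the compose file.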