Apache Airflow Documentation

Configuring Airflow

Overview

Airflow is highly configurable. Runtime settings live in the airflow.cfg file and can be overridden with environment variables or, for sensitive values, supplied by a secret backend. This guide walks through common configuration patterns and best practices for production deployments.

airflow.cfg file

The default location of airflow.cfg is $AIRFLOW_HOME/airflow.cfg. Airflow writes a default copy of the file to that location the first time it runs. Below is a summary of the most important sections.

Section     Key                Default                Description
core        executor           SequentialExecutor     Which executor to use (e.g. LocalExecutor, CeleryExecutor, KubernetesExecutor)
core        sql_alchemy_conn   sqlite:///airflow.db   SQLAlchemy connection string for the metadata database
webserver   base_url           http://localhost:8080  Base URL used in generated links (e.g. email alerts)
scheduler   max_threads        2                      Maximum number of scheduler threads
logging     remote_logging     False                  Enable remote log storage (e.g. S3, GCS, Elasticsearch)
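Because airflow.cfg uses standard INI syntax, Python's configparser can read it directly. A minimal sketch with an illustrative two-line config (the values here are examples, not a real deployment file):

```python
import configparser

# airflow.cfg is plain INI: [section] headers with key = value pairs.
cfg_text = """\
[core]
executor = LocalExecutor
"""

parser = configparser.ConfigParser()
parser.read_string(cfg_text)

# Look up a key the same way Airflow addresses it: by section and key name.
print(parser.get("core", "executor"))  # LocalExecutor
```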

Environment Variables

Any configuration key can be overridden by an environment variable prefixed with AIRFLOW__. The naming convention is:

AIRFLOW__{SECTION}__{KEY}=value

Example for setting the executor to CeleryExecutor:

export AIRFLOW__CORE__EXECUTOR=CeleryExecutor
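The override order can be sketched in a few lines of Python: if the AIRFLOW__{SECTION}__{KEY} variable is set, its value wins over whatever the file contains. The resolve helper below is illustrative, not part of Airflow's API:

```python
import os

def resolve(section, key, file_value):
    """Return the env-var override if set, otherwise the value from airflow.cfg."""
    env_name = f"AIRFLOW__{section.upper()}__{key.upper()}"
    return os.environ.get(env_name, file_value)

# Simulate the export from the example above.
os.environ["AIRFLOW__CORE__EXECUTOR"] = "CeleryExecutor"

# The env var takes precedence over the file's SequentialExecutor default.
print(resolve("core", "executor", "SequentialExecutor"))  # CeleryExecutor

# With no matching env var, the file value is used unchanged.
print(resolve("webserver", "base_url", "http://localhost:8080"))  # http://localhost:8080
```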

Secret Backends

Sensitive configuration (passwords, tokens) should be stored in a secret backend such as AWS Secrets Manager, Google Secret Manager, or HashiCorp Vault. Enable a backend by adding it to the [secrets] section:

[secrets]
backend = airflow.providers.amazon.aws.secrets.manager.AwsSecretsManagerBackend
backend_kwargs = {"connections_prefix": "airflow/connections", "variables_prefix": "airflow/variables"}
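With the backend_kwargs above, the backend builds the secret name by joining the configured prefix and the connection or variable name, using "/" as the default separator. A minimal sketch of that lookup path, with postgres_default as a hypothetical connection id (secret_id is illustrative, not the backend's API):

```python
def secret_id(prefix, name, sep="/"):
    """Join the configured prefix and the connection/variable name into a secret id."""
    return f"{prefix}{sep}{name}"

# A connection id is resolved under connections_prefix...
print(secret_id("airflow/connections", "postgres_default"))
# airflow/connections/postgres_default

# ...and a Variable under variables_prefix.
print(secret_id("airflow/variables", "report_bucket"))
# airflow/variables/report_bucket
```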

Example Configuration

A minimal production airflow.cfg for a Celery deployment. Note that airflow.cfg does not support Jinja templating; for sensitive options, use the _cmd variant of the key, which runs a shell command and uses its standard output as the value:

[core]
executor = CeleryExecutor
sql_alchemy_conn_cmd = echo "postgresql+psycopg2://airflow:${AIRFLOW_DB_PASSWORD}@db:5432/airflow"
fernet_key_cmd = echo "${AIRFLOW_FERNET_KEY}"

[webserver]
base_url = https://airflow.example.com
workers = 4

[celery]
broker_url_cmd = echo "redis://:${REDIS_PASSWORD}@redis:6379/0"
result_backend_cmd = echo "db+postgresql://airflow:${AIRFLOW_DB_PASSWORD}@db:5432/airflow"

[secrets]
backend = airflow.providers.amazon.aws.secrets.manager.AwsSecretsManagerBackend
backend_kwargs = {"connections_prefix": "airflow/connections", "variables_prefix": "airflow/variables"}
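For the sensitive options referenced above (sql_alchemy_conn, fernet_key, broker_url, result_backend), Airflow accepts a _cmd variant of the key that runs a shell command and uses its standard output as the value, so secrets never sit in the file itself. A minimal sketch of that resolution (resolve_cmd is illustrative, not Airflow's API):

```python
import subprocess

def resolve_cmd(command):
    """Run a *_cmd option's shell command and use its stripped stdout as the config value."""
    return subprocess.check_output(command, shell=True, text=True).strip()

# With AIRFLOW_DB_PASSWORD exported in the environment, a command such as
#   echo "postgresql+psycopg2://airflow:${AIRFLOW_DB_PASSWORD}@db:5432/airflow"
# expands the password at read time instead of storing it on disk.
print(resolve_cmd('echo "hello"'))  # hello
```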

After editing airflow.cfg, restart the webserver and scheduler to apply changes.