Apache Airflow Documentation - Stable
Welcome to the stable documentation for Apache Airflow. This documentation covers the latest stable release of Airflow, providing comprehensive guides, concepts, and API references for building and managing your data pipelines.
Getting Started
This section provides a quick overview of Airflow and how to get it up and running.
- Installation: Learn how to install Airflow on your system.
- Core Concepts: Understand the fundamental building blocks of Airflow, such as DAGs, tasks, operators, and executors (a minimal DAG sketch follows this list).
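To make these concepts concrete, here is a minimal sketch of a DAG with a single task, assuming Airflow 2.x; the dag_id, task_id, and command are hypothetical placeholders, not values taken from the documentation above.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_hello",            # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ keyword; older 2.x uses schedule_interval
    catchup=False,
) as dag:
    # An operator instance becomes a task when assigned to a DAG.
    say_hello = BashOperator(
        task_id="say_hello",
        bash_command="echo hello",
    )
```

Placing this file in the DAGs folder is enough for the scheduler to pick it up; the executor then decides where and how the task runs.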
Key Features
Airflow offers a rich set of features to empower your data orchestration needs.
- Dynamic Pipeline Generation: Pipelines are defined as Python code, so tasks can be generated programmatically (see the sketch after this list).
- Rich User Interface: A beautiful and intuitive web UI to monitor and manage your workflows.
- Extensibility: A vast ecosystem of operators and hooks to integrate with various services.
- Scalability: Executors such as Celery and Kubernetes allow Airflow to scale out horizontally and handle large, complex workflows.
- Robustness: Features like retries, SLAs, and alerting ensure your pipelines are reliable.
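The sketch below illustrates dynamic pipeline generation and retries together, assuming Airflow 2.x; the dag_id and the list of table names are hypothetical, chosen only to show tasks being created in an ordinary Python loop.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "retries": 2,                         # retry a failed task twice
    "retry_delay": timedelta(minutes=5),  # wait five minutes between attempts
}

with DAG(
    dag_id="example_dynamic",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    previous = None
    for table in ["users", "orders", "events"]:  # hypothetical table list
        load = BashOperator(
            task_id=f"load_{table}",
            bash_command=f"echo loading {table}",
        )
        if previous:
            previous >> load                     # chain the generated tasks sequentially
        previous = load
```

Because the DAG file is plain Python, the same loop could be driven by a config file or an API call, which is what makes the generated pipeline "dynamic".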
How-to Guides
Dive into practical guides on how to accomplish specific tasks with Airflow.
- Managing Connections: Learn how to configure and manage connections to external services.
- Writing Custom Operators: Discover how to create your own operators for specific use cases.
- Using Variables: Understand how to store and retrieve configuration values as Variables; a combined sketch covering connections, custom operators, and variables follows this list.
- Authoring DAGs: A comprehensive guide to defining your workflows in Python.
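As a hedged sketch of the first three topics above, the hypothetical operator below subclasses BaseOperator, looks up a managed connection through BaseHook, and reads a configuration value from Variable; the conn_id, variable key, and class name are illustrative, not part of Airflow itself.

```python
from airflow.hooks.base import BaseHook
from airflow.models import BaseOperator, Variable


class GreetServiceOperator(BaseOperator):
    """Hypothetical operator that greets a host defined in an Airflow connection."""

    def __init__(self, conn_id: str, **kwargs):
        super().__init__(**kwargs)
        self.conn_id = conn_id

    def execute(self, context):
        # Resolve the connection configured in the metadata database (or environment).
        conn = BaseHook.get_connection(self.conn_id)
        # Read a stored Variable, falling back to a default if it is not set.
        greeting = Variable.get("greeting", default_var="hello")
        self.log.info("%s, %s:%s", greeting, conn.host, conn.port)
```

Connections referenced this way can be created in the web UI, with the `airflow connections add` CLI command, or through environment variables of the form `AIRFLOW_CONN_<CONN_ID>`.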
Note: For the very latest features and changes, refer to the development documentation; for production environments, the stable release is recommended.
API Reference
Explore the detailed API reference for all modules and classes within Airflow.
Community and Support
Join the vibrant Airflow community for help, discussions, and contributions.