Data Glossary 🧠
What is Apache Airflow?
Airflow is a data orchestrator and the first tool to make task scheduling with Python popular. It was originally created by Maxime Beauchemin while he was working at Airbnb.
Airflow lets you programmatically author, schedule, and monitor workflows. It follows an imperative scheduling paradigm: how and when a DAG runs is defined within the Airflow job code itself. Airflow describes this as "workflows as code", with these main characteristics:
- Dynamic: Airflow pipelines are configured as Python code, allowing for dynamic pipeline generation.
- Extensible: The Airflow framework contains operators to connect with numerous technologies. All Airflow components are extensible to easily adjust to your environment.
- Flexible: Workflow parameterization is built-in leveraging the Jinja Templating engine.
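The "pipelines as Python code" idea can be illustrated without installing Airflow. The toy sketch below (not Airflow itself; `Task` and `run_order` are hypothetical stand-ins) mimics the `a >> b` dependency-chaining syntax that real Airflow operators use, and resolves a valid execution order with the standard library:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

class Task:
    """Toy stand-in for an Airflow operator: a named node in a DAG."""
    def __init__(self, task_id):
        self.task_id = task_id
        self.upstream = set()

    def __rshift__(self, other):
        # Mimic Airflow's `a >> b` syntax: `other` runs after `self`.
        other.upstream.add(self)
        return other

def run_order(tasks):
    """Return one valid execution order that respects all dependencies."""
    graph = {t: t.upstream for t in tasks}
    return [t.task_id for t in TopologicalSorter(graph).static_order()]

# Because the DAG is plain Python, it can be generated dynamically
# (loops, config files, etc.) rather than written out statically.
extract = Task("extract")
transform = Task("transform")
load = Task("load")
extract >> transform >> load  # same chaining style as a real Airflow DAG

print(run_order([extract, transform, load]))  # ['extract', 'transform', 'load']
```

In real Airflow the same chaining is written between operator instances inside a `DAG`, and the scheduler (rather than a `run_order` helper) decides when each task executes.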