- Apache Airflow
Apache Airflow® provides many plug-and-play operators that are ready to execute your tasks on Google Cloud Platform, Amazon Web Services, Microsoft Azure, and many other third-party services.
- What is Airflow®? — Airflow 3.1.3 Documentation
Apache Airflow® is an open-source platform for developing, scheduling, and monitoring batch-oriented workflows. Airflow’s extensible Python framework enables you to build workflows connecting with virtually any technology.
- Apache Airflow® 3 is Generally Available! | Apache Airflow
We announced our intent to focus on Apache Airflow® 3.0 as the next big milestone for the Airflow project at the Airflow Summit in September 2024. We are delighted to announce that Airflow 3.0 is now released!
- Documentation - Apache Airflow
Airflow has an official Helm Chart that will help you set up your own Airflow on a cloud or on-prem Kubernetes environment and leverage its scalable nature to support a large group of users.
- Installation of Airflow® — Airflow 3.1.3 Documentation
Airflow consists of many components, often distributed among many physical or virtual machines; therefore, installation of Airflow might be quite complex, depending on the options you choose.
- Quick Start — Airflow 3.1.3 Documentation - Apache Airflow
Airflow requires a home directory and uses ~/airflow by default, but you can set a different location if you prefer. The AIRFLOW_HOME environment variable is used to inform Airflow of the desired location (see the AIRFLOW_HOME sketch after this list).
- Tutorials — Airflow 3.1.3 Documentation
Once you have Airflow up and running with the Quick Start, these tutorials are a great way to get a sense for how Airflow works.
- Architecture Overview — Airflow 3.1.3 Documentation
Airflow is a platform that lets you build and run workflows. A workflow is represented as a Dag (a Directed Acyclic Graph) and contains individual pieces of work called Tasks, arranged with dependencies and data flows taken into account (see the Dag sketch after this list).
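
To illustrate the AIRFLOW_HOME mechanism mentioned in the Quick Start snippet above, here is a minimal Python sketch. The ~/my-airflow path and the printed option are purely illustrative choices; in a shell you would more commonly set the variable with export before running the airflow CLI.

```python
# Sketch: pointing Airflow at a non-default home directory.
# AIRFLOW_HOME must be set before the first airflow import reads its config;
# the path below is an arbitrary example, not a recommendation.
import os

os.environ["AIRFLOW_HOME"] = os.path.expanduser("~/my-airflow")

from airflow.configuration import conf  # reads AIRFLOW_HOME on first import

# Derived paths (e.g. the DAGs folder) now resolve under ~/my-airflow.
print(conf.get("core", "dags_folder"))
```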
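
And to make the Dag/Task description from the Architecture Overview snippet concrete, here is a minimal sketch of a two-task Dag wired with a dependency and a data flow. The names example_pipeline, extract, and load are invented for illustration, and the imports shown are the Airflow 2.x TaskFlow style (Airflow 2.4+); Airflow 3 exposes the same authoring decorators under airflow.sdk.

```python
# Sketch only: a two-task Dag showing dependencies and data flow.
# Imports follow the Airflow 2.x TaskFlow style; under Airflow 3 the same
# decorators are available via `from airflow.sdk import dag, task`.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule=None, start_date=datetime(2024, 1, 1), catchup=False)
def example_pipeline():
    @task
    def extract() -> dict:
        # Pretend to pull some data; the return value is passed on via XCom.
        return {"rows": 3}

    @task
    def load(payload: dict) -> None:
        print(f"loaded {payload['rows']} rows")

    # Calling load on extract's output both orders the tasks and wires the
    # data flow: extract runs first, load consumes its result.
    load(extract())


example_pipeline()
```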