- Can someone explain in simple terms to me what a directed acyclic graph ...
A DAG is a graph where everything flows in the same direction and no node can reference back to itself. Think of ancestry trees; they are actually DAGs. All DAGs have: nodes (places to store data), directed edges (that all point in one direction), an ancestral node (a node without parents), and leaves (nodes that have no children). DAGs are different ...
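The components listed above can be sketched in plain Python, with nothing Airflow-specific: a dict mapping each node to its children, plus the standard library's `graphlib` to produce an order in which every node comes after all of its ancestors (the node names here are made up for illustration).

```python
from graphlib import TopologicalSorter

# A DAG as an adjacency mapping: each key lists the nodes it points to.
dag = {
    "root": ["a", "b"],   # ancestral node: nothing points to it
    "a": ["leaf1"],
    "b": ["leaf1", "leaf2"],
    "leaf1": [],          # leaves: no children
    "leaf2": [],
}

# graphlib expects each node mapped to its *predecessors*, so invert the edges.
preds = {n: set() for n in dag}
for node, children in dag.items():
    for child in children:
        preds[child].add(node)

# static_order() raises graphlib.CycleError if the graph is not acyclic.
order = list(TopologicalSorter(preds).static_order())
print(order)  # "root" first; every node appears after all of its ancestors
```

If any node could "reference back to itself" (directly or through a longer loop), `static_order()` would raise `CycleError` instead of returning an ordering, which is exactly what the "acyclic" part of the name guarantees.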
- How to Trigger a DAG on the success of another DAG in Airflow using ...
I have a Python DAG Parent Job and a DAG Child Job. The tasks in the Child Job should be triggered on the successful completion of the Parent Job tasks, which run daily. How can I add external job t...
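One common way to do this is a `TriggerDagRunOperator` as the last task of the parent DAG: because the default trigger rule requires all upstreams to succeed, the child is only triggered when the parent's work succeeded. The sketch below is a DAG-file configuration fragment assuming Airflow 2.x; the DAG ids `parent_dag`/`child_dag` and the bash command are placeholders.

```python
# Sketch, assuming Airflow 2.x; "parent_dag" and "child_dag" are hypothetical ids.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.trigger_dagrun import TriggerDagRunOperator

with DAG("parent_dag",
         start_date=datetime(2024, 1, 1),
         schedule_interval="@daily") as parent:
    work = BashOperator(task_id="work", bash_command="echo parent work")

    trigger_child = TriggerDagRunOperator(
        task_id="trigger_child",
        trigger_dag_id="child_dag",  # default trigger rule: runs only if upstream succeeded
    )

    work >> trigger_child
```

The inverse approach also exists: put an `ExternalTaskSensor` at the start of the child DAG so it waits for the parent's task to reach a success state; the operator shown here is usually simpler when the parent "owns" the relationship.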
- python - How to Run a Simple Airflow DAG - Stack Overflow
I am totally new to Airflow. I would like to run a simple DAG at a specified date. I'm struggling to understand the difference between the start date, the execution date, and backfilling. And what is the com...
- airflow - Is there a benefit to use the with DAG(...) as dag clause ...
@jonrsharpe I see that the dag is pushed to a DAG context manager, which seems to be some kind of collection. Does that mean we are supposed to use the "with ... as ..." clause when creating a dag?
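The two styles are equivalent in what they produce; the context manager just saves you from passing `dag=` to every operator, because tasks instantiated inside the `with` block are attached to that DAG automatically. A minimal sketch, assuming Airflow 2.x (DAG ids and commands are placeholders):

```python
# Sketch, assuming Airflow 2.x: two equivalent ways to attach tasks to a DAG.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Style 1: pass dag= explicitly to every operator.
dag1 = DAG("explicit_dag", start_date=datetime(2024, 1, 1), schedule_interval=None)
t1 = BashOperator(task_id="t1", bash_command="echo hi", dag=dag1)

# Style 2: the context manager; tasks created inside pick up the DAG implicitly.
with DAG("context_dag", start_date=datetime(2024, 1, 1), schedule_interval=None) as dag2:
    t2 = BashOperator(task_id="t2", bash_command="echo hi")  # dag=dag2 is implied
```

So there is no behavioral benefit, only less repetition and less risk of forgetting `dag=` on one operator (which would leave that task orphaned from the DAG).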
- DAG marked as success if one task fails, because of trigger ... - airflow
However, since end is the last task and succeeds, the DAG is always marked as SUCCESS. How can I configure my DAG so that if one of the tasks fails, the whole DAG is marked as FAILED? Example to reproduce: import datetime; from airflow import DAG; from airflow.operators.bash_operator import BashOperator; from airflow.utils import trigger_rule; dag ...
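A DAG run's state is derived from its leaf tasks, so a final `all_done` task that always succeeds masks upstream failures. One common fix is a "watcher" leaf task with `trigger_rule=ONE_FAILED` that raises whenever anything upstream failed. A sketch assuming Airflow 2.x (ids, commands, and the deliberately failing task are illustrative):

```python
# Sketch, assuming Airflow 2.x: a "watcher" leaf fails the whole run
# if any task failed, even though "end" runs regardless (all_done).
from datetime import datetime

from airflow import DAG
from airflow.exceptions import AirflowException
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator
from airflow.utils.trigger_rule import TriggerRule

def _fail(**_):
    raise AirflowException("at least one upstream task failed")

with DAG("watched_dag", start_date=datetime(2024, 1, 1), schedule_interval=None) as dag:
    work = BashOperator(task_id="work", bash_command="exit 1")  # fails on purpose
    end = BashOperator(task_id="end", bash_command="echo cleanup",
                       trigger_rule=TriggerRule.ALL_DONE)       # always runs
    watcher = PythonOperator(task_id="watcher", python_callable=_fail,
                             trigger_rule=TriggerRule.ONE_FAILED)

    work >> end
    work >> watcher  # watcher is a leaf: if it runs (and fails), the run is FAILED
```

Because `watcher` is a leaf task, its failure propagates to the DAG run state, while the `end` cleanup task still runs on every outcome.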
- python - How to control the parallelism or concurrency of an Airflow . . .
Here's an expanded list of configuration options that are available since Airflow v1.10.2. Some can be set on a per-DAG or per-operator basis, but they fall back to the installation-wide defaults when they are not specified.
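To make the per-DAG versus installation-wide distinction concrete, here is a configuration sketch assuming Airflow 2.2+ parameter names (earlier versions used `concurrency` instead of `max_active_tasks`); the values are examples, not recommendations:

```python
# Sketch, assuming Airflow 2.2+: per-DAG concurrency knobs.
from datetime import datetime

from airflow import DAG

dag = DAG(
    "throttled_dag",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    max_active_runs=1,    # at most one run of this DAG at a time
    max_active_tasks=4,   # at most 4 task instances of this DAG run concurrently
)

# Installation-wide fallbacks live in airflow.cfg, e.g. under [core]:
#   parallelism                 - max task instances across the whole installation
#   max_active_tasks_per_dag    - default for DAGs that do not set max_active_tasks
#   max_active_runs_per_dag     - default for DAGs that do not set max_active_runs
```

Per-operator limits (e.g. `pool` or `max_active_tis_per_dag` on a task) narrow things further; when none of these are set, the `airflow.cfg` defaults apply.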
- how do I use the --conf option in airflow - Stack Overflow
I am trying to run an Airflow DAG and need to pass some parameters to the tasks. How do I read the JSON string passed as the --conf parameter in the command-line trigger_dag command, in the Python DAG file?
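The JSON passed on the command line is parsed into a dict and exposed as `dag_run.conf`, reachable from a task's context (or as `{{ dag_run.conf["key"] }}` in templated fields). A DAG-file sketch assuming Airflow 2.x, where the DAG id and the `"path"` key are placeholders:

```python
# Sketch, assuming Airflow 2.x. Triggered with e.g.:
#   airflow dags trigger --conf '{"path": "/tmp/in.csv"}' conf_demo
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def _use_conf(**context):
    # dag_run.conf is the dict parsed from the --conf JSON string
    # (empty when the run was scheduled rather than triggered manually).
    conf = context["dag_run"].conf or {}
    print("path =", conf.get("path", "<not provided>"))

with DAG("conf_demo", start_date=datetime(2024, 1, 1), schedule_interval=None) as dag:
    PythonOperator(task_id="use_conf", python_callable=_use_conf)
```

Guarding with `or {}` and `.get()` keeps the task from crashing on scheduled runs, where no `--conf` payload exists.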
- Airflow DAG Import errors even after clearing import_error table
I have these DAG Import Errors on my Airflow UI. I am working on a test server, so I have a free hand. I have deleted all records from the task_instance, dag_run, import_error, serialized_dag, and dag Postgres metadata tables; still, the records reappear in the dag, import_error, and serialized_dag tables.
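The records reappear because the scheduler continuously re-parses every file in the DAGs folder and repopulates those metadata tables; deleting rows never fixes the underlying import failure. A troubleshooting sketch, assuming Airflow 2.2+ for the CLI subcommand and with a hypothetical DAG file path:

```shell
# The scheduler rewrites dag / import_error / serialized_dag on every parse,
# so inspect and fix the failing DAG file instead of clearing tables.
airflow dags list-import-errors          # show current import errors (Airflow 2.2+)

# Importing the file directly surfaces the full Python traceback:
python /path/to/dags/my_dag.py           # path is a placeholder for your DAG file
```

Once the file imports cleanly, the scheduler's next parse clears the error from the UI on its own.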