OpsCanary

Mastering Data Pipelines: Best Practices for Airflow

5 min read · Apache Airflow Docs · Apr 23, 2026
Practitioner: hands-on experience recommended

Data pipelines are essential for managing the flow of information in any organization. They automate the movement and transformation of data, ensuring that the right data is available at the right time. However, poorly designed pipelines can lead to inefficiencies and failures. Best practices in Airflow help you create robust, scalable, and maintainable data pipelines.

Airflow defines workflows as Directed Acyclic Graphs (DAGs) written in Python. With the Celery or Kubernetes executors, each task in a DAG can run on a different machine, so avoid storing files or configuration in the local filesystem: a downstream task may not be able to read what an upstream task wrote. Parameters shared across tasks, such as connection_id or gcp_conn_id, belong in default_args so they are defined once and inherited consistently. Keep in mind that the scheduler repeatedly parses your DAG files, at a minimum interval set by min_file_process_interval, and executes any code outside your operators' execute methods on every parse. Understanding how these components interact is critical to writing efficient DAGs.
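As a minimal sketch of the default_args idea (pure Python, no Airflow required; the `gcp_conn_id` value and the `build_task` helper are illustrative, not Airflow API): each task starts from the DAG-level defaults and applies only its own overrides, which is how defining shared parameters once prevents copy-paste mistakes.

```python
# Hypothetical model of how default_args propagate to tasks.
default_args = {
    "retries": 2,
    "gcp_conn_id": "my_gcp_connection",  # illustrative connection id
}

def build_task(task_id, **overrides):
    # Merge DAG-level defaults with per-task overrides,
    # mirroring how Airflow applies default_args to operator kwargs.
    args = {**default_args, **overrides}
    return {"task_id": task_id, **args}

extract = build_task("extract")
load = build_task("load", retries=5)

print(extract["gcp_conn_id"])  # my_gcp_connection (inherited, not repeated)
print(load["retries"])         # 5 (per-task override wins)
```

Changing the connection id in one place now updates every task that inherits it.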

In production, code structure matters. Avoid expensive imports and computations at the top level of your DAG files: top-level code runs on every scheduler parse, not just when tasks execute. Import heavy libraries inside your task functions instead. Additionally, never use Python's datetime.now() inside a task for critical computations; it returns a different value on every attempt, so retries and backfills yield inconsistent results. These nuances can significantly impact the reliability and efficiency of your data pipelines.
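A short sketch of why wall-clock time breaks reproducibility (stdlib only; in Airflow the run's fixed logical date comes from the task context, modeled here as a constant, and `partition_for` is a hypothetical helper):

```python
from datetime import datetime, timezone

def partition_for(logical_date: datetime) -> str:
    # Keyed off the run's fixed logical date, the result is the same
    # no matter when, or how many times, the task actually executes.
    return logical_date.strftime("%Y-%m-%d")

# Stand-in for the logical date Airflow passes via the task context.
logical_date = datetime(2026, 4, 23, tzinfo=timezone.utc)

first_attempt = partition_for(logical_date)
retry_attempt = partition_for(logical_date)
print(first_attempt == retry_attempt)  # True: stable across retries
```

Had the function called datetime.now() instead, a retry at midnight could write to a different partition than the original attempt.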

Key takeaways

  • Define `connection_id` and `gcp_conn_id` in `default_args` to avoid mistakes.
  • Avoid storing files in the local filesystem; tasks may run on different servers.
  • Import heavy libraries within task functions, not at the top level of DAG files.
  • Use XCom for small message communication between tasks effectively.
  • Never use `datetime.now()` for critical computations inside tasks.
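The XCom takeaway can be sketched as follows (a pure-Python model of the concept, not the Airflow API; `xcom_push`, `xcom_pull`, and the dict-backed store are illustrative stand-ins for Airflow's metadata database):

```python
import json

xcom_store = {}  # stands in for Airflow's metadata database

def xcom_push(task_id, value):
    # XCom values must be small and serializable; large datasets belong in
    # external storage (S3, GCS, a database), with only a reference passed here.
    xcom_store[task_id] = json.dumps(value)

def xcom_pull(task_id):
    return json.loads(xcom_store[task_id])

# An "extract" task publishes a small result for downstream tasks to read.
xcom_push("extract", {"row_count": 3})
print(xcom_pull("extract")["row_count"])  # 3
```

The serialization step is the point: anything that cannot round-trip through a small serialized message should not travel via XCom.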

Why it matters

Implementing these best practices can drastically reduce the risk of failures and inefficiencies in your data pipelines, leading to more reliable data processing and better decision-making in your organization.

Code examples

Python
```
import time

import pendulum
from airflow.sdk import DAG, task


def expensive_api_call():
    print("Hello from Airflow!")
    time.sleep(1000)


# Bad: this runs on every scheduler parse of the DAG file,
# not just when the task executes
my_expensive_response = expensive_api_call()

with DAG(
    dag_id="example_python_operator",
    schedule=None,
    start_date=pendulum.datetime(2021, 1, 1, tz="UTC"),
    catchup=False,
    tags=["example"],
) as dag:

    @task()
    def print_expensive_api_call():
        print(my_expensive_response)
```
Python
```
import time

import pendulum
from airflow.sdk import DAG, task


def expensive_api_call():
    time.sleep(1000)
    return "Hello from Airflow!"


with DAG(
    dag_id="example_python_operator",
    schedule=None,
    start_date=pendulum.datetime(2021, 1, 1, tz="UTC"),
    catchup=False,
    tags=["example"],
) as dag:

    @task()
    def print_expensive_api_call():
        # Good: the expensive call happens only at task execution time
        my_expensive_response = expensive_api_call()
        print(my_expensive_response)
```
Python
```
import pendulum
from airflow.sdk import DAG
from airflow.providers.standard.operators.python import PythonOperator


def get_task_id():
    return "print_array_task"  # <- is that code going to be executed?


def get_array():
    return [1, 2, 3]  # <- is that code going to be executed?


with DAG(
    dag_id="example_python_operator",
    schedule=None,
    start_date=pendulum.datetime(2021, 1, 1, tz="UTC"),
    catchup=False,
    tags=["example"],
) as dag:
    # get_task_id() runs at parse time; get_array runs only at execution time
    operator = PythonOperator(
        task_id=get_task_id(),
        python_callable=get_array,
    )
```

When NOT to use this

The official docs don't call out specific anti-patterns here. Use your judgment based on your scale and requirements.

Want the complete reference?

Read official docs
