
Installing and Setting Up Apache Airflow

Introduction 

Apache Airflow is a powerful tool for orchestrating and managing workflows, allowing users to automate complex tasks and monitor their execution efficiently. Whether you’re a data engineer, developer, or analyst, setting up Apache Airflow is the first step toward creating and managing workflows effectively. In this blog, we’ll walk you through the installation and setup process of Apache Airflow, ensuring that you have everything configured for success.


Apache Airflow enables you to:

  • Author workflows as code: Use Python to define workflows, making them modular and scalable.

  • Visualize pipelines: Its web-based UI provides a clear view of task dependencies and execution status.

  • Handle complex dependencies: Airflow is ideal for workflows requiring sophisticated scheduling and task orchestration.

  • Scale effortlessly: Designed to handle workflows of all sizes, from a single task to enterprise-scale pipelines.


System Requirements and Prerequisites

Before diving into the installation, let’s make sure your system meets the necessary hardware and software requirements.

Hardware Requirements

  • RAM: Minimum 4 GB (8 GB or more recommended for larger workflows).

  • CPU: At least 2 cores (multi-core CPUs recommended for distributed execution).

  • Storage: A minimum of 10 GB of free disk space for logs and metadata.


Software Requirements

  • Operating System: Linux or Windows Subsystem for Linux (WSL).

  • Python Version: Python 3.8 or later, matching the Airflow release you are installing (Airflow 2.10.3, used below, supports Python 3.8 through 3.12).

  • Pip Version: Minimum version 20.2.


Step-by-Step Installation Guide

Step 1: Install Pip

The first step is to install pip, the Python package manager. It helps you install and manage Python packages required for Airflow.

sudo apt install python3-pip
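
If you want to confirm that the prerequisites above are met, you can check the installed Python and pip versions and upgrade pip if it is older than 20.2 (depending on your setup, the upgrade may need --user or may be better done inside the virtual environment created below):

python3 --version
pip3 --version
# Upgrade pip if it is older than 20.2
pip3 install --upgrade pip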

Step 2: Install Virtual Environment

To isolate Airflow’s dependencies and avoid conflicts with other Python projects, install virtualenv.

sudo pip3 install virtualenv

This command installs virtualenv, which allows you to create an isolated environment for Airflow.


Step 3: Create a Directory for Airflow

Organize your project by creating a dedicated folder for Airflow-related files.

cd Desktop/
mkdir Airflow
cd Airflow/

This step keeps all Airflow files, configurations, and logs in one place for easy management, provided you point Airflow at this folder (see the note below).
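
One caveat: by default, Airflow writes its configuration file, logs, and metadata database to ~/airflow rather than to the folder you run it from. If you want everything kept in this directory as described, point the AIRFLOW_HOME environment variable at it before running any airflow commands. The export below only lasts for the current shell session; add it to ~/.bashrc to make it permanent.

# Run this from inside the Airflow directory created above
export AIRFLOW_HOME="$(pwd)"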


Step 4: Set Up a Virtual Environment

Create a virtual environment to isolate Airflow’s dependencies.

virtualenv airflow_env

This command creates a new virtual environment named airflow_env. The environment ensures that Airflow dependencies don’t interfere with global Python packages.



Step 5: Activate the Virtual Environment

Activate the virtual environment to install Airflow within it.

source airflow_env/bin/activate

Once activated, your terminal prompt will reflect the environment name (airflow_env). All Python packages installed after this step will be contained within the virtual environment.


Step 6: Install Apache Airflow

After activating your virtual environment, the next step is to install Apache Airflow. Use the following command:

pip install "apache-airflow[celery]==2.10.3" --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.10.3/constraints-3.8.txt"
  1. pip install "apache-airflow[celery]==2.10.3":

    • This command installs Airflow version 2.10.3 with support for the celery executor.

    • The celery executor allows distributed task execution, enabling Airflow to handle larger workloads by running tasks across multiple workers.


  2. --constraint:

    • This ensures compatibility between Airflow and its dependencies.

    • The provided constraint file (constraints-3.8.txt) contains a list of specific library versions that work well with Python 3.8 and Airflow 2.10.3.

    • Using constraints avoids dependency conflicts during installation.

This step ensures that Airflow is installed correctly along with all its dependencies. Adding celery support prepares your setup for distributed task execution, which is crucial for scaling workflows in production environments.
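
Note that the constraint file must match the Python version you are actually running; constraints-3.8.txt is only correct for Python 3.8. A common pattern, in line with the official installation docs, is to build the constraint URL from the active interpreter version:

AIRFLOW_VERSION=2.10.3
# Extract "major.minor" from the active interpreter, e.g. "3.10"
PYTHON_VERSION="$(python3 -c 'import sys; print(f"{sys.version_info.major}.{sys.version_info.minor}")')"
CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
pip install "apache-airflow[celery]==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"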





Step 7: Initialize the Airflow Database

Airflow requires a metadata database to store task states, logs, and configurations. Use the following command to initialize this database:

airflow db init

This command sets up the default SQLite database in your Airflow directory. For production environments, consider switching to PostgreSQL or MySQL for better performance and scalability.
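
If you do switch to PostgreSQL, the metadata connection is configured through the sql_alchemy_conn option in the [database] section of airflow.cfg, or via the matching environment variable. Here is a sketch assuming a local PostgreSQL instance with a placeholder user, password, and database name (you will also need a driver such as psycopg2-binary installed in the virtual environment, and you should re-run the database initialization afterwards):

# Placeholder credentials and database name - replace with your own
export AIRFLOW__DATABASE__SQL_ALCHEMY_CONN="postgresql+psycopg2://airflow_user:airflow_pass@localhost:5432/airflow"
airflow db init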


Step 8: Create a Directory for DAGs

DAGs (Directed Acyclic Graphs) define your workflows in Airflow. Create a folder to store all your DAG files.

mkdir dags
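
By default, Airflow scans $AIRFLOW_HOME/dags for DAG definitions, so this folder is picked up automatically as long as AIRFLOW_HOME points at the current directory (as set earlier). As a quick sanity check, you could drop a minimal test DAG into it; the file name, dag_id, and schedule below are purely illustrative:

cat > dags/hello_airflow.py <<'EOF'
# Minimal test DAG: runs a single Bash task once a day
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="hello_airflow",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    BashOperator(task_id="say_hello", bash_command="echo 'Hello, Airflow!'")
EOF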

Step 9: Create an Admin User

To access the Airflow web interface, you need an admin user. Create one using the following command:

airflow users create --username admin --password admin --firstname admin --lastname admin --role Admin --email admin@gmail.com

This command sets up a user with administrative privileges, enabling you to manage workflows, users, and configurations through the web interface.
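
Keep in mind that the --password value shown above ends up in your shell history, so use a stronger password than admin for anything beyond a local test. In recent Airflow 2.x releases, omitting --password should make the command prompt for the password interactively, which keeps it out of the history (worth verifying on your version):

airflow users create --username admin --firstname admin --lastname admin --role Admin --email admin@gmail.com
# With --password omitted, Airflow prompts for the password instead of reading it from the command line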



Step 10: Verify Users

Check the list of users to confirm that the admin user has been created successfully.

airflow users list

This command displays all registered users, including their roles and email addresses.


Step 11: Start the Airflow Scheduler

The Airflow scheduler monitors task dependencies and triggers task execution. Start it using:

airflow scheduler

This command launches the scheduler, ensuring your tasks are executed as per their defined schedules.
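
Note that airflow scheduler runs in the foreground and keeps the terminal occupied. Either open a second terminal for the next step (and re-activate the virtual environment and re-export AIRFLOW_HOME there), or run the scheduler and web server as background daemons with the -D flag:

airflow scheduler -D
airflow webserver -D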


Step 12: Launch the Airflow Web Server

The web server provides a user-friendly interface for managing workflows, monitoring task execution, and troubleshooting errors. Start it using:

airflow webserver

Once the web server is running, open your browser and navigate to http://localhost:8080 (the web server's default port).

Log in using the admin credentials you created earlier. From the dashboard, you can:

  • View DAGs: Monitor workflows and their execution status.

  • Trigger Tasks: Manually start workflows.

  • Analyze Logs: Check logs for troubleshooting and debugging.



Understanding the Airflow Workflow

Now that Airflow is installed and set up, here’s how its key components work together:

  1. DAGs: Define workflows, task dependencies, and schedules.

  2. Scheduler: Orchestrates task execution based on DAG definitions.

  3. Executor: Runs the tasks (e.g., LocalExecutor for small setups, CeleryExecutor for distributed systems); see the example after this list for how to switch executors.

  4. Web Interface: Provides a visual representation of workflows and execution history.
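
The executor is selected with the executor option in the [core] section of airflow.cfg, or the matching environment variable. For example, to switch from the default SequentialExecutor to the LocalExecutor (which requires a non-SQLite metadata database such as PostgreSQL, since SQLite only supports sequential execution):

# Overrides [core] executor for processes started from this shell
export AIRFLOW__CORE__EXECUTOR=LocalExecutor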


Best Practices for Airflow Setup

  • Use a dedicated database: Switch from SQLite to PostgreSQL or MySQL for better performance.

  • Set up environment variables: Store sensitive data like database credentials in environment variables or Airflow’s connection manager (see the example after this list).

  • Enable logging: Configure detailed logs to troubleshoot issues effectively.
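
As an example of the environment-variable approach, Airflow treats any variable named AIRFLOW_CONN_<CONN_ID> (with the connection given in URI form) as a connection definition. The connection id my_postgres and the credentials below are placeholders:

# Defines a connection with id "my_postgres"; replace host, user, and password with real values
export AIRFLOW_CONN_MY_POSTGRES="postgresql://db_user:db_pass@localhost:5432/analytics"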




