
Airflow

A platform to programmatically author, schedule, and monitor data pipelines, by Airbnb

What is Airflow?

Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command line utilities make performing complex surgeries on DAGs a snap. The rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed.
Airflow is a tool in the Workflow Manager category of a tech stack.
Airflow is an open source tool with 18.7K GitHub stars and 7.3K GitHub forks. Here’s a link to Airflow's open source repository on GitHub.
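
A minimal sketch of what such a DAG looks like in Python (the DAG id, schedule, and task commands are illustrative, and the imports assume Airflow 1.10-style module paths):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    with DAG(
        dag_id="example_pipeline",          # hypothetical DAG name
        start_date=datetime(2020, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = BashOperator(task_id="extract", bash_command="echo extracting")
        transform = BashOperator(task_id="transform", bash_command="echo transforming")
        load = BashOperator(task_id="load", bash_command="echo loading")

        # The scheduler runs the tasks in dependency order: extract -> transform -> load
        extract >> transform >> load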

Who uses Airflow?

Companies
186 companies reportedly use Airflow in their tech stacks, including Airbnb, Slack, and Robinhood.

Developers
654 developers on StackShare have stated that they use Airflow.

Public Decisions about Airflow

Here are some stack decisions, common use cases and reviews by companies and developers who chose Airflow in their tech stack.

Shared insights on Jenkins and Airflow

I am looking for an open-source scheduler tool with cross-functional application dependencies. Some of the tasks I am looking to schedule are as follows:

  1. Trigger Matillion ETL loads
  2. Trigger Attunity Replication tasks that have downstream ETL loads
  3. Trigger GoldenGate replication tasks
  4. Shell scripts, wrappers, file watchers
  5. Event-driven schedules

I have used Airflow in the past, and I know we need to create DAGs for each pipeline. I am not familiar with Jenkins, but I know it works through configuration without much underlying code. I want to evaluate both and would appreciate any advice.
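
For items 3-5 in the list above (replication triggers, shell-script wrappers, and file watchers), a rough Airflow sketch could look like the following; the file path, script, and DAG ids are hypothetical, and the imports assume Airflow 1.10-style module paths:

    from datetime import datetime

    from airflow import DAG
    from airflow.contrib.sensors.file_sensor import FileSensor
    from airflow.operators.bash_operator import BashOperator
    from airflow.operators.dagrun_operator import TriggerDagRunOperator

    with DAG(
        dag_id="replication_wrapper",           # hypothetical DAG name
        start_date=datetime(2020, 1, 1),
        schedule_interval="0 2 * * *",          # nightly run
        catchup=False,
    ) as dag:
        # File watcher: wait for an incoming landing file before starting anything
        wait_for_file = FileSensor(
            task_id="wait_for_landing_file",
            filepath="/data/incoming/ready.flag",   # hypothetical path
            poke_interval=60,
        )

        # Shell-script wrapper (trailing space stops Jinja treating the *.sh path as a template)
        run_replication = BashOperator(
            task_id="run_replication",
            bash_command="/opt/scripts/start_replication.sh ",  # hypothetical script
        )

        # Kick off the downstream ETL DAG once replication has finished
        trigger_etl = TriggerDagRunOperator(
            task_id="trigger_etl_dag",
            trigger_dag_id="downstream_etl",        # hypothetical DAG id
        )

        wait_for_file >> run_replication >> trigger_etl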

sunilsy08, Software Developer at WedMeGood
Shared insights on Python, Airflow, and Node.js

I need to implement a Node.js cron scheduler like Airflow. Is it possible to implement it without working in Python? Until now, all my jobs have run on my server via internal scripts calling other job scripts. Is there an alternative or better way to implement this?


I am looking for the best tool to orchestrate #ETL workflows in non-Hadoop environments, mainly for regression testing use cases. Would Airflow or Apache NiFi be a good fit for this purpose?

For example, I want to run an Informatica ETL job and then run an SQL task as a dependency, followed by another task from Jira. What tool is best suited to set up such a pipeline?
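
As a rough sketch of that Informatica -> SQL -> Jira chain in Airflow terms, the tasks can be wired with the ">>" dependency operator; the callables below are hypothetical placeholders for the real client calls:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python_operator import PythonOperator

    def start_informatica_job():
        """Placeholder: start the Informatica ETL job (e.g. via its REST API or CLI)."""

    def run_sql_check():
        """Placeholder: run the dependent SQL task via a database hook or client library."""

    def create_jira_task():
        """Placeholder: create or transition a Jira issue via its REST API."""

    with DAG(
        dag_id="regression_test_pipeline",   # hypothetical DAG name
        start_date=datetime(2020, 1, 1),
        schedule_interval=None,              # trigger manually for regression runs
    ) as dag:
        etl = PythonOperator(task_id="informatica_etl", python_callable=start_informatica_job)
        sql = PythonOperator(task_id="sql_task", python_callable=run_sql_check)
        jira = PythonOperator(task_id="jira_task", python_callable=create_jira_task)

        etl >> sql >> jira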

Shared insights on AWS Step Functions and Airflow

I am working on a project that grabs a set of input data from AWS S3, pre-processes and divvies it up, spins up 10K batch containers to process the divvied data in parallel on AWS Batch, post-aggregates the data, and pushes it to S3.

I already have software patterns from other projects for Airflow + Batch but have not dealt with the scaling factors of 10k parallel tasks. Airflow is nice since I can look at which tasks failed and retry a task after debugging. But dealing with that many tasks on one Airflow EC2 instance seems like a barrier. Another option would be to have one task that kicks off the 10k containers and monitors them from there.

I have no experience with AWS Step Functions but have heard it's AWS's Airflow. There looks to be plenty of patterns online for Step Functions + Batch. Do Step Functions seem like a good path to check out for my use case? Do you get the same insights on failing jobs / ability to retry tasks as you do with Airflow?
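
As a hedged sketch of the "one task kicks off the 10k containers" option, a single Airflow task could submit an AWS Batch array job via boto3; the queue and job definition names are hypothetical, and the Airflow-level retries cover the submission step only:

    from datetime import datetime, timedelta

    import boto3
    from airflow import DAG
    from airflow.operators.python_operator import PythonOperator

    def submit_batch_array_job():
        """Submit one AWS Batch array job that fans out to 10,000 child containers."""
        batch = boto3.client("batch")
        batch.submit_job(
            jobName="divvied-data-processing",      # hypothetical
            jobQueue="my-batch-queue",              # hypothetical
            jobDefinition="my-processing-job:1",    # hypothetical
            arrayProperties={"size": 10000},        # each child reads AWS_BATCH_JOB_ARRAY_INDEX
        )

    with DAG(
        dag_id="s3_batch_aggregation",              # hypothetical DAG name
        start_date=datetime(2020, 1, 1),
        schedule_interval=None,
    ) as dag:
        submit = PythonOperator(
            task_id="submit_batch_jobs",
            python_callable=submit_batch_array_job,
            retries=3,                              # retry the submission after debugging
            retry_delay=timedelta(minutes=5),
        )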


Airflow's Features

  • Dynamic: Airflow pipelines are configuration as code (Python), allowing for dynamic pipeline generation. This lets you write code that instantiates pipelines dynamically (see the sketch after this list).
  • Extensible: Easily define your own operators and executors, and extend the library so that it fits the level of abstraction that suits your environment.
  • Elegant: Airflow pipelines are lean and explicit. Parameterizing your scripts is built into the core of Airflow using the powerful Jinja templating engine.
  • Scalable: Airflow has a modular architecture and uses a message queue to orchestrate an arbitrary number of workers. Airflow is ready to scale to infinity.
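
A minimal sketch of the "Dynamic" and "Elegant" points above: tasks generated in a loop from plain Python, with a Jinja-templated command. The table list and DAG id are illustrative, and the imports assume Airflow 1.10-style module paths:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    TABLES = ["users", "orders", "payments"]   # hypothetical list driving dynamic generation

    with DAG(
        dag_id="dynamic_exports",               # hypothetical DAG name
        start_date=datetime(2020, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        for table in TABLES:
            # One task per table, created dynamically when the DAG file is parsed
            BashOperator(
                task_id="export_" + table,
                # {{ ds }} is Airflow's built-in Jinja macro for the execution date
                bash_command="echo exporting " + table + " for {{ ds }}",
            )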

Airflow Alternatives & Comparisons

What are some alternatives to Airflow?
Luigi
It is a Python module that helps you build complex pipelines of batch jobs. It handles dependency resolution, workflow management, visualization etc. It also comes with Hadoop support built in.
Apache NiFi
An easy to use, powerful, and reliable system to process and distribute data. It supports powerful and scalable directed graphs of data routing, transformation, and system mediation logic.
Jenkins
In a nutshell Jenkins CI is the leading open-source continuous integration server. Built with Java, it provides over 300 plugins to support building and testing virtually any project.
AWS Step Functions
AWS Step Functions makes it easy to coordinate the components of distributed applications and microservices using visual workflows. Building applications from individual components that each perform a discrete function lets you scale and change applications quickly.
Pachyderm
Pachyderm is an open source MapReduce engine that uses Docker containers for distributed computations.

Airflow's Followers
1365 developers follow Airflow to keep up with related blogs and decisions.