Airflow vs Pachyderm: What are the differences?

Airflow is a platform, created at Airbnb, to programmatically author, schedule, and monitor data pipelines as directed acyclic graphs (DAGs) of tasks. Pachyderm is "MapReduce without Hadoop": an open source MapReduce engine that uses Docker containers for distributed computations over massive datasets.

Airflow and Pachyderm are primarily classified as "Workflow Manager" and "Big Data" tools, respectively.

Some of the features offered by Airflow are:

  • Dynamic: Airflow pipelines are configured as code (Python), allowing for dynamic pipeline generation. This makes it possible to write code that instantiates pipelines dynamically (see the sketch after this list).
  • Extensible: Easily define your own operators and executors, and extend the library so that it fits the level of abstraction that suits your environment.
  • Elegant: Airflow pipelines are lean and explicit. Parameterizing your scripts is built into the core of Airflow using the powerful Jinja templating engine.
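
To make "pipelines as code" concrete, here is a minimal sketch of a DAG whose tasks are generated from a plain Python list (assuming Airflow 2.x; the dag_id, URLs, and task names are hypothetical):

    # Minimal sketch: an Airflow DAG whose tasks are generated dynamically
    # from ordinary Python data. Assumes Airflow 2.x; names are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    URLS = ["https://example.com/a", "https://example.com/b"]  # hypothetical

    def fetch(url):
        print(f"fetching {url}")

    with DAG(dag_id="dynamic_fetch", start_date=datetime(2024, 1, 1),
             schedule_interval="@daily", catchup=False) as dag:
        # Because the pipeline is plain Python, tasks can be created in a loop.
        for i, url in enumerate(URLS):
            PythonOperator(task_id=f"fetch_{i}", python_callable=fetch,
                           op_kwargs={"url": url})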

On the other hand, Pachyderm provides the following key features:

  • Git-like File System (see the sketch after this list)
  • Dockerized MapReduce
  • Microservice Architecture
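
The Git-like file system is the easiest feature to see in action: data lives in repos, and every write is a commit on a branch, so inputs are versioned like source code. A minimal sketch using the python-pachyderm client (the repo and file names are hypothetical, and the exact method names vary across client versions):

    # Minimal sketch of Pachyderm's Git-like file system via python-pachyderm
    # (older 6.x-style API; repo and file names are hypothetical).
    import python_pachyderm

    client = python_pachyderm.Client()  # local cluster by default

    client.create_repo("images")  # a repo, analogous to a Git repository

    # Writes happen inside a commit on a branch, as in Git.
    with client.commit("images", "master") as commit:
        client.put_file_bytes(commit, "/hello.txt", b"hello pachyderm")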

Airflow and Pachyderm are both open source tools. Airflow, with 13K stars and 4.72K forks on GitHub, appears to be more popular than Pachyderm, with 3.81K stars and 369 forks.

Advice on Airflow and Pachyderm
Needs advice on Apache Spark, Luigi, and Airflow
I am so confused. I need a tool that will let me go to about 10 different URLs to get a list of objects. Those object lists will be hundreds or thousands of items long. I then need to get detailed data about each object; those detail lists can have hundreds of elements that could be map/reduced somehow.

My batch process sometimes dies halfway through, which means hours of processing are gone, i.e. time wasted. I need something like a directed graph that keeps the results of successful data collection and lets me retry the failed pieces, either programmatically or manually, some number of times (0 to forever). I then want it to process everything that has succeeded or been effectively ignored, and load the data store with the aggregation of some couple thousand data points.

I know hitting this many endpoints is not good practice, but I can't put collectors on all the endpoints or anything like that. It is pretty much the only way to get the data.
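
For what it's worth, the "retry only the failed pieces" requirement maps directly onto Airflow's per-task retry policy: a failed task instance is retried on its own schedule, while successful upstream tasks keep their results. A minimal sketch (the task and callable names are hypothetical):

    # Minimal sketch: per-task retries in Airflow. Only the failed task is
    # re-run; completed tasks are not repeated. Names are hypothetical.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def fetch_objects():
        # Call one endpoint here; raising an exception triggers a retry.
        raise NotImplementedError("replace with the real fetch")

    with DAG(dag_id="scrape", start_date=datetime(2024, 1, 1),
             schedule_interval=None, catchup=False) as dag:
        PythonOperator(
            task_id="fetch_objects",
            python_callable=fetch_objects,
            retries=5,                          # a fixed count, not unbounded
            retry_delay=timedelta(minutes=5),   # wait between attempts
        )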

Replies (1)
Gilroy Gordon, Solution Architect at IGonics Limited
Recommends: Cassandra
For a non-streaming approach:

You could consider using more checkpoints throughout your Spark jobs. You could also separate the workload into multiple jobs with an intermediate data store (Cassandra is one suggestion; choose based on your needs and availability) to store results, perform aggregations, and store the results of those.

  • Spark Job 1: fetch data from the 10 URLs and store the data and metadata in a data store (Cassandra).
  • Spark Jobs 2..n: check the data store for unprocessed items and continue the aggregation.

Alternatively, for a streaming approach: treating your data as a stream might also be useful. Spark Streaming allows you to set a checkpoint interval: https://spark.apache.org/docs/latest/streaming-programming-guide.html#checkpointing
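
A rough sketch of the non-streaming shape described above, assuming PySpark plus the DataStax spark-cassandra-connector on the classpath (the keyspace, table, and status values are hypothetical):

    # Rough sketch: Job 1 persists fetched data with a status flag; Jobs 2..n
    # resume from whatever is still unprocessed, so a crash mid-run does not
    # force redoing finished work. Assumes spark-cassandra-connector; the
    # keyspace/table/status names are hypothetical.
    import urllib.request

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("fetch-and-aggregate").getOrCreate()

    def fetch(url):
        with urllib.request.urlopen(url) as resp:
            return resp.read().decode()

    urls = ["https://example.com/api/1", "https://example.com/api/2"]  # ~10

    # Job 1: fetch and persist raw payloads plus a processing status.
    rows = [(u, fetch(u), "unprocessed") for u in urls]
    (spark.createDataFrame(rows, ["url", "payload", "status"])
        .write.format("org.apache.spark.sql.cassandra")
        .options(table="raw_objects", keyspace="ingest")
        .mode("append").save())

    # Jobs 2..n: read back only unprocessed rows and continue the aggregation.
    pending = (spark.read.format("org.apache.spark.sql.cassandra")
               .options(table="raw_objects", keyspace="ingest").load()
               .filter("status = 'unprocessed'"))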

Pros of Airflow (community upvotes in parentheses)
  • Features (43)
  • Task Dependency Management (13)
  • Beautiful UI (12)
  • Cluster of workers (11)
  • Extensibility (10)
  • Open source (5)
  • Python (4)
  • Complex workflows (4)
  • K (3)
  • Dashboard (2)
  • Good API (2)
  • Custom operators (2)
  • Apache project (1)

Pros of Pachyderm (community upvotes in parentheses)
  • Containers (3)
  • Versioning (1)
  • Can run on GCP or AWS (1)


Cons of Airflow (community upvotes in parentheses)
  • Open source: provides minimal or no support (1)
  • Logical separation of DAGs is not straightforward (1)
  • Running it on a Kubernetes cluster is relatively complex (1)
  • Observability is not great when the DAGs exceed 250 (1)

Cons of Pachyderm
  • None listed yet


What is Airflow?

Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command line utilities make performing complex surgeries on DAGs a snap. The rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed.

What is Pachyderm?

Pachyderm is an open source MapReduce engine that uses Docker containers for distributed computations.



What are some alternatives to Airflow and Pachyderm?

Luigi
A Python module that helps you build complex pipelines of batch jobs. It handles dependency resolution, workflow management, visualization, etc. It also comes with Hadoop support built in.

Apache NiFi
An easy to use, powerful, and reliable system to process and distribute data. It supports powerful and scalable directed graphs of data routing, transformation, and system mediation logic.

Jenkins
In a nutshell, Jenkins CI is the leading open-source continuous integration server. Built with Java, it provides over 300 plugins to support building and testing virtually any project.

AWS Step Functions
AWS Step Functions makes it easy to coordinate the components of distributed applications and microservices using visual workflows. Building applications from individual components that each perform a discrete function lets you scale and change applications quickly.

Kubeflow
The Kubeflow project is dedicated to making Machine Learning on Kubernetes easy, portable, and scalable by providing a straightforward way to spin up best-of-breed OSS solutions.