Airflow vs Apache Spark

What is Airflow?

Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The Airflow scheduler executes your tasks on an array of workers while respecting the specified dependencies. Rich command line utilities make performing complex surgeries on DAGs a snap, and the rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed.
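
As a minimal sketch of what authoring a DAG looks like (the DAG id, task ids, and schedule below are illustrative, and the imports assume Airflow 2.x):

from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

# Illustrative two-task pipeline: "extract" must finish before "load" starts.
with DAG(
    dag_id="example_pipeline",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extracting")
    load = BashOperator(task_id="load", bash_command="echo loading")

    extract >> load  # declare the dependency edge of the DAG

The scheduler then runs "extract" and "load" once per day, in that order, retrying and surfacing failures through the UI.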

What is Apache Spark?

Spark is a fast and general processing engine compatible with Hadoop data. It can run in Hadoop clusters through YARN or Spark's standalone mode, and it can process data in HDFS, HBase, Cassandra, Hive, and any Hadoop InputFormat. It is designed to perform both batch processing (similar to MapReduce) and new workloads like streaming, interactive queries, and machine learning.
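
As an illustrative sketch of the DataFrame API (the input path and column names are hypothetical), a small PySpark batch job might look like:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example").getOrCreate()

# Read a dataset; the source could equally be HDFS, Hive, Cassandra, etc.
df = spark.read.csv("hdfs:///data/events.csv", header=True, inferSchema=True)

# A MapReduce-style aggregation expressed with the DataFrame API
counts = df.groupBy("event_type").agg(F.count("*").alias("n"))
counts.show()

spark.stop()

The same API scales from a local test run to a YARN or standalone cluster without code changes.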

What are some alternatives to Airflow and Apache Spark?
Hadoop
The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage.
Splunk
Splunk Inc. provides the leading platform for Operational Intelligence. Customers use Splunk to search, monitor, analyze and visualize machine data.
Cassandra
Partitioning means that Cassandra can distribute your data across multiple machines in an application-transparent manner. Cassandra will automatically repartition as machines are added to and removed from the cluster. Row store means that, like relational databases, Cassandra organizes data by rows and columns. The Cassandra Query Language (CQL) is a close relative of SQL.
Apache Flink
Apache Flink is an open source system for fast and versatile data analytics in clusters. Flink supports batch and streaming analytics, in one system. Analytical programs can be written in concise and elegant APIs in Java and Scala.
Amazon Athena
Amazon Athena is an interactive query service that makes it easy to analyze data in Amazon S3 using standard SQL. Athena is serverless, so there is no infrastructure to manage, and you pay only for the queries that you run.
How developers use Airflow and Apache Spark
Wei Chen uses Apache Spark

Spark is good at parallel data processing management. We wrote a neat program to handle the TBs of data we get every day.

Eugene Ivanchenko uses Airflow

Manages the calculation pipeline and data distribution procedures.

Ralic Lo uses Apache Spark

Used the Spark DataFrame API on SparkR for big data analysis.

Kalibrr uses Apache Spark

We use Apache Spark in computing our recommendations.

BrainFinance uses Apache Spark

As part of a big data machine learning stack (SMACK).

Dotmetrics uses Apache Spark

Big data analytics and nightly transformation jobs.

Christopher Davison uses Airflow

Used for scheduling ETL jobs.