Pros of Apache Oozie

Be the first to leave a pro

Pros of Zookeeper
- High performance, easy to generate node-specific config (11)
- Kafka support (8)
- Java (8)
- Spring Boot support (5)
- Supports extensive distributed IPC (3)
- Used in ClickHouse (2)
- Supports DC/OS (2)
- Embeddable in Java services (1)
- Curator (1)
- Used in Hadoop (1)
What is Apache Oozie?
Apache Oozie is a server-based workflow scheduling system for managing Hadoop jobs. Workflows are defined as a collection of control-flow and action nodes in a directed acyclic graph (DAG). Control-flow nodes define the beginning and end of a workflow, as well as the mechanisms that control its execution path.
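As a minimal sketch, a workflow of that shape might be written as the following `workflow.xml` (the workflow and node names here are illustrative, not from the source):

```xml
<workflow-app name="demo-wf" xmlns="uri:oozie:workflow:0.5">
    <!-- control-flow node: entry point of the DAG -->
    <start to="mr-step"/>

    <!-- action node: runs a MapReduce job (job configuration omitted for brevity) -->
    <action name="mr-step">
        <map-reduce>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
        </map-reduce>
        <ok to="end"/>
        <error to="fail"/>
    </action>

    <!-- control-flow nodes: the terminal states of the workflow -->
    <kill name="fail">
        <message>Job failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
```

The `<start>`, `<kill>`, and `<end>` elements are the control-flow nodes described above; each `<action>` declares where execution goes on success (`<ok>`) and on failure (`<error>`), which is what makes the whole definition a DAG.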
What is Zookeeper?
Zookeeper is a centralized service for maintaining configuration information, naming, providing distributed synchronization, and providing group services. Distributed applications use all of these kinds of services in some form or another.
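As a sketch of what that looks like in practice, a `zkCli.sh` session against a running server might store shared configuration and track group membership like this (paths and data are illustrative, and a ZooKeeper server is assumed to be reachable at localhost:2181):

```
$ zkCli.sh -server localhost:2181

# centralized configuration: one writer updates, every client reads the same value
create /app ''
create /app/config '{"db_host":"db1.internal"}'
get /app/config

# group membership: ephemeral znodes vanish automatically when their session dies
create /app/workers ''
create -e /app/workers/worker-1 ''
ls /app/workers
```

The ephemeral flag (`-e`) is what makes ZooKeeper useful for liveness tracking and leader election: a crashed client's znodes disappear on their own, and other clients can watch for that change.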
What are some alternatives to Apache Oozie and Zookeeper?
Apache Spark
Spark is a fast and general processing engine compatible with Hadoop data. It can run in Hadoop clusters through YARN or Spark's standalone mode, and it can process data in HDFS, HBase, Cassandra, Hive, and any Hadoop InputFormat. It is designed to perform both batch processing (similar to MapReduce) and new workloads like streaming, interactive queries, and machine learning.
Airflow
Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command-line utilities make performing complex surgeries on DAGs a snap, and the rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed.
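A minimal DAG file gives the flavor of that authoring model (a sketch assuming Airflow 2.x; the DAG and task names are hypothetical):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# A two-task DAG: `extract` must finish before `load` starts.
with DAG(
    dag_id="example_pipeline",      # hypothetical name
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,         # run only when triggered manually
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extracting")
    load = BashOperator(task_id="load", bash_command="echo loading")

    extract >> load  # declare the dependency edge of the DAG
```

Dropping this file into the DAGs folder is all it takes: the scheduler picks it up, resolves the `extract >> load` edge, and dispatches each task to a worker in dependency order.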
Apache NiFi
An easy-to-use, powerful, and reliable system to process and distribute data. It supports powerful and scalable directed graphs of data routing, transformation, and system-mediation logic.
Yarn
Yarn caches every package it downloads, so it never needs to download the same package again. It also parallelizes operations to maximize resource utilization, so install times are faster than ever.
Apache Beam
It implements batch and streaming data processing jobs that run on any execution engine. It executes pipelines on multiple execution environments.