What is CDAP?
Cask Data Application Platform (CDAP) is an open source application development platform for the Hadoop ecosystem. It provides developers with data and application virtualization to accelerate application development, address a broader range of real-time and batch use cases, and deploy applications into production while satisfying enterprise requirements.
CDAP is a tool in the Big Data Tools category of a tech stack.
CDAP is an open source tool with 549 GitHub stars and 275 GitHub forks; its source code is available in CDAP's repository on GitHub.
Who uses CDAP?
14 developers on StackShare have stated that they use CDAP.
CDAP's Features
- Streams for data ingestion
- Reusable libraries for common Big Data access patterns
- Data available to multiple applications and different paradigms
- Framework level guarantees
- Full development lifecycle and production deployment
- Standardization of applications across programming paradigms
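As an illustration of the ingestion model behind "Streams for data ingestion", here is a minimal sketch of writing a single event to a CDAP stream over its REST API. The host, port, namespace, and stream name are placeholder assumptions, and the v3 endpoint path should be verified against the CDAP documentation for your version.

```python
import json
from urllib import request

# Assumed CDAP router endpoint (default port 11015); "default" namespace
# and the "purchases" stream name are hypothetical placeholders.
url = "http://localhost:11015/v3/namespaces/default/streams/purchases"
event = json.dumps({"user": "alice", "amount": 42}).encode()

req = request.Request(url, data=event, method="POST")
req.add_header("Content-Type", "application/json")
# request.urlopen(req)  # uncomment to send the event to a running CDAP instance
```

The request is built but not sent here, so the sketch runs without a live cluster; in practice the same endpoint accepts one event per POST body.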
CDAP Alternatives & Comparisons
What are some alternatives to CDAP?
Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command-line utilities make performing complex surgeries on DAGs a snap. The rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed.
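An Airflow DAG is at heart a dependency graph of tasks, and the execution order it implies can be sketched with Python's standard-library topological sorter. The task names below are hypothetical, and this does not use Airflow itself:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each key's tasks may only run after its
# dependency set has completed (extract -> transform -> load -> report).
dag = {
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields one valid execution order respecting all edges;
# a scheduler like Airflow's does the same, but dispatches to workers.
order = list(TopologicalSorter(dag).static_order())
```

Because the graph is acyclic, a valid order always exists; if a cycle were introduced, `static_order()` would raise `CycleError`, which mirrors Airflow's refusal to load cyclic DAGs.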
Spark is a fast and general processing engine compatible with Hadoop data. It can run in Hadoop clusters through YARN or Spark's standalone mode, and it can process data in HDFS, HBase, Cassandra, Hive, and any Hadoop InputFormat. It is designed to perform both batch processing (similar to MapReduce) and new workloads like streaming, interactive queries, and machine learning.
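The batch-processing model Spark inherits from MapReduce can be sketched in plain Python: a map step emits key/value pairs and a reduce step combines them per key, much like Spark's `reduceByKey`. The input lines are made up, and no Spark installation is involved:

```python
from functools import reduce

lines = ["big data tools", "data pipelines", "big pipelines"]

# Map: split each line into (word, 1) pairs.
mapped = [(word, 1) for line in lines for word in line.split()]

# Reduce: sum the counts per key.
def combine(acc, pair):
    word, n = pair
    acc[word] = acc.get(word, 0) + n
    return acc

counts = reduce(combine, mapped, {})
```

In Spark the same logic is distributed: the map runs on partitions of an RDD or DataFrame across the cluster, and the per-key reduction is shuffled between workers rather than folded in a single process.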
A distributed knowledge graph store. Knowledge graphs are suitable for modeling data that is highly interconnected by many types of relationships, like encyclopedic information about the world.
An easy to use, powerful, and reliable system to process and distribute data. It supports powerful and scalable directed graphs of data routing, transformation, and system mediation logic.
An end-to-end data integration platform to build, run, monitor, and manage smart data pipelines that deliver continuous data for DataOps.