CDAP vs Hue: What are the differences?

CDAP (Cask Data Application Platform) is an open source application development platform for the Hadoop ecosystem. It gives developers data and application virtualization to accelerate application development, address a broader range of real-time and batch use cases, and deploy applications into production while satisfying enterprise requirements. Hue, by contrast, is an open source SQL workbench for data warehouses: it lets regular users import their big data, query it, search it, visualize it, and build dashboards on top of it, all from their browser. In short, CDAP is aimed at developers building and operating data applications and pipelines, while Hue is aimed at end users exploring and querying data interactively from a web UI.

CDAP and Hue belong to the "Big Data Tools" category of the tech stack.

CDAP is an open source tool with 354 GitHub stars and 181 GitHub forks; its source repository is hosted on GitHub.


What is CDAP?

Cask Data Application Platform (CDAP) is an open source application development platform for the Hadoop ecosystem. It provides developers with data and application virtualization to accelerate application development, address a broader range of real-time and batch use cases, and deploy applications into production while satisfying enterprise requirements.
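
For a concrete sense of what "deploying applications into production" looks like, CDAP drives its application lifecycle through a REST API. The sketch below deploys a pipeline application from CDAP's system data-pipeline artifact and then starts it. The host and port, app name, artifact version, and the empty pipeline config are all illustrative assumptions, so treat this as a rough sketch rather than a verified recipe.

```python
# Hedged sketch: deploy and start a CDAP batch pipeline via its REST API.
# Host/port, app name, artifact version, and config are assumptions.
import requests

CDAP = "http://localhost:11015/v3"   # assumed local CDAP sandbox endpoint
APP = f"{CDAP}/namespaces/default/apps/myPipeline"

# Create the application from the system data-pipeline artifact.
pipeline_spec = {
    "artifact": {
        "name": "cdap-data-pipeline",
        "version": "6.9.1",          # assumed; match your CDAP version
        "scope": "SYSTEM",
    },
    "config": {"stages": [], "connections": []},  # placeholder pipeline body
}
requests.put(APP, json=pipeline_spec).raise_for_status()

# Batch pipelines run under a workflow named DataPipelineWorkflow.
requests.post(f"{APP}/workflows/DataPipelineWorkflow/start").raise_for_status()
print("pipeline started")
```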

What is Hue?

Hue is an open source SQL workbench for data warehouses. It lets regular users import their big data, query it, search it, visualize it, and build dashboards on top of it, all from their browser.


What tools integrate with CDAP?
What tools integrate with Hue?
No integrations found.

What are some alternatives to CDAP and Hue?

Airflow
Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks; a minimal DAG sketch appears after this list. The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command line utilities make performing complex surgeries on DAGs a snap. The rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed.

Apache Spark
Spark is a fast and general processing engine compatible with Hadoop data; a PySpark sketch also appears after this list. It can run in Hadoop clusters through YARN or Spark's standalone mode, and it can process data in HDFS, HBase, Cassandra, Hive, and any Hadoop InputFormat. It is designed to perform both batch processing (similar to MapReduce) and new workloads like streaming, interactive queries, and machine learning.

Akutan
A distributed knowledge graph store. Knowledge graphs are suitable for modeling data that is highly interconnected by many types of relationships, like encyclopedic information about the world.

Apache NiFi
An easy-to-use, powerful, and reliable system to process and distribute data. It supports powerful and scalable directed graphs of data routing, transformation, and system mediation logic.

StreamSets
An end-to-end data integration platform to build, run, monitor, and manage smart data pipelines that deliver continuous data for DataOps.
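
Airflow's model, as described in its entry above, is straightforward to show in code. Below is a minimal sketch of an Airflow DAG (assuming Airflow 2.4+ for the `schedule` parameter); the task names and bodies are placeholders invented for illustration.

```python
# Minimal sketch of Airflow's model: a DAG of two tasks where the
# scheduler runs `extract` before `load` because of the declared
# dependency. Task names and bodies are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pulling data from the source system")


def load():
    print("writing data to the warehouse")


with DAG(
    dag_id="example_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # the scheduler triggers one run per day
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # dependency edge: extract runs before load
```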
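
Likewise, the batch workload described in the Apache Spark entry can be sketched in a few lines of PySpark; the HDFS input path is a made-up example.

```python
# Minimal PySpark sketch of a batch workload: read a text file
# (e.g. from HDFS) and count word occurrences.
from pyspark.sql import SparkSession
from pyspark.sql.functions import explode, split

spark = SparkSession.builder.appName("wordcount").getOrCreate()

lines = spark.read.text("hdfs:///data/input.txt")  # illustrative path
words = lines.select(explode(split(lines.value, " ")).alias("word"))
counts = words.groupBy("word").count()
counts.show()

spark.stop()
```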