Dremio vs Snowflake: What are the differences?

Introduction:

Dremio and Snowflake are both popular data platforms that help organizations manage and analyze their data, but they differ in several important ways. The six points below contrast their functionality and capabilities for data management and analysis.

1. Dremio: Native Execution Engine vs Snowflake: Virtualized Execution: Dremio uses a native execution engine that runs queries directly against the underlying data sources, which can mean faster processing and better performance. Snowflake instead follows a virtualized execution model backed by its own query optimizer, which lets it optimize queries and distribute compute resources more efficiently, sometimes at the cost of slightly slower execution.

2. Dremio: Self-Service Data Integration vs Snowflake: Traditional ETL Pipeline: Dremio prioritizes self-service data integration, empowering users to directly access and integrate various data sources without relying heavily on traditional extract, transform, and load (ETL) pipelines. On the other hand, Snowflake follows a more traditional approach by using ETL pipelines for data integration, which typically involves more steps and additional configuration.
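The traditional extract, transform, load pattern mentioned above can be sketched in a few lines. This is a minimal, self-contained illustration using Python's standard library, with SQLite standing in for the warehouse and an in-memory CSV standing in for a source system; the `sales` table and column names are hypothetical.

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV (an in-memory sample standing in for a source file)
raw = "region,amount\neast,100\nwest,250\neast,50\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: normalize types and drop non-positive amounts
clean = [(r["region"], int(r["amount"])) for r in rows if int(r["amount"]) > 0]

# Load: write into a warehouse table (SQLite stands in for the warehouse)
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
con.executemany("INSERT INTO sales VALUES (?, ?)", clean)

result = con.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(result)  # [('east', 150), ('west', 250)]
```

A self-service tool like Dremio aims to let analysts skip writing and scheduling this kind of pipeline by querying sources in place.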

3. Dremio: Data Reflections vs Snowflake: Materialized Views: Dremio integrates a feature called data reflections, which are pre-aggregated and accelerated data representations stored in memory. This enhances query performance by reducing the need for extensive data processing during analysis. In contrast, Snowflake adopts materialized views, which are similar in concept but implemented differently. Materialized views in Snowflake require explicit creation and may not offer the same ease of use and performance optimization features as Dremio's data reflections.
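The shared idea behind reflections and materialized views is precomputing an aggregate once so later queries hit a small summary instead of re-scanning the raw data. Here is a minimal conceptual sketch using SQLite from Python's standard library (neither Dremio nor Snowflake syntax); the `orders` table and the explicit "refresh by recreating" step are illustrative assumptions.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (customer TEXT, total REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [("acme", 10.0), ("acme", 15.0), ("globex", 7.5)])

# "Materialize" the aggregate once; later queries read the small summary
# table instead of re-scanning orders (the idea behind both features).
con.execute("""CREATE TABLE orders_by_customer AS
               SELECT customer, SUM(total) AS total
               FROM orders GROUP BY customer""")

acme_total = con.execute(
    "SELECT total FROM orders_by_customer WHERE customer = 'acme'"
).fetchone()[0]
print(acme_total)  # 25.0
```

The practical difference described above is who manages this: Snowflake requires you to create (and the engine to maintain) the materialized view explicitly, while Dremio's reflections are intended to be transparent, with the optimizer substituting them into queries automatically.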

4. Dremio: Interactive Analytics Platform vs Snowflake: Cloud Data Warehouse: Dremio positions itself as an interactive analytics platform, providing users with an interactive and exploratory experience while querying and analyzing data. Snowflake, on the other hand, is primarily marketed as a cloud data warehouse, designed to store and manage large volumes of structured and semi-structured data, with a focus on delivering scalability, durability, and elasticity in a cloud environment.

5. Dremio: Open-Source Core with Enterprise Edition vs Snowflake: Proprietary Data Platform: Dremio offers an open-source core with its community edition, allowing users to access and customize the platform's codebase. Additionally, Dremio provides an enterprise edition with additional enterprise-grade features, support, and scalability options. In contrast, Snowflake is a proprietary data platform, offering a unified and fully managed service with limited customization options compared to Dremio's open-source core.

6. Dremio: On-Premises and Cloud Deployment Options vs Snowflake: Cloud-Only Deployment: Dremio provides users with the flexibility to deploy the platform on-premises or in the cloud, allowing organizations to choose the deployment option that best suits their infrastructure and security requirements. In contrast, Snowflake primarily offers a cloud-only deployment model, where all the data and processing are hosted in the cloud, limiting deployment choices for organizations with specific on-premises requirements.

In summary, Dremio offers a native execution engine, self-service data integration, data reflections for performance optimization, an interactive analytics platform, an open-source core with an enterprise edition, and both on-premises and cloud deployment options. In comparison, Snowflake uses a virtualized execution approach, relies on traditional ETL pipelines, offers materialized views for optimization, positions itself as a cloud data warehouse, provides a proprietary data platform, and primarily supports cloud-only deployment.

Advice on Dremio and Snowflake

We need to perform ETL from several databases into a data warehouse or data lake. We want to

  • keep raw and transformed data available to users to draft their own queries efficiently
  • grant users custom permissions and support single sign-on (SSO)
  • move between open-source on-premises development and cloud-based production environments

We want to use inexpensive Amazon EC2 instances only, on medium-sized data sets (16 GB to 32 GB), feeding into Tableau Server or Power BI for reporting and data analysis.

Replies (3)
John Nguyen
Recommends
on
Airflow, AWS Lambda

You could also use AWS Lambda with a CloudWatch Events schedule if you know when the function should be triggered. The benefit is that you can use any language along with the respective database client.
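A scheduled Lambda for this is just a handler function that the CloudWatch Events rule invokes on a cron. The sketch below is a minimal skeleton, not a full ETL job: the `DB_HOST` environment variable is a hypothetical name, and the actual database work is left as a comment since it depends on your client library (psycopg2, pymysql, etc.).

```python
import json
import os

def handler(event, context):
    """Entry point invoked by a CloudWatch Events schedule.

    In a real deployment you would open a database connection here using
    credentials from environment variables or Secrets Manager; DB_HOST is
    a hypothetical variable name used for illustration.
    """
    host = os.environ.get("DB_HOST", "localhost")
    # ... run the extract/load queries against `host` here ...
    return {"statusCode": 200, "body": json.dumps({"loaded_from": host})}

# Local smoke test: a scheduled event carries no meaningful payload
response = handler({"source": "aws.events"}, None)
print(response)
```

Packaging this with the database client as a Lambda deployment package (or container image) and attaching the schedule rule is all the orchestration a simple, time-driven job needs.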

But if you are orchestrating multiple ETL jobs, it makes sense to use Apache Airflow. Note that this requires Python knowledge.

Recommends
on
Airflow

Though we have always built something custom, Apache Airflow (https://airflow.apache.org/) stood out as a key contender/alternative among open-source options. On the commercial side, Amazon Redshift combined with Amazon Kinesis (for complex manipulations) is great for BI, though Redshift itself is expensive.

Recommends

You may want to look into a data virtualization product called Conduit. It connects to disparate data sources in AWS, on-prem, Azure, and GCP, and exposes them as a single unified Spark SQL view to Power BI (direct query) or Tableau. It allows auto-query and caching policies to improve query speed and experience, has a GPU query engine with optimized Spark as fallback, and can be deployed on your AWS VM or on-prem, scaling both up and out. It sounds like the ideal solution for your needs.

karunakaran karthikeyan
Needs advice
on
Dremio
and
Talend

I am trying to build a data lake by pulling data from multiple data sources (custom-built tools, Excel files, CSV files, etc.) and use the data lake to generate dashboards.

My question is which is the best tool to do the following:

  1. Create pipelines to ingest the data from multiple sources into the data lake
  2. Help me in aggregating and filtering data available in the data lake.
  3. Create new reports by combining different data elements from the data lake.

I need to use only open-source tools for this activity.

I appreciate your valuable inputs and suggestions. Thanks in advance.

Replies (1)
Rod Beecham
Partnering Lead at Zetaris
Recommends
on
Dremio

Hi Karunakaran. I obviously have an interest here, as I work for the company, but the problem you are describing is one that Zetaris can solve. Talend is a good ETL product, and Dremio is a good data virtualization product, but the problem you describe best fits a tool that can combine the five styles of data integration: bulk/batch data movement, data replication/data synchronization, message-oriented movement of data, data virtualization, and stream data integration. I may be wrong, but Zetaris is, to the best of my knowledge, the only product in the world that can do this.

Zetaris is not a dashboarding tool - you would need to combine us with Tableau or Qlik or Power BI (or whatever) - but Zetaris can consolidate data from any source and any location (structured, unstructured, on-prem or in the cloud) in real time to give clients a consolidated view of whatever they want, whenever they want it. Please take a look at www.zetaris.com for more information. I don't want to do a "hard sell" here, so I'll say no more! Warmest regards, Rod Beecham.

Pros of Dremio
  • 3 Nice GUI to enable more people to work with data
  • 2 Connect NoSQL databases with RDBMS
  • 2 Easier to deploy
  • 1 Free

Pros of Snowflake
  • 7 Public and private data sharing
  • 4 Multicloud
  • 4 Good performance
  • 4 User friendly
  • 3 Great documentation
  • 2 Serverless
  • 1 Economical
  • 1 Usage-based billing
  • 1 Innovative

Cons of Dremio
  • 1 Works only on Iceberg structured data

Cons of Snowflake
  • Be the first to leave a con


What is Dremio?

Dremio, the data lake engine, operationalizes your data lake storage and speeds up your analytics processes with a high-performance, high-efficiency query engine, while also democratizing data access for data scientists and analysts.

What is Snowflake?

Snowflake eliminates the administration and management demands of traditional data warehouses and big data platforms. It is a true data warehouse as a service running on Amazon Web Services (AWS), with no infrastructure to manage and no knobs to turn.

What are some alternatives to Dremio and Snowflake?

  • Presto: Distributed SQL query engine for big data.
  • Apache Drill: A distributed MPP query layer that supports SQL and alternative query languages against NoSQL and Hadoop data storage systems. It was inspired in part by Google's Dremel.
  • Denodo: A leader in data virtualization, providing data access, data governance, and data delivery capabilities across a broad range of enterprise, cloud, big data, and unstructured data sources without moving the data from their original repositories.
  • AtScale: Its Virtual Data Warehouse delivers the performance, security, and agility to meet the demands of modern operational analytics.
  • Segment: A single hub for customer data. Collect your data in one place, then send it to more than 100 third-party tools, internal systems, or Amazon Redshift with the flip of a switch.