Amazon Redshift vs Amazon Redshift Spectrum


Amazon Redshift vs Amazon Redshift Spectrum: What are the differences?

Developers describe Amazon Redshift as "Fast, fully managed, petabyte-scale data warehouse service". Redshift makes it simple and cost-effective to efficiently analyze all your data using your existing business intelligence tools. It is optimized for datasets ranging from a few hundred gigabytes to a petabyte or more and costs less than $1,000 per terabyte per year, a tenth the cost of most traditional data warehousing solutions. On the other hand, Amazon Redshift Spectrum is detailed as "Exabyte-Scale In-Place Queries of S3 Data". With Redshift Spectrum, you can extend the analytic power of Amazon Redshift beyond data stored on local disks in your data warehouse to query vast amounts of unstructured data in your Amazon S3 “data lake” -- without having to load or transform any data.
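To make the distinction concrete, the sketch below (not from the original page) shows roughly how a Spectrum query looks from a client's point of view: an external schema is registered against a data catalog, and S3-backed tables are then joined against tables that live on the cluster. It uses psycopg2 because Redshift speaks the PostgreSQL wire protocol; the cluster endpoint, credentials, IAM role, schema and table names are all placeholders.

```python
import psycopg2  # Redshift is wire-compatible with PostgreSQL

# Placeholder endpoint, credentials, IAM role and table names.
conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="analyst",
    password="REPLACE_ME",
)
conn.autocommit = True  # run the external DDL outside an explicit transaction

cur = conn.cursor()

# Register a data-catalog database as an external (Spectrum) schema.
cur.execute("""
    CREATE EXTERNAL SCHEMA IF NOT EXISTS spectrum
    FROM DATA CATALOG DATABASE 'spectrum_db'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-spectrum-role'
    CREATE EXTERNAL DATABASE IF NOT EXISTS
""")

# Query files sitting in S3 (spectrum.clickstream) and join them
# against a table stored on the cluster itself.
cur.execute("""
    SELECT s.event_date, COUNT(*) AS events
    FROM spectrum.clickstream s
    JOIN public.customers c ON c.id = s.customer_id
    GROUP BY s.event_date
    ORDER BY s.event_date
""")
print(cur.fetchall())

cur.close()
conn.close()
```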

Amazon Redshift can be classified as a tool in the "Big Data as a Service" category, while Amazon Redshift Spectrum is grouped under "Big Data Tools".

Lyft, Coursera, and 9GAG are some of the popular companies that use Amazon Redshift, whereas Amazon Redshift Spectrum is used by VSCO, CommonBond, and intermix.io. Amazon Redshift has broader adoption, being mentioned in 270 company stacks and 68 developer stacks, compared to Amazon Redshift Spectrum, which is listed in 5 company stacks and 4 developer stacks.

Advice on Amazon Redshift and Amazon Redshift Spectrum

We need to perform ETL from several databases into a data warehouse or data lake. We want to

  • keep raw and transformed data available to users so they can draft their own queries efficiently
  • give users custom permissions and SSO
  • move between open-source on-premises development and cloud-based production environments

We want to use only inexpensive Amazon EC2 instances, on a medium-sized data set (16 GB to 32 GB) feeding into Tableau Server or Power BI for reporting and data analysis.

Replies (3)

You could also use AWS Lambda with a CloudWatch Events schedule if you know when the function should be triggered. The benefit is that you can use any language and the respective database client.
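As a rough illustration of that suggestion, here is a minimal Python Lambda handler that a scheduled CloudWatch Events/EventBridge rule could invoke to run a transform step inside Redshift. The environment variables, table names and SQL are placeholders, and psycopg2 would need to be packaged with the function (for example as a Lambda layer).

```python
import os
import psycopg2  # package with the function, e.g. as a Lambda layer

def handler(event, context):
    """Invoked by a CloudWatch Events / EventBridge schedule such as rate(1 hour)."""
    conn = psycopg2.connect(
        host=os.environ["REDSHIFT_HOST"],          # placeholder configuration
        port=int(os.environ.get("REDSHIFT_PORT", 5439)),
        dbname=os.environ["REDSHIFT_DB"],
        user=os.environ["REDSHIFT_USER"],
        password=os.environ["REDSHIFT_PASSWORD"],
    )
    try:
        with conn, conn.cursor() as cur:
            # Example transform: rebuild a reporting table from raw staging data.
            cur.execute("TRUNCATE reporting.daily_orders")
            cur.execute("""
                INSERT INTO reporting.daily_orders (order_date, orders, revenue)
                SELECT order_date, COUNT(*), SUM(amount)
                FROM staging.orders
                GROUP BY order_date
            """)
    finally:
        conn.close()
    return {"status": "ok"}
```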

But if you need to orchestrate multiple ETLs, it makes sense to use Apache Airflow. This requires Python knowledge.
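For the Airflow route, a skeleton DAG might look like the following; the DAG id, schedule and task callables are placeholders, and the extract/transform bodies would hold the actual database work.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator  # Airflow 2.x import path


def extract_and_load():
    # Placeholder: pull from the source databases and COPY/INSERT into the warehouse.
    pass


def transform():
    # Placeholder: run SQL transforms against the loaded data.
    pass


with DAG(
    dag_id="nightly_etl",                # hypothetical DAG name
    start_date=datetime(2021, 1, 1),
    schedule_interval="0 2 * * *",       # run nightly at 02:00
    catchup=False,
) as dag:
    load = PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
    shape = PythonOperator(task_id="transform", python_callable=transform)
    load >> shape
```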

Recommends: Airflow

Though we have always built something custom, Apache Airflow (https://airflow.apache.org/) stood out as a key contender/alternative among open-source options. On the commercial side, Amazon Redshift combined with Amazon Kinesis (for complex manipulations) is great for BI, though Redshift as such is expensive.
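One common way to wire up the Kinesis-plus-Redshift combination mentioned above is a Kinesis Data Firehose delivery stream that stages records in S3 and COPYs them into a Redshift table. The boto3 sketch below is only an assumption about how such a stream might be configured; the stream name, ARNs, JDBC URL, credentials and table are all placeholders.

```python
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

# Hypothetical delivery stream: records are buffered to S3 and COPYed into Redshift.
firehose.create_delivery_stream(
    DeliveryStreamName="clickstream-to-redshift",   # placeholder name
    DeliveryStreamType="DirectPut",
    RedshiftDestinationConfiguration={
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",  # placeholder
        "ClusterJDBCURL": "jdbc:redshift://example-cluster.abc123"
                          ".us-east-1.redshift.amazonaws.com:5439/analytics",
        "CopyCommand": {
            "DataTableName": "public.clickstream_events",
            "CopyOptions": "json 'auto' gzip",
        },
        "Username": "etl_user",      # placeholder credentials
        "Password": "REPLACE_ME",
        "S3Configuration": {
            "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
            "BucketARN": "arn:aws:s3:::example-staging-bucket",
            "CompressionFormat": "GZIP",
        },
    },
)
```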


You may want to look into a data virtualization product called Conduit. It connects to disparate data sources in AWS, on-prem, Azure, and GCP, and exposes them as a single unified Spark SQL view to Power BI (DirectQuery) or Tableau. It allows auto-query and caching policies to enhance query speed and experience, has a GPU query engine with optimized Spark as a fallback, and can be deployed on your AWS VMs or on-prem, scaling up and out. It sounds like the ideal solution for your needs.

Pros of Amazon Redshift
  • Data Warehousing (37)
  • Scalable (27)
  • SQL (17)
  • Backed by Amazon (14)
  • Encryption (5)
  • Cheap and reliable (1)
  • Isolation (1)
  • Best Cloud DW Performance (1)
  • Fast columnar storage (1)
  • Good Performance (1)
  • Great Documentation (1)
  • Economical (1)


What is Amazon Redshift?

Amazon Redshift is a fast, fully managed, petabyte-scale data warehouse service. It is optimized for data sets ranging from a few hundred gigabytes to a petabyte or more and costs less than $1,000 per terabyte per year, a tenth the cost of most traditional data warehousing solutions.

What is Amazon Redshift Spectrum?

With Redshift Spectrum, you can extend the analytic power of Amazon Redshift beyond data stored on local disks in your data warehouse to query vast amounts of unstructured data in your Amazon S3 “data lake” -- without having to load or transform any data.


What are some alternatives to Amazon Redshift and Amazon Redshift Spectrum?
Google BigQuery
Run super-fast, SQL-like queries against terabytes of data in seconds, using the processing power of Google's infrastructure. Load data with ease. Bulk load your data using Google Cloud Storage or stream it in. Easy access. Access BigQuery by using a browser tool, a command-line tool, or by making calls to the BigQuery REST API with client libraries such as Java, PHP or Python.
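As a point of comparison, querying BigQuery from one of those client libraries takes only a few lines. The sketch below uses the official Python client against a public sample table and assumes default application credentials are configured; it is an illustration, not part of the original description.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Assumes default application credentials (e.g. GOOGLE_APPLICATION_CREDENTIALS) are set up.
client = bigquery.Client()

query = """
    SELECT word, SUM(word_count) AS total
    FROM `bigquery-public-data.samples.shakespeare`
    GROUP BY word
    ORDER BY total DESC
    LIMIT 10
"""

# client.query() submits a job; iterating over it waits for and streams the results.
for row in client.query(query):
    print(row.word, row.total)
```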
Amazon Athena
Amazon Athena is an interactive query service that makes it easy to analyze data in Amazon S3 using standard SQL. Athena is serverless, so there is no infrastructure to manage, and you pay only for the queries that you run.
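Since Athena is the closest serverless counterpart to Redshift Spectrum, here is a quick boto3 sketch of firing a query and collecting the result; the database, table and results bucket are placeholders.

```python
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Placeholder database, table and results bucket.
execution = athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) AS hits FROM web_logs GROUP BY status",
    QueryExecutionContext={"Database": "default"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query reaches a terminal state, then fetch the rows.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
print(rows)
```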
Amazon DynamoDB
With DynamoDB, you can offload the administrative burden of operating and scaling a highly available distributed database cluster, while paying a low price for only what you use.
Hadoop
The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage.
Microsoft Azure
Azure is an open and flexible cloud platform that enables you to quickly build, deploy and manage applications across a global network of Microsoft-managed datacenters. You can build applications using any language, tool or framework. And you can integrate your public cloud applications with your existing IT environment.