Amazon Kinesis can collect and process hundreds of gigabytes of data per second from hundreds of thousands of sources, allowing you to easily write applications that process information in real time from sources such as website click-streams, marketing and financial information, manufacturing instrumentation, social media, and operational logs and metering data.

The compared platform is a data streaming platform based on Apache Kafka: a full-scale streaming platform capable of not only publish-and-subscribe, but also the storage and processing of data within the stream.
Amazon Kinesis:
- Real-time processing: collect and analyze information in real time, so you can answer questions about the current state of your data, from inventory levels to stock trade frequencies, rather than waiting for an out-of-date report.
- Easy to use: create a new stream, set the throughput requirements, and start streaming data quickly and easily. Amazon Kinesis automatically provisions and manages the storage required to reliably and durably collect your data stream.
- High throughput, elastic: Amazon Kinesis seamlessly scales to match the data throughput rate and volume of your data, from megabytes to terabytes per hour, and will scale up or down based on your needs.
- Integration with Amazon S3, Amazon Redshift, and Amazon DynamoDB: reliably collect, process, and transform all of your data in real time before delivering it to the data stores of your choice, where it can be used by existing or new applications. Connectors enable integration with Amazon S3, Amazon Redshift, and Amazon DynamoDB.
- Build Kinesis applications: client libraries enable the design and operation of real-time data processing applications. Add the Amazon Kinesis Client Library to your Java application and it will be notified when new data is available for processing. (A minimal producer sketch follows these lists.)
- Low cost: Amazon Kinesis is cost-efficient for workloads of any scale. You pay as you go, only for the resources you use, and can get started by provisioning low-throughput streams at a low hourly rate.

Compared platform:
- Reliable
- High-performance stream data platform
- Manages and organizes data from different sources
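To make the producer side concrete, here is a minimal sketch using the boto3 SDK; the stream name, region, and event fields are assumptions for illustration, and the stream is assumed to already exist.

```python
# Minimal Kinesis producer sketch using boto3 (stream name, region, and
# event fields are assumptions for illustration; the stream must already exist).
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

event = {"user_id": "42", "action": "page_view", "path": "/pricing"}

# Records with the same partition key are routed to the same shard,
# which preserves their relative ordering.
kinesis.put_record(
    StreamName="clickstream",
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=event["user_id"],
)
```

Consumers typically use the Kinesis Client Library mentioned above, or read shards directly with get_shard_iterator and get_records.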
Statistics (Amazon Kinesis | compared platform)
Stacks: 730 | 250
Followers: 604 | 239
Votes: 9 | 14
Integrations
No integrations available.

Kafka is a distributed, partitioned, replicated commit log service. It provides the functionality of a messaging system, but with a unique design.
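As a rough illustration of the publish-and-subscribe model, here is a minimal sketch with the kafka-python client; the broker address and topic name are assumptions.

```python
# Minimal Kafka produce/consume sketch using kafka-python
# (broker address and topic name are assumptions for illustration).
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("events", b"hello, kafka")
producer.flush()

# Consumers read the partitioned, replicated log at their own pace.
consumer = KafkaConsumer(
    "events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating when no new messages arrive
)
for message in consumer:
    print(message.offset, message.value)
```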

RabbitMQ gives your applications a common platform to send and receive messages, and your messages a safe place to live until received.
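A minimal publish sketch with the pika client, assuming a broker on localhost and a hypothetical task_queue queue:

```python
# Minimal RabbitMQ publish sketch using pika
# (queue name and local broker are assumptions for illustration).
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

# Declaring the queue is idempotent; the message lives here
# until a consumer receives and acknowledges it.
channel.queue_declare(queue="task_queue", durable=True)
channel.basic_publish(
    exchange="",
    routing_key="task_queue",
    body=b"hello, rabbit",
    properties=pika.BasicProperties(delivery_mode=2),  # persist the message
)
connection.close()
```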

Celery is an asynchronous task queue/job queue based on distributed message passing. It is focused on real-time operation, but supports scheduling as well.
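A minimal sketch of a Celery task module, assuming a Redis broker on localhost (any supported broker, such as RabbitMQ, works the same way):

```python
# Minimal Celery task sketch (tasks.py); the Redis broker URL is an
# assumption for illustration.
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def add(x, y):
    # Runs asynchronously on a worker process.
    return x + y
```

A worker started with `celery -A tasks worker` then executes calls queued with `add.delay(2, 3)`.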

Amazon SQS lets you transmit any volume of data, at any level of throughput, without losing messages or requiring other services to be always available. With SQS, you can offload the administrative burden of operating and scaling a highly available messaging cluster, while paying a low price for only what you use.
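A minimal send/receive sketch with boto3; the queue name and region are assumptions for illustration:

```python
# Minimal SQS send/receive sketch using boto3
# (queue name and region are assumptions for illustration).
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
queue_url = sqs.create_queue(QueueName="demo-queue")["QueueUrl"]

sqs.send_message(QueueUrl=queue_url, MessageBody="hello, sqs")

# Messages stay in the queue until a consumer deletes them after processing.
response = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1)
for message in response.get("Messages", []):
    print(message["Body"])
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])
```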

NSQ is a realtime distributed messaging platform designed to operate at scale, handling billions of messages per day. It promotes distributed and decentralized topologies without single points of failure, enabling fault tolerance and high availability coupled with a reliable message delivery guarantee.
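As one rough illustration, messages can be published to a local nsqd over its HTTP API; the address and topic name below are assumptions:

```python
# Minimal NSQ publish sketch via nsqd's HTTP API using requests
# (the local nsqd address and topic name are assumptions for illustration).
import requests

# nsqd exposes a /pub endpoint; publishing to a topic creates it on demand.
resp = requests.post(
    "http://127.0.0.1:4151/pub",
    params={"topic": "events"},
    data=b"hello, nsq",
)
resp.raise_for_status()
```

Consumers would typically use a client library such as pynsq and discover nsqd nodes through nsqlookupd.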

Apache ActiveMQ is fast, supports many cross-language clients and protocols, and comes with easy-to-use Enterprise Integration Patterns and many advanced features, while fully supporting JMS 1.1 and J2EE 1.4. Apache ActiveMQ is released under the Apache 2.0 License.
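A minimal sketch that talks to ActiveMQ over the STOMP protocol with the stomp.py library; the broker address, port, and credentials are assumptions for illustration:

```python
# Minimal ActiveMQ sketch over the STOMP protocol using stomp.py
# (broker address, port, and credentials are assumptions for illustration).
import stomp

conn = stomp.Connection([("localhost", 61613)])
conn.connect("admin", "admin", wait=True)

# JMS queues and topics map onto STOMP destinations.
conn.send(destination="/queue/demo", body="hello, activemq")
conn.disconnect()
```

Java applications would more commonly use the JMS API directly.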

The 0MQ lightweight messaging kernel is a library which extends the standard socket interfaces with features traditionally provided by specialised messaging middleware products. 0MQ sockets provide an abstraction of asynchronous message queues, multiple messaging patterns, message filtering (subscriptions), seamless access to multiple transport protocols and more.
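A minimal publish/subscribe sketch with pyzmq; the TCP endpoint and topic prefix are assumptions:

```python
# Minimal 0MQ publish/subscribe sketch using pyzmq
# (the TCP endpoint and topic prefix are assumptions for illustration).
import time
import zmq

context = zmq.Context()

pub = context.socket(zmq.PUB)
pub.bind("tcp://*:5556")

sub = context.socket(zmq.SUB)
sub.connect("tcp://localhost:5556")
sub.setsockopt_string(zmq.SUBSCRIBE, "weather")  # prefix-based filtering

# Give the subscription time to propagate: PUB/SUB drops messages sent
# before the subscriber is fully connected (the "slow joiner" issue).
time.sleep(0.2)

pub.send_string("weather sunny 21C")
print(sub.recv_string())
```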

Apache NiFi is an easy-to-use, powerful, and reliable system to process and distribute data. It supports powerful and scalable directed graphs of data routing, transformation, and system mediation logic.
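NiFi flows are normally assembled in its web UI rather than in code, but the server also exposes a REST API. The sketch below is only an assumption-laden illustration: it queries the system-diagnostics endpoint of an unsecured NiFi instance on localhost, and paths can vary between versions.

```python
# Rough sketch: query a local, unsecured NiFi instance's REST API with requests
# (host, port, and endpoint path are assumptions and may differ by version).
import requests

resp = requests.get("http://localhost:8080/nifi-api/system-diagnostics")
resp.raise_for_status()
print(resp.json())
```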

Gearman allows you to do work in parallel, to load balance processing, and to call functions between languages. It can be used in a variety of applications, from high-availability web sites to the transport of database replication events.
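A minimal worker/client sketch using the classic python-gearman client library; the job-server address and the reverse function are assumptions for illustration:

```python
# Minimal Gearman worker/client sketch using the classic python-gearman library
# (job-server address and function name are assumptions for illustration).
import gearman

# Worker side: register a function and (when uncommented) wait for jobs.
def reverse(gearman_worker, gearman_job):
    return gearman_job.data[::-1]

worker = gearman.GearmanWorker(["localhost:4730"])
worker.register_task("reverse", reverse)
# worker.work()  # blocks, processing jobs as they arrive

# Client side: submit a job by function name and read the result.
client = gearman.GearmanClient(["localhost:4730"])
result = client.submit_job("reverse", "hello, gearman")
print(result.result)
```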

A highly scalable and effortless data streaming platform, made to enable developers and data teams to collaborate and build real-time and streaming apps fast.