AWS Import/Export: AWS Import/Export supports importing and exporting data into and out of Amazon S3 buckets. For significant data sets, AWS Import/Export is often faster than Internet transfer and more cost-effective than upgrading your connectivity.

AWS Storage Gateway: The AWS Storage Gateway is a service connecting an on-premises software appliance with cloud-based storage. Once the Storage Gateway’s software appliance is installed on a local host, you can mount Storage Gateway volumes on your on-premises application servers as iSCSI devices, enabling a wide variety of systems and applications to make use of them. Data written to these volumes is maintained on your on-premises storage hardware while being asynchronously backed up to AWS, where it is stored in Amazon Glacier or in Amazon S3 in the form of Amazon EBS snapshots. Snapshots are encrypted, so customers do not have to worry about encrypting sensitive data themselves. When customers need to retrieve data, they can restore snapshots locally or create Amazon EBS volumes from snapshots for use with applications running in Amazon EC2. The gateway provides low-latency performance by keeping frequently accessed data on-premises while storing all of your data securely encrypted in AWS.
AWS Import/Export use cases:
- Data Migration – If you have data you need to upload into the AWS cloud for the first time, AWS Import/Export is often much faster than transferring that data via the Internet.
- Content Distribution – Send data to your customers on portable storage devices.
- Direct Data Interchange – If you regularly receive content on portable storage devices from your business associates, you can have them send it directly to AWS for import into Amazon S3 or Amazon EBS.
- Offsite Backup – Send full or incremental backups to Amazon S3 and Amazon Glacier for reliable and redundant offsite storage.
- Disaster Recovery – In the event you need to quickly retrieve a large backup stored in Amazon S3, use AWS Import/Export to transfer the data to a portable storage device and deliver it to your site.

AWS Storage Gateway features:
- Gateway-Cached Volumes – Gateway-Cached volumes allow you to use Amazon S3 for your primary data while retaining some portion of it locally in a cache for frequently accessed data.
- Gateway-Stored Volumes – Gateway-Stored volumes store your primary data locally while asynchronously backing up that data to AWS.
- Data Snapshots – Gateway-Cached volumes and Gateway-Stored volumes provide the ability to create and store point-in-time snapshots of your storage volumes in Amazon S3 (see the sketch after this list).
- Gateway-VTL – Gateway-VTL provides you with a cost-effective, scalable, and durable virtual tape infrastructure that eliminates the challenges associated with owning and operating an on-premises physical tape infrastructure.
- Secure – The AWS Storage Gateway securely transfers your data to AWS over SSL and stores data encrypted at rest in Amazon S3 and Amazon Glacier using Advanced Encryption Standard (AES) 256, a secure symmetric-key encryption standard using 256-bit encryption keys.
- Durably backed by Amazon S3 and Amazon Glacier – The AWS Storage Gateway durably stores your on-premises application data by uploading it to Amazon S3 and Amazon Glacier. Amazon S3 and Amazon Glacier redundantly store data in multiple facilities and on multiple devices within each facility. They also perform regular, systematic data integrity checks and are built to be automatically self-healing.
- Compatible – There is no need to re-architect your on-premises applications. Gateway-Cached volumes and Gateway-Stored volumes expose a standard iSCSI block disk device interface, and Gateway-VTL presents a standard iSCSI virtual tape library interface.
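As a rough illustration of the Data Snapshots feature above, the sketch below uses boto3 (the AWS SDK for Python) to list the iSCSI volumes on a gateway and trigger a point-in-time snapshot of one of them. The gateway ARN and the snapshot description are placeholders, and the snippet assumes an already-activated gateway with at least one volume; treat it as a sketch rather than a complete setup.

```python
import boto3

# Storage Gateway client (a sketch; assumes AWS credentials and an
# already-activated gateway with at least one volume).
sgw = boto3.client("storagegateway", region_name="us-east-1")

# Hypothetical gateway ARN -- replace with the ARN of your own gateway,
# e.g. one returned by sgw.list_gateways().
gateway_arn = "arn:aws:storagegateway:us-east-1:123456789012:gateway/sgw-EXAMPLE"

# List the iSCSI volumes exposed by this gateway.
volumes = sgw.list_volumes(GatewayARN=gateway_arn)["VolumeInfos"]

# Take a point-in-time snapshot of the first volume; the snapshot is
# stored in Amazon S3 as an EBS snapshot and can later be restored
# locally or turned into an EBS volume for use with EC2.
if volumes:
    response = sgw.create_snapshot(
        VolumeARN=volumes[0]["VolumeARN"],
        SnapshotDescription="Nightly point-in-time snapshot",
    )
    print("Started snapshot:", response["SnapshotId"])
```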
Statistics | AWS Import/Export | AWS Storage Gateway
Stacks     | 5                 | 17
Followers  | 31                | 59
Votes      | 0                 | 0

In order to keep costs low, Amazon Glacier is optimized for data that is infrequently accessed and for which retrieval times of several hours are suitable. With Amazon Glacier, customers can reliably store large or small amounts of data for as little as $0.01 per gigabyte per month, a significant savings compared to on-premises solutions.
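To make the archive-and-retrieve workflow concrete, here is a minimal boto3 sketch; the vault name and file path are made-up placeholders. It uploads an archive to a Glacier vault and then initiates an archive-retrieval job, which typically completes hours later, matching the retrieval characteristics described above.

```python
import boto3

glacier = boto3.client("glacier", region_name="us-east-1")

# Hypothetical vault name and local file -- placeholders for illustration.
vault = "my-archive-vault"
glacier.create_vault(vaultName=vault)

# Upload an archive; keep the returned archiveId, since Glacier has no
# user-friendly listing of archive contents.
with open("backup.tar.gz", "rb") as f:
    archive = glacier.upload_archive(
        vaultName=vault,
        archiveDescription="Monthly backup",
        body=f,
    )
archive_id = archive["archiveId"]

# Retrieval is asynchronous: initiate a job, then poll (or use SNS
# notifications) and download the job output once it completes,
# typically several hours later.
job = glacier.initiate_job(
    vaultName=vault,
    jobParameters={"Type": "archive-retrieval", "ArchiveId": archive_id},
)
print("Retrieval job started:", job["jobId"])
```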

AWS Data Pipeline is a web service that provides a simple management system for data-driven workflows. Using AWS Data Pipeline, you define a pipeline composed of the “data sources” that contain your data, the “activities” or business logic such as EMR jobs or SQL queries, and the “schedule” on which your business logic executes. For example, you could define a job that, every hour, runs an Amazon Elastic MapReduce (Amazon EMR)–based analysis on that hour’s Amazon Simple Storage Service (Amazon S3) log data, loads the results into a relational database for future lookup, and then automatically sends you a daily summary email.
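As a sketch of what such a definition looks like in practice, the snippet below uses boto3 to create a pipeline and register a minimal hourly schedule object. A real pipeline for the example above would also include S3 data nodes, an EmrActivity, and a notification action, which are omitted here; the object and field names follow the Data Pipeline object model, but the values shown are illustrative placeholders.

```python
import boto3

dp = boto3.client("datapipeline", region_name="us-east-1")

# Create an empty pipeline shell; uniqueId guards against accidental
# duplicate creation if the call is retried.
pipeline = dp.create_pipeline(
    name="hourly-log-analysis", uniqueId="hourly-log-analysis-v1"
)
pipeline_id = pipeline["pipelineId"]

# Minimal definition: a default configuration object plus an hourly
# schedule. Data nodes and activities (e.g. an EmrActivity reading the
# hour's S3 logs) would be appended to this list in a full pipeline.
objects = [
    {
        "id": "Default",
        "name": "Default",
        "fields": [
            {"key": "scheduleType", "stringValue": "cron"},
            {"key": "schedule", "refValue": "HourlySchedule"},
        ],
    },
    {
        "id": "HourlySchedule",
        "name": "HourlySchedule",
        "fields": [
            {"key": "type", "stringValue": "Schedule"},
            {"key": "period", "stringValue": "1 hour"},
            {"key": "startAt", "stringValue": "FIRST_ACTIVATION_DATE_TIME"},
        ],
    },
]

dp.put_pipeline_definition(pipelineId=pipeline_id, pipelineObjects=objects)
dp.activate_pipeline(pipelineId=pipeline_id)
```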

AWS Snowball Edge is a 100TB data transfer device with on-board storage and compute capabilities. You can use Snowball Edge to move large amounts of data into and out of AWS, as a temporary storage tier for large local datasets, or to support local workloads in remote or offline locations.
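For the data-import use case, a Snowball Edge transfer is typically kicked off by creating an import job through the AWS Snowball API; the boto3 sketch below shows the general shape of such a request. The bucket ARN, address ID, and IAM role ARN are placeholders, not real resources.

```python
import boto3

snowball = boto3.client("snowball", region_name="us-east-1")

# All identifiers below are hypothetical placeholders: create a shipping
# address with snowball.create_address(...) and substitute your own
# bucket and IAM role ARNs.
job = snowball.create_job(
    JobType="IMPORT",
    SnowballType="EDGE",
    SnowballCapacityPreference="T100",
    Resources={"S3Resources": [{"BucketArn": "arn:aws:s3:::example-import-bucket"}]},
    AddressId="ADID00000000-0000-0000-0000-000000000000",
    RoleARN="arn:aws:iam::123456789012:role/SnowballImportRole",
    ShippingOption="SECOND_DAY",
    Description="Bulk import of on-premises archive data",
)
print("Created Snowball Edge import job:", job["JobId"])
```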

It is an elegant and simple HTTP library for Python, built for human beings. It allows you to send HTTP/1.1 requests extremely easily. There’s no need to manually add query strings to your URLs, or to form-encode your POST data.
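For instance, query parameters and form bodies are passed as plain dictionaries and encoded for you. A small sketch (httpbin.org is just a public echo service used here for illustration):

```python
import requests

# The query string is built from the params dict -- no manual URL encoding.
r = requests.get("https://httpbin.org/get", params={"q": "storage gateway", "page": 2})
print(r.status_code)      # 200
print(r.json()["args"])   # {'q': 'storage gateway', 'page': '2'}

# Form data is form-encoded automatically from the data dict.
r = requests.post("https://httpbin.org/post", data={"user": "demo", "action": "backup"})
print(r.json()["form"])   # {'action': 'backup', 'user': 'demo'}
```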

It is a .NET library that can read/write Office formats without Microsoft Office installed. No COM+, no interop.

Its focus is on performance; specifically, end-user perceived latency, and network and server resource usage.

It is an open-source bulk data loader that helps transfer data between various databases, storage systems, file formats, and cloud services.

It is a backup program that is fast, efficient and secure. It uses cryptography to guarantee the confidentiality and integrity of your data.

It is industry-leading Backup & Replication software. It delivers availability for all your cloud, virtual and physical workloads. Through a simple-by-design management console, you can easily achieve fast, flexible and reliable backup, recovery and replication for all your applications and data.

It is a deduplicating backup program that provides an efficient and secure way to back up data. The deduplication technique used makes it suitable for daily backups, since only changes are stored. The authenticated encryption technique makes it suitable for backups to targets that are not fully trusted.