Manage tens to billions of objects at scale
S3 Batch Operations is an Amazon S3 data management feature that lets you manage billions of objects at scale with just a few clicks in the Amazon S3 Management Console or a single API request. With this feature, you can change object metadata and properties, or perform other storage management tasks such as copying or replicating objects between buckets, replacing object tag sets, modifying access controls, and restoring archived objects from S3 Glacier, all without spending months developing custom applications to perform these tasks.
S3 Batch Operations
S3 Batch Operations is a managed solution for performing storage actions like copying and tagging objects at scale, whether for one-time tasks or for recurring, batch workloads. S3 Batch Operations can perform actions across billions of objects and petabytes of data with a single request. To perform work in S3 Batch Operations, you create a job. The job consists of the list of objects, the action to perform, and the set of parameters you specify for that type of operation. You can create and run multiple jobs at a time in S3 Batch Operations, or use job priorities as needed to define the precedence of each job and ensure the most critical work happens first. S3 Batch Operations also manages retries, tracks progress, sends completion notifications, generates reports, and delivers events to AWS CloudTrail for all changes made and tasks executed.
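To make the anatomy of a job concrete, the sketch below assembles the request parameters for a tagging job in the shape that the S3 Control `CreateJob` API expects: a manifest (the list of objects), an operation (the action and its parameters), a priority, an IAM role, and a completion report. All account IDs, ARNs, and bucket names here are illustrative placeholders; in a real workflow you would pass the resulting dict to boto3's `s3control.create_job(**request)`.

```python
import uuid

def build_tagging_job_request(account_id, role_arn, manifest_arn,
                              manifest_etag, report_bucket_arn):
    """Assemble parameters for a hypothetical S3 Batch Operations tagging job."""
    return {
        "AccountId": account_id,
        "ConfirmationRequired": True,             # require manual confirmation before the job runs
        "ClientRequestToken": str(uuid.uuid4()),  # idempotency token for the request
        "Priority": 10,                           # higher numbers take precedence over lower ones
        "RoleArn": role_arn,                      # IAM role S3 assumes to perform each task
        # The action applied to every object listed in the manifest:
        "Operation": {
            "S3PutObjectTagging": {
                "TagSet": [{"Key": "status", "Value": "archived"}]
            }
        },
        # The list of objects to act on, supplied as a CSV manifest stored in S3:
        "Manifest": {
            "Spec": {
                "Format": "S3BatchOperations_CSV_20180820",
                "Fields": ["Bucket", "Key"],
            },
            "Location": {"ObjectArn": manifest_arn, "ETag": manifest_etag},
        },
        # A completion report listing the outcome of every task:
        "Report": {
            "Bucket": report_bucket_arn,
            "Format": "Report_CSV_20180820",
            "Enabled": True,
            "Prefix": "batch-reports",
            "ReportScope": "AllTasks",
        },
    }

request = build_tagging_job_request(
    "111122223333",
    "arn:aws:iam::111122223333:role/batch-ops-role",
    "arn:aws:s3:::example-bucket/manifest.csv",
    "example-manifest-etag",
    "arn:aws:s3:::example-report-bucket",
)
print(sorted(request))
```

Building the parameters in a helper like this keeps the manifest, operation, and report definitions reviewable before any job is actually submitted.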
S3 Batch Operations complements any event-driven architecture you may be operating today. For new objects, using S3 events and Lambda functions is great for converting file types, creating thumbnails, performing data scans, and carrying out other operations. For example, customers use S3 events and Lambda functions to create smaller sized, low resolution versions of raw photographs when images are first uploaded to S3. S3 Batch Operations complements these existing event-driven workflows by providing a simple mechanism for performing the same actions across your existing objects as well.
How it works: S3 Batch Operations
S3 Batch Operations tutorial
Customers
-
Teespring
Teespring was founded in 2011 and enables users to create and sell custom on-demand products online. Because every piece of custom merchandise requires multiple assets, Teespring stores petabytes of data in Amazon S3.
-
Capital One
Capital One is a bank founded at the intersection of finance and technology and one of America’s most recognized brands. Capital One used Amazon S3 Batch Operations to copy data between two AWS regions to increase their data’s redundancy and to standardize their data footprint between those two locations.
-
ePlus
ePlus, an AWS Advanced Consulting Partner, works with customers to optimize their IT environments and uses solutions like S3 Batch Operations to save clients time and money.