How can I copy my Amazon EBS snapshot data to Amazon S3 and create EBS volumes for custom data in S3?


I want to copy an Amazon Elastic Block Store (Amazon EBS) snapshot to my Amazon Simple Storage Service (Amazon S3) bucket. I also want to create Amazon EBS volumes from data that's stored in my S3 bucket.

Short description

When you create an EBS snapshot, it's automatically stored in an Amazon S3 bucket that AWS manages. You can copy snapshots within the same AWS Region, or from one Region to another. However, you can't copy snapshots to S3 buckets that you manage.

To store snapshots that you infrequently access, consider using Amazon EBS Snapshots Archive. However, if you still want to use Amazon S3 to store your snapshots, then you can use the following workaround.

Resolution

Note: If you receive errors when running AWS Command Line Interface (AWS CLI) commands, make sure that you're using the most recent AWS CLI version.

To copy the contents of your snapshot to your S3 bucket, create a volume from the snapshot. Mount the volume to an Amazon Elastic Compute Cloud (Amazon EC2) Linux instance. Then, use the AWS CLI or S3 APIs to copy the data to your S3 bucket.

To copy the contents of your EBS snapshots to an Amazon S3 bucket, follow these steps:

1.    Create an EBS volume from the snapshot.
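For example, a minimal AWS CLI sketch (the snapshot ID, Availability Zone, and volume type are placeholders; replace them with your own values):

# Placeholder values; replace the snapshot ID and Availability Zone with your own.
$ aws ec2 create-volume \
    --snapshot-id snap-0123456789abcdef0 \
    --availability-zone us-east-1a \
    --volume-type gp3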

2.    Launch an EC2 Linux instance in the same Availability Zone as the volume that you created.
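For example, with the AWS CLI (a sketch; the AMI ID, instance type, key pair, and subnet are placeholders, and the subnet must be in the same Availability Zone as the volume):

# Placeholder values; choose a subnet in the same Availability Zone as the volume.
$ aws ec2 run-instances \
    --image-id ami-0123456789abcdef0 \
    --instance-type t3.micro \
    --key-name my-key-pair \
    --subnet-id subnet-0123456789abcdef0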

3.    Attach the volume to the instance.
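For example (placeholder volume and instance IDs; the device name that you specify here might appear under a different name inside the instance, as noted in step 7):

# Placeholder IDs; replace with your own values.
$ aws ec2 attach-volume \
    --volume-id vol-0123456789abcdef0 \
    --instance-id i-0123456789abcdef0 \
    --device /dev/sdf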

4.    Connect to your Linux instance.
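For example, over SSH (a sketch that assumes an Amazon Linux instance; the key pair and public DNS name are placeholders):

# Placeholder key pair and host name; replace with your own values.
$ ssh -i my-key-pair.pem ec2-user@ec2-198-51-100-1.compute-1.amazonaws.com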

5.    Install the AWS CLI on your Linux instance.
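On Amazon Linux AMIs, the AWS CLI is typically preinstalled. On other distributions, one way to install AWS CLI version 2 on an x86_64 instance is:

$ curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
$ unzip awscliv2.zip
$ sudo ./aws/install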

6.    Grant your Amazon EC2 instance access to your Amazon S3 bucket.
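You typically do this by attaching an IAM role (instance profile) whose policy allows access to the bucket, as described in How can I grant my Amazon EC2 instance access to an Amazon S3 bucket? For example, to attach an existing instance profile with the AWS CLI (the instance ID and profile name are placeholders; the role's policy must allow at least s3:PutObject on your bucket):

# Placeholder instance ID and profile name; replace with your own values.
$ aws ec2 associate-iam-instance-profile \
    --instance-id i-0123456789abcdef0 \
    --iam-instance-profile Name=my-instance-profile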

7.    Run the following command to mount the volume to your instance:

$ sudo mount /dev/xvdf /mnt

Note: The device (/dev/xvdf, in the preceding example) might be attached to the instance with a different device name. Use the lsblk command to view your available disk devices along with their mount points to determine the correct device names.
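For example, to list the block devices and confirm that the device contains a file system before you mount it:

$ lsblk
$ sudo file -s /dev/xvdf

If the output of file -s shows a partition table instead of a file system, then mount the partition (for example, /dev/xvdf1) instead of the whole device.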

8.    Install the pv package to monitor the progress during tar archive creation:

Amazon Linux and Red Hat Enterprise Linux (RHEL) distributions

$ sudo yum install pv

Note: Before you install the pv package for Amazon Linux and RHEL distributions, you must turn on the Extra Packages for Enterprise Linux (EPEL) repository. See How do I turn on the EPEL repository for my Amazon EC2 instance running CentOS, RHEL, or Amazon Linux?

Ubuntu and Debian based distributions

$ sudo apt install pv

9.    Run the following command to copy the EBS volume data to your S3 bucket:

$ tar c /mnt | pv -s $(($(du -sk /mnt | awk '{print $1}') * 1024)) | gzip | aws s3 cp - "s3://my-bucket/backup1.tar.gz"

Note: Replace my-bucket with your S3 bucket's name and backup1 with your file's name.

This command creates a compressed file from the /mnt directory and uploads the file to the S3 bucket named my-bucket.
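For large volumes, this streamed upload might fail with a multipart upload error (for example, "Part number must be an integer between 1 and 10000, inclusive") because the AWS CLI can't determine the object's size from standard input. In that case, consider adding the --expected-size option with the approximate size of the compressed archive, in bytes, so that the AWS CLI can choose a suitable part size. A sketch that assumes an archive of about 100 GiB:

# --expected-size is an estimate in bytes (about 100 GiB in this example).
$ tar c /mnt | pv -s $(($(du -sk /mnt | awk '{print $1}') * 1024)) | gzip | aws s3 cp - "s3://my-bucket/backup1.tar.gz" --expected-size 107374182400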

10.   Use the Amazon S3 console to confirm that the compressed file was uploaded to your S3 bucket.
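Or, confirm from the command line (my-bucket is a placeholder):

$ aws s3 ls s3://my-bucket/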

11.    Run the following command to unmount the volume:

$ sudo umount /mnt

12.   Detach the EBS volume from the Linux instance.
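For example (placeholder volume ID):

# Placeholder ID; replace with your own value.
$ aws ec2 detach-volume --volume-id vol-0123456789abcdef0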

13.   Delete the volume, and terminate your instance.
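For example (placeholder IDs):

# Placeholder IDs; replace with your own values.
$ aws ec2 delete-volume --volume-id vol-0123456789abcdef0
$ aws ec2 terminate-instances --instance-ids i-0123456789abcdef0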

Related information

Copy an Amazon EBS snapshot

How can I grant my Amazon EC2 instance access to an Amazon S3 bucket?

Comments

I believe there might be a missing step: configuring the AWS CLI to handle the multipart upload appropriately by setting the chunk sizes used. If you do the above with a large volume, it can result in "An error occurred (InvalidArgument) when calling the UploadPart operation: Part number must be an integer between 1 and 10000, inclusive".

replied a year ago

Thank you for your comment. We'll review and update the Knowledge Center article as needed.

AWS MODERATOR
replied a year ago

The command $ tar c /mnt | pv -s $(($(du -sk /mnt | awk '{print $1}') \* 1024)) | gzip | aws s3 cp - "s3://my-bucket/backup1.tar.gz" throws the error

tar: Removing leading `/' from member names
-bash: 19918940 \* 1024: syntax error: invalid arithmetic operator (error token is "\* 1024")

I believe the backslashes are not needed, as the command worked without them.

AWS SUPPORT ENGINEER
replied 5 months ago

Thank you for your comment. We'll review and update the Knowledge Center article as needed.

AWS MODERATOR
replied 5 months ago