Cloning (Saving and Restoring) a DynamoDB Table to Another AWS Account
By Hyuntaek Park
Senior full-stack engineer at Twigfarm
At Twigfarm, we created a new AWS account, and some of our DynamoDB tables had to be copied from the old account to the new one. I thought it would be a very simple process, since other database systems have dump and restore features. But it wasn't that simple.
I tried a few different approaches, but none of them was simple enough. Luckily, one of the AWS Solutions Architects pointed me to a solution that AWS had just released: https://aws.amazon.com/blogs/database/amazon-dynamodb-can-now-import-amazon-s3-data-into-a-new-table. You should read that article first to get a sense of how it works.
In this article, I will explain how I exported a DynamoDB table to an S3 bucket in a different AWS account, then created a new DynamoDB table from what was saved in S3.
Architecture
The architecture is simple. In this article, I will explain how to save and restore; some steps are done in the console and some through the AWS CLI.
Prerequisites
You need AWS CLI profiles for both the source and destination accounts. In this article, I use source-user and destination-user as the profile names. Please refer to https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-profiles.html.
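If you have not set the profiles up yet, a minimal sketch of creating both (each command prompts for an access key, secret key, default region, and output format):

aws configure --profile source-user
aws configure --profile destination-user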
DynamoDB table
I have a simple DynamoDB table with two items in the source account. Our final goal is to have a table with those same two items in the destination account.
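The screenshots of my table are not reproduced here. As a hypothetical stand-in for the commands that follow, assume a table named Music with partition key Artist and sort key SongTitle, holding two items:

# Hypothetical example table; substitute your own table name and key schema.
aws dynamodb create-table \
  --table-name Music \
  --attribute-definitions AttributeName=Artist,AttributeType=S AttributeName=SongTitle,AttributeType=S \
  --key-schema AttributeName=Artist,KeyType=HASH AttributeName=SongTitle,KeyType=RANGE \
  --billing-mode PAY_PER_REQUEST \
  --profile source-user

# The two items we want to end up with in the destination account.
aws dynamodb put-item --table-name Music \
  --item '{"Artist": {"S": "No One You Know"}, "SongTitle": {"S": "Call Me Today"}}' \
  --profile source-user
aws dynamodb put-item --table-name Music \
  --item '{"Artist": {"S": "Acme Band"}, "SongTitle": {"S": "Happy Day"}}' \
  --profile source-user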
Enable point-in-time recovery (PITR)
Choose the source DynamoDB table > Backups > Edit button under point-in-time recovery (PITR). Then enable the point-in-time recovery feature.
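If you prefer the CLI, the same setting can be enabled with update-continuous-backups (table name assumed from the hypothetical example above):

aws dynamodb update-continuous-backups \
  --table-name Music \
  --point-in-time-recovery-specification PointInTimeRecoveryEnabled=true \
  --profile source-user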
Destination S3 bucket
Log in to the destination AWS account. It is convenient to use a different web browser or open a new window in incognito mode, so you can stay signed in to both accounts at once.
Create a bucket as follows:
Bucket name: <YOUR_UNIQUE_BUCKET_NAME>
Choose ACLs enabled and Object writer for Object Ownership.
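The same bucket can also be created from the CLI; a sketch assuming the us-east-1 region (other regions additionally need --create-bucket-configuration LocationConstraint=<region>):

aws s3api create-bucket \
  --bucket <YOUR_UNIQUE_BUCKET_NAME> \
  --object-ownership ObjectWriter \
  --profile destination-user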
S3 bucket policy
Copy and paste the following JSON as the policy for your destination bucket. Be sure to replace SOURCE_ACCOUNT_NO, SOURCE_USER_NAME, and destination_bucket_name with your own values.
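The policy itself is not reproduced here; below is a sketch following the pattern from the AWS blog post above, which lets the source account's IAM user write the export objects into this bucket:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowDynamoDBExportFromSourceAccount",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::SOURCE_ACCOUNT_NO:user/SOURCE_USER_NAME"
      },
      "Action": [
        "s3:AbortMultipartUpload",
        "s3:PutObject",
        "s3:PutObjectAcl"
      ],
      "Resource": "arn:aws:s3:::destination_bucket_name/*"
    }
  ]
}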
Export table to Amazon S3 using Command Line Interface (CLI)
Enter the following command in your terminal.
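The exact command is not reproduced here; a sketch using export-table-to-point-in-time, assuming the hypothetical Music table, with REGION, the account numbers, and the bucket name as placeholders:

aws dynamodb export-table-to-point-in-time \
  --table-arn arn:aws:dynamodb:REGION:SOURCE_ACCOUNT_NO:table/Music \
  --s3-bucket <YOUR_UNIQUE_BUCKET_NAME> \
  --s3-bucket-owner DESTINATION_ACCOUNT_NO \
  --export-format DYNAMODB_JSON \
  --profile source-user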
If there is no error after submitting the command, you will see a new line under Exports to S3 in the console. Wait a few minutes; the status will change from Exporting to Completed. Now the exporting part is done.
Restore from S3
Now, in the destination account, go to DynamoDB > Imports from S3, then click the Import from S3 button.
Click the Browse S3 button and drill down through the folders until you see the data folder, then choose the file with the .json.gz extension.
Then you fill out the form to create a new DynamoDB table.
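If you would rather script this step than click through the console, the same import can be done with the import-table CLI command; a sketch assuming the hypothetical Music table schema from earlier:

aws dynamodb import-table \
  --s3-bucket-source S3Bucket=<YOUR_UNIQUE_BUCKET_NAME> \
  --input-format DYNAMODB_JSON \
  --input-compression-type GZIP \
  --table-creation-parameters '{
    "TableName": "Music",
    "AttributeDefinitions": [
      {"AttributeName": "Artist", "AttributeType": "S"},
      {"AttributeName": "SongTitle", "AttributeType": "S"}
    ],
    "KeySchema": [
      {"AttributeName": "Artist", "KeyType": "HASH"},
      {"AttributeName": "SongTitle", "KeyType": "RANGE"}
    ],
    "BillingMode": "PAY_PER_REQUEST"
  }' \
  --profile destination-user

Note that the import always creates a new table; it cannot load data into an existing one.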
Verification
Go to DynamoDB and check that the table items have been imported.
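You can also verify from the CLI (again assuming the hypothetical Music table):

aws dynamodb scan --table-name Music --profile destination-user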
If you see the expected table contents, congratulations! Import from S3 is a new feature; without it, you would have struggled with many more AWS services and permissions. With Import from S3, S3 and DynamoDB are the only services we need to use.
Thanks!