Migrating DigitalOcean Spaces to an Amazon S3 Bucket

Francesco Ciulla - Sep 9 '21 - Dev Community

DigitalOcean Spaces provides Amazon S3-compatible object storage with a simplified pricing model. However, you may at some point find that you need to move your storage off of Spaces and onto Amazon S3. In this post, I'll show how to use the tool Rclone to move your data from Spaces to S3 quickly and easily.

Original Article By Jay Allen

Spaces vs. Amazon S3

Built on the object storage system Ceph, Spaces provides a competitive storage alternative to S3. The base Spaces plan charges a flat $5/month for up to 250GiB of storage and up to 1TiB of data transfer out. That can represent a nice cost saving over Amazon S3, where only the first GiB of data transfer to the Internet is free. And since Spaces is fully S3-compatible, SDK code that works with S3 will work with a Spaces account. Spaces even offers a Content Delivery Network (CDN) at no additional cost.

However, there may be times when you need to bring your data in Spaces over to Amazon S3:

  • You have data security requirements that are well met by features such as AWS PrivateLink for S3
  • You need your data in more regions than DigitalOcean supports (five regional endpoints as opposed to AWS's 24), or need to store data in a region supported by AWS to comply with data protection laws
  • You find that transfer from S3 to other AWS features is faster than transfer from Spaces for your scenario

Whatever the reason, it would be great if you could migrate your data over en masse without having to roll your own script.

Moving from Spaces to S3 with Rclone

Fortunately, the Rclone tool makes this easy. Rclone is a self-described Swiss army knife for storage that supports over 40 different cloud storage products and storage protocols.

Let's walk through this to show just how easy it is. For this walkthrough, I've created a Space on DigitalOcean that contains some random binary files.

[Screenshot: the files stored in the Space on DigitalOcean]

We'll want to transfer this into an Amazon S3 bucket in our AWS account. I've created the following bucket for this purpose:

[Screenshot: the destination bucket in the Amazon S3 console]
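If you haven't created a destination bucket yet, the AWS CLI can create one for you. This is just a quick sketch; the bucket name and region below are placeholders to replace with your own:

aws s3 mb s3://my-spaces-migration-bucket --region us-west-2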

Installing AWS CLI and Rclone

Rclone will use the same AWS access key and secret key that the AWS CLI uses. If you don't have the AWS CLI installed, install and configure it with an access key and secret key that have access to your AWS account.
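If you're setting up the AWS CLI from scratch, the configuration looks roughly like this. The prompts are interactive, and the values shown are placeholders:

aws configure
# AWS Access Key ID [None]: <your access key>
# AWS Secret Access Key [None]: <your secret key>
# Default region name [None]: us-west-2
# Default output format [None]: json

# Optional sanity check that the credentials work
aws sts get-caller-identity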

You'll also need to install Rclone. On Linux/Mac/BSD systems, you can run the following command:

curl https://rclone.org/install.sh | sudo bash

Or, you can install it using Homebrew:

brew install rclone

On Windows systems, download and install the appropriate executable from the Rclone site. Make sure to add rclone to your system's PATH afterward so that the subsequent commands in this tutorial work.
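Whichever install method you use, you can confirm that Rclone is on your PATH by checking its version:

rclone version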

Obtaining Your Spaces Connection Information

To use Rclone to perform the copy, you'll need to create an rclone.conf file that tells Rclone how to connect to both your Amazon S3 bucket and your Space.

If you've set up your AWS CLI, you already have your access key and secret key for AWS. You will need two pieces of information from Spaces:

  • The URL to the endpoint for your Space; and
  • An access key and secret key from DigitalOcean that provide access to your Space.

Obtaining your Spaces endpoint is easy: just navigate to your Space in DigitalOcean, where you'll see the URL for your Space. The endpoint you'll use is the regional endpoint without the name of your space (the part highlighted in the red rectangle below):

[Screenshot: the Space's URL, with the regional endpoint highlighted]
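For example, for a hypothetical Space named my-space in the sfo3 region, the Space's URL is https://my-space.sfo3.digitaloceanspaces.com, and the endpoint you'd give Rclone is just the regional part:

sfo3.digitaloceanspaces.com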

To create an access key and secret key for Spaces, navigate to the API page on DigitalOcean. Underneath the section Spaces access keys, click Generate New Key.

[Screenshot: the Spaces access keys section on the DigitalOcean API page]

Give your key a name and then click the blue checkmark next to the name field.

[Screenshot: naming the new Spaces access key]

Spaces will generate an access key and a secret token for you, both listed under the column Key. (The actual values in the screenshot below have been blurred for security reasons.) Leave this screen as is - you'll be using these values in just a minute.

[Screenshot: the generated access key and secret, with the values blurred]

Configure rclone.conf File and Perform Copy

Now you need to tell Rclone how to connect to each of the services. To do this, create an rclone.conf file in ~/.config/rclone/rclone.conf (Linux/Mac/BSD) or in C:\Users\<username>\AppData\Roaming\rclone\rclone.conf (Windows). The file should use the following format:

[s3]
type = s3
env_auth = false
access_key_id = AWS_ACCESS_KEY
secret_access_key = AWS_SECRET
region = us-west-2
acl = private

[spaces]
type = s3
env_auth = false
access_key_id = SPACES_ACCESS_KEY
secret_access_key = SPACES_SECRET
endpoint = sfo3.digitaloceanspaces.com
acl = private
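If you're not sure where Rclone expects to find its configuration file on your system, Rclone can tell you:

rclone config file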

Replace AWS_ACCESS_KEY and AWS_SECRET with your AWS credentials, and SPACES_ACCESS_KEY and SPACES_SECRET with your Spaces credentials. Also make sure that:

  • s3.region lists the correct region for the bucket you plan to copy data into (you can confirm the bucket's region with the command shown after this list); and
  • spaces.endpoint points to the correct Spaces region.
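If you don't remember which region your destination bucket lives in, the AWS CLI can report it. The bucket name here is just the example from this walkthrough; substitute your own:

aws s3api get-bucket-location --bucket jayallen-spaces-test

A null LocationConstraint in the output means the bucket is in us-east-1.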

To test your connection to Amazon S3, save this file and, at a command prompt, type:

rclone lsd s3:

If you configured everything correctly, you should see a list of your Amazon S3 buckets.
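The output looks something like the following (the sizes and counts show as -1 because Rclone doesn't compute them for buckets; your bucket names and dates will differ):

          -1 2021-09-09 10:15:04        -1 jayallen-spaces-test
          -1 2021-08-02 16:30:11        -1 my-other-bucket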

Next, test your connection to Spaces with the following command:

rclone lsd spaces:

You should see a list of all Spaces you have created in that region.

If everything checks out, go ahead and copy all of your data from Spaces to Amazon S3 using the following command:

rclone sync spaces:jayallentest s3:jayallen-spaces-test

(Make sure to replace the Space name and S3 bucket name with the values appropriate to your accounts.)

The Rclone command line won't give you any direct feedback even if the operation is successful. However, once it returns, you should see all of the data from your Spaces account now located in your Amazon S3 bucket:

[Screenshot: the copied objects now in the Amazon S3 bucket]
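If you'd like more visibility into the transfer, Rclone has flags for that. Here's a quick sketch using the same example Space and bucket names as above:

# Preview what would be transferred without copying anything
rclone sync --dry-run spaces:jayallentest s3:jayallen-spaces-test

# Show live transfer progress
rclone sync --progress spaces:jayallentest s3:jayallen-spaces-test

# Afterward, verify that source and destination match
rclone check spaces:jayallentest s3:jayallen-spaces-test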

And that's it! With just a little setup and configuration, you can now easily transfer data from DigitalOcean Spaces to Amazon S3.
