Cloudflare R2 is a powerful object storage platform, and its S3-compatible API means it can be accessed and managed from almost anywhere. Additionally, its free tier includes 10 GB of storage per month, which is enough for most application scenarios.

You can start from the official Cloudflare R2 website and manage your objects via the web dashboard. However, logging in and dragging files from your OS to the Cloudflare website quickly becomes tedious. This is where CLI tools come in handy.

The CLI I suggest is Rclone (GitHub: rclone/rclone), which supports a wide variety of cloud services. For Rclone with Cloudflare R2, you can find the manuals on the Rclone Installation, Cloudflare Docs, and Rclone Config pages. Now, I will walk through a minimal setup for using Rclone with Cloudflare R2.

Note

Remember to create your R2 bucket first.
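
If you prefer to stay in the terminal, the bucket can also be created with Cloudflare’s Wrangler CLI instead of the dashboard. This is just a sketch and assumes Wrangler is installed and logged in; bucket_name is a placeholder for your own bucket name:

# Create the bucket from the command line (requires a logged-in Wrangler)
npx wrangler r2 bucket create bucket_name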

  1. Install Rclone on your machine (macOS or Linux, for example):
    sudo -v ; curl https://rclone.org/install.sh | sudo bash
  2. Create an API token from Cloudflare R2:
    1. Token name: rclone
    2. Permissions: Object Read & Write
    3. Specify bucket(s): Apply to specific buckets only, then select your bucket
    4. Press Create API Token
    5. Copy access_key_id, secret_access_key and endpoint
  3. Configure Rclone (a quick verification sketch follows these steps):
    1. Run mkdir -p ~/.config/rclone && vim ~/.config/rclone/rclone.conf to create the Rclone config file (the directory may not exist yet on a fresh install)
    2. Write the following content into the file:
      [r2]
      type = s3
      provider = Cloudflare
      access_key_id = <YOUR ACCESS_KEY_ID>
      secret_access_key = <YOUR SECRET_ACCESS_KEY>
      endpoint = <YOUR ENDPOINT>
      acl = private
      no_check_bucket = true
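
Before moving on, it’s worth checking that the remote actually works. Here is a minimal verification, assuming the remote is named r2 as above and bucket_name is your bucket:

# Print the remote’s settings (the secret key is shown in plain text, so be careful where you run this)
rclone config show r2

# List the bucket’s contents; an empty result with no error still means the credentials are fine
rclone lsf r2:bucket_name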

That’s it! Let’s try working with rclone:

  1. List the objects in a bucket:
    rclone tree r2:bucket_name
  2. Upload a file:
    rclone copy /path/to/local/file.txt r2:bucket_name/
  3. Download a file:
    rclone copy r2:bucket_name/file.txt /path/to/local/
  4. Delete a file:
    rclone delete r2:bucket_name/file.txt
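
Beyond single-file copies, rclone can also keep a whole directory in sync with the bucket. Here is a small sketch; ~/Documents/notes is only an example path and bucket_name is still a placeholder:

# Preview what would change without touching anything
rclone sync ~/Documents/notes r2:bucket_name/notes --dry-run

# Make the bucket path match the local directory (this deletes remote files that no longer exist locally)
rclone sync ~/Documents/notes r2:bucket_name/notes --progress

Note that sync is one-way and can delete files on the destination; use copy if you only want to add files.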

It’s wonderful. But I don’t want to type the copy and tree commands every time. Can I access the object files as if they were local?

The answer is yes. Now I’ll recommend the powerful mount feature of Rclone, which can mount the remote R2 bucket onto the local filesystem.

  1. Install macFUSE (macOS, for example)
  2. Create a directory on the local machine as the mount destination:
    mkdir -p ~/mnt/r2/bucket_name
  3. Mount the bucket with rclone:
    rclone mount r2:bucket_name ~/mnt/r2/bucket_name --daemon
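
By default, rclone mount streams files straight from the remote, which can confuse applications that expect normal file semantics such as seeking or in-place writes. If you hit that, a write cache usually helps; this is only a sketch using rclone’s standard VFS flags with the same placeholder paths as above:

# Mount with a local write cache; files are uploaded to R2 after they are closed
rclone mount r2:bucket_name ~/mnt/r2/bucket_name --daemon --vfs-cache-mode writes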

Now you can access the R2 bucket on the local machine!
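
From here, ordinary shell commands (or your file manager) work on the mount point as if the objects were local files; the file names below are only examples:

ls ~/mnt/r2/bucket_name
cp ~/Pictures/photo.jpg ~/mnt/r2/bucket_name/
cat ~/mnt/r2/bucket_name/file.txt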

If you want to unmount the remote directory, just kill the process:

ps -ef | grep rclone | grep -v grep | awk '{print $2}' | xargs kill
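
Alternatively, the mount point can be unmounted like any other filesystem, which is a bit gentler than killing the process (same placeholder path as above):

# macOS
umount ~/mnt/r2/bucket_name

# Linux
fusermount -u ~/mnt/r2/bucket_name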

You can find more usage at Rclone Docs.