How to Upload Files to AWS S3 Using the Command Line?

Introduction

AWS S3 (Simple Storage Service) is an object storage service that offers high availability, security, and performance. All files are stored as objects inside containers called buckets.

In this tutorial, you’ll create an S3 bucket, create subfolders, and upload files to the bucket using the AWS CLI.

Prerequisites

  • An AWS account with permission to create S3 buckets and upload objects.
  • The AWS CLI installed and configured with your credentials (for example, by running aws configure).

Creating S3 Bucket

In this section, you’ll create an S3 bucket that will logically group your files.

The s3 mb command in the AWS CLI is used to make a bucket. Use the command below to create a new bucket in S3.

aws s3 mb s3://newbucketname --region "ap-south-1"
  • aws – invokes the AWS CLI
  • s3 – the service on which the operation is performed
  • mb – the make-bucket operation
  • s3://newbucketname – the S3 URI with the desired name of the bucket to be created
  • --region – option that specifies the region in which the bucket is created
  • ap-south-1 – the region name
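If the command succeeds, the CLI prints a make_bucket: newbucketname confirmation. As an optional check, you can list your buckets with aws s3 ls; the timestamp below is only an illustrative placeholder:

aws s3 ls

2024-01-01 12:00:00 newbucketname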

You’ve created a new bucket in your desired region. Now, you’ll create a subfolder in the S3 bucket.

Creating Subfolder in S3 Bucket

In this section, you’ll create a subfolder inside your existing S3 bucket.

There are no real folders in an S3 bucket. You’ll just create a sub-object (a key ending with /) inside your existing bucket, which logically acts as a subfolder.

Use the s3api put-object command to create a new subdirectory inside your S3 bucket, as shown below.

aws s3api put-object --bucket existing_bucket_name --key new_sub_directory_name/ --region "ap-south-1"
  • aws – invokes the AWS CLI
  • s3api – the lower-level S3 API commands used for this operation
  • put-object – puts a new object into an existing bucket
  • --bucket – option that specifies the bucket name
  • existing_bucket_name – name of the existing bucket in which you want to create the sub-object
  • --key – option that specifies the key of the new object
  • new_sub_directory_name/ – the desired name of the new object; the trailing / is mandatory
  • --region – option that specifies the region of the bucket
  • ap-south-1 – the region name
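To confirm that the new "folder" exists, you can list the bucket contents; S3 shows keys that end with / as prefixes (PRE). Using the placeholder names from the command above, the output looks similar to this:

aws s3 ls s3://existing_bucket_name/

PRE new_sub_directory_name/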

A new subdirectory is created in your existing bucket. Now, you’ll upload files to the created bucket.

Uploading Single File to S3 Bucket

In this section, you’ll upload a single file to the S3 bucket in two ways:

  • Uploading a file to an existing bucket
  • Creating a subdirectory in the existing bucket and uploading a file into it

Uploading a Single File to an Existing Bucket

You can use the cp command to upload a file into your existing bucket as shown below.

aws s3 cp file_to_upload.txt s3://existing_bucket_name/ --region "ap-south-1"
  • aws – invokes the AWS CLI
  • s3 – the service on which the operation is performed
  • cp – copies the file to the bucket
  • file_to_upload.txt – the file to be uploaded
  • s3://existing_bucket_name/ – the existing bucket to which the file is uploaded
  • --region – option that specifies the region
  • ap-south-1 – the region of the bucket the file is uploaded to
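A successful copy prints an upload confirmation similar to the following (using the placeholder names from the command above):

upload: file_to_upload.txt to s3://existing_bucket_name/file_to_upload.txt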

You’ve copied a single file to an existing bucket.

Creating a Subdirectory and Uploading a Single File

You can use the s3api put-object command to add an object to your bucket. In this case, you’ll create a subfolder in the existing bucket and upload a file into it by using the --key and --body parameters in the command.

aws s3api put-object --bucket existing_bucket_name --key new_sub_directory_name/file_to_be_uploaded.txt --body file_to_be_uploaded.txt
  • aws – invokes the AWS CLI
  • s3api – the lower-level S3 API commands used for this operation
  • put-object – puts a new object into an existing bucket
  • --bucket – option that specifies the bucket name
  • existing_bucket_name – name of the existing bucket in which you want to create the sub-object
  • --key – option that specifies the key of the new object
  • new_sub_directory_name/file_to_be_uploaded.txt – the full key of the new object; the / creates the sub-object (subfolder) in the existing bucket, and the file name is the object stored under that path
  • --body file_to_be_uploaded.txt – the local file whose contents are uploaded
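On success, put-object returns a small JSON response containing the ETag of the uploaded object; the value below is only a placeholder:

{
    "ETag": "\"example-etag-value\""
}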

You’ve created a new subdirectory in the existing bucket and uploaded a file into it.

Uploading All Files From a Directory to S3 Bucket

In this section, you’ll upload all files from a directory to an S3 bucket in two ways:

  • Using cp --recursive
  • Using sync

For demonstration purposes, consider there are three files (firstfile.txt, secondfile.txt, thirdfile.txt) in your local directory. Now you’ll see how cp --recursive and sync work with these three files.

You can use the --dryrun option with both the cp --recursive and sync commands to check which files would be copied or synced without actually uploading them. Place --dryrun right after the cp or sync keyword.
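For example, a dry run of the sync command (using the placeholder directory and bucket names from the sections below) only reports what would be uploaded, with output similar to this:

aws s3 sync --dryrun your_local_directory s3://full_s3_bucket_name/

(dryrun) upload: your_local_directory/firstfile.txt to s3://full_s3_bucket_name/firstfile.txt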

Using Copy Recursive

cp --recursive is a command used to copy files recursively to the destination directory.

Recursive means it copies the contents of the directory, and if the source directory has subdirectories, they are copied too.

Use the command below to copy the files recursively to your S3 bucket.

aws s3 cp --recursive your_local_directory s3://full_s3_bucket_name/ --region "ap-southeast-2"

You’ll see the output below, which means the three files were uploaded to your S3 bucket.

upload: ./firstfile.txt to s3://maindirectory/subdirectory/firstfile.txt
upload: ./secondfile.txt to s3://maindirectory/subdirectory/secondfile.txt
upload: ./thirdfile.txt to s3://maindirectory/subdirectory/thirdfile.txt

You’ve copied files recursively to your S3 bucket. Now, you’ll see how to sync your local directory to your S3 bucket.

Using Sync

Sync is a command used to synchronize the source and target directories. Sync is recursive by default, which means all the files and subdirectories in the source are copied to the target.

Use the command below to sync your local directory to your S3 bucket.

aws s3 sync your_local_directory s3://full_s3_bucket_name/ --region "ap-southeast-2"

You’ll see the output below.

upload: ./firstfile.txt to s3://maindirectory/subdirectory/firstfile.txt
upload: ./secondfile.txt to s3://maindirectory/subdirectory/secondfile.txt
upload: ./thirdfile.txt to s3://maindirectory/subdirectory/thirdfile.txt

Since there are no files in your target bucket, all three files are copied. If two of the files already exist in the bucket, only the remaining one is copied.
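For example, if you later add one more file to the local directory (fourthfile.txt here is a hypothetical name) and run the same sync command again, only that new file is transferred, and the output looks similar to this:

aws s3 sync your_local_directory s3://full_s3_bucket_name/ --region "ap-southeast-2"

upload: your_local_directory/fourthfile.txt to s3://full_s3_bucket_name/fourthfile.txt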

You’ve copied files using the cp and sync commands. Now, you’ll see how to copy specific files using a wildcard.

Copying Multiple Files Using Wildcard

In this section, you’ll see how to copy a group of files to your S3 bucket using the cp command with wildcard filters.

A wildcard lets you match files whose names follow a specific pattern.

Use the command below to copy only the files whose names start with first.

aws s3 cp --recursive  your_local_directory s3://full_s3_bucket_name/ --exclude "*" --include "first*" --region "ap-southeast-2"
  • aws – invokes the AWS CLI
  • s3 – the service on which the operation is performed
  • cp --recursive – copies the files recursively
  • your_local_directory – the source directory from which the files are copied
  • s3://full_s3_bucket_name/ – the target S3 bucket to which the files are copied
  • --exclude "*" – exclude all files
  • --include "first*" – include files whose names start with first
  • --region – option that specifies the region
  • ap-southeast-2 – the region to which the files are uploaded

Ensure you specify the --exclude filter first and the --include filter second so the wildcard copy works as expected.

You’ll see the output below, which means the file whose name starts with first (firstfile.txt) was copied to your S3 bucket.

upload: ./firstfile.txt to s3://maindirectory/subdirectory/firstfile.txt

You’ve copied files to your S3 bucket using a wildcard copy.
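The same exclude/include pattern works for other filters as well. For example, to upload only the .txt files from a directory (using the same placeholder directory and bucket names as above), you could run:

aws s3 cp --recursive your_local_directory s3://full_s3_bucket_name/ --exclude "*" --include "*.txt" --region "ap-southeast-2"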

Conclusion

You’ve created directories and subdirectories in your S3 bucket and copied files to them using the cp and sync commands. That answers the question of how to upload files to an AWS S3 bucket from the command line.

What Next?

You can host a static website using the files copied to your S3 buckets. Refer to the guide on How to host a static website on AWS S3.

You May Also Like

How To check if a key exists in an S3 bucket using boto3 python

FAQ

upload failed: Could not connect to the endpoint URL

Check if you have access to the S3 bucket. Also, check whether you are using the correct region in the commands.

What is the command to copy files recursively in a folder to an s3 bucket?

cp --recursive is the command used to copy files recursively to an S3 bucket. You can also use the sync command, which is recursive by default.

How do I transfer files from ec2 to s3 bucket?

You can use any of the commands discussed in this article to transfer files from an EC2 instance to an S3 bucket, as long as the AWS CLI on the instance is configured with credentials (or an IAM role) that allow S3 access.
