How to Copy (or Move Files) From One Bucket to Another Using Boto3 [Python]?

Boto3 is the AWS SDK for Python. It allows users to create and manage AWS services such as EC2 and S3.

You can use a Boto3 session and the bucket.copy() method to copy files between S3 buckets.

You need your AWS account credentials for performing copy or move operations.

If You’re in a Hurry

import boto3

# Create a session with Boto3 using your credentials.
session = boto3.Session(
    aws_access_key_id='Your Access Key ID',
    aws_secret_access_key='Your Secret access key'
)

# Create an S3 resource from the session.
s3 = session.resource('s3')

# Create a source dictionary that specifies the bucket name and the key of the object to be copied.
copy_source = {
    'Bucket': 'your_source_bucket_name',
    'Key': 'Object_Key_with_file_extension'
}

# Create a representation of the target bucket.
bucket = s3.Bucket('target_bucket_name')

# Copy the object into the target bucket under the given key.
bucket.copy(copy_source, 'target_object_name_with_extension')

# Print a confirmation that the file was copied.
print('Single File is copied')

If You Want to Understand Details, Read on…

In this tutorial, you’ll learn how to copy or move objects between S3 buckets using different methods.

Installing Boto3

Install Boto3 using the below command.

pip3 install boto3

Copying S3 Object From One Bucket to Another Using Boto3

In this section, you’ll copy an S3 object from one bucket to another.

Creating Boto3 Session and Resource

  • Create a session with Boto3. You need to specify your credentials to connect Boto3 to S3: the AWS access key ID and the secret access key.
  • Create an S3 resource from the Boto3 session. Use the below code to create the session and the S3 resource.
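import boto3

# Create a session with your AWS credentials.
session = boto3.Session(
    aws_access_key_id='Your Access Key ID',
    aws_secret_access_key='Your Secret access key'
)

# Create an S3 resource from the session.
s3 = session.resource('s3')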

Creating Source Bucket Dictionary

A source bucket dictionary is necessary to copy objects using the bucket.copy() method. A dictionary is the Python implementation of the data structure known as an associative array: a collection of key-value pairs.

  • Create a source bucket dictionary named copy_source with the source bucket name and the key of the object that needs to be copied to another bucket, as shown below.
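# Create a source dictionary that specifies the bucket name and the key of the object to be copied.
copy_source = {
    'Bucket': 'your_source_bucket_name',
    'Key': 'Object_Key_with_file_extension'
}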

Creating Target Bucket Representation From S3 Resource

  • Create a Boto3 resource that represents your target AWS S3 bucket using the s3.Bucket() function, as shown below.
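bucket = s3.Bucket('target_bucket_name')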

Copying the S3 Object to Target Bucket

  • Copy the S3 object to the target bucket using the bucket.copy() function, as shown below.
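bucket.copy(copy_source, 'target_object_name_with_extension')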

These are the detailed steps you can follow to copy S3 objects from one bucket to another.

The full Python script to copy S3 objects from one bucket to another is given below.

import boto3

# Create a session with Boto3 using your credentials.
session = boto3.Session(
    aws_access_key_id='Your Access Key ID',
    aws_secret_access_key='Your Secret access key'
)

# Create an S3 resource from the session.
s3 = session.resource('s3')

# Create a source dictionary that specifies the bucket name and the key of the object to be copied.
copy_source = {
    'Bucket': 'your_source_bucket_name',
    'Key': 'Object_Key_with_file_extension'
}

# Create a representation of the target bucket.
bucket = s3.Bucket('target_bucket_name')

# Copy the object into the target bucket under the given key.
bucket.copy(copy_source, 'target_object_name_with_extension')

# Print a confirmation that the file was copied.
print('Single File is copied')

Update the placeholder variables based on your bucket names and object keys. Then you’ll be able to copy your S3 objects.

You’ve learned how to copy an S3 object from one bucket to another using Boto3.

Copying All Files From One Bucket to Another Using Boto3

In this section, you’ll copy all the files in one bucket to another bucket using Boto3.

To copy all files, you need to iterate over all the objects available in the source bucket.

  • Create a source S3 bucket representation and a destination S3 bucket representation from the S3 resource.

Use the below code to create a source S3 bucket representation.

srcbucket = s3.Bucket('your_source_bucket_name')

Use the below code to create a target S3 bucket representation.

destbucket = s3.Bucket('your_target_bucket_name')

Next, iterate through the objects in your source bucket using the objects.all() method available on the bucket representation object.

Use the below code to iterate through the S3 bucket objects.

for file in srcbucket.objects.all():

During each iteration, the file object holds the details of the current object, including its key (the object name).

Now, create a source bucket dictionary that can be used to copy files from one bucket to another.

    # Create a source dictionary that specifies the bucket name and the key of the object to be copied.
    copy_source = {
        'Bucket': 'your_source_bucket_name',

        # file.key holds the key of the current object; pass it as the Key.
        'Key': file.key
    }

Next, copy the object from the source bucket to the destination bucket using the bucket.copy() function available on the S3 bucket representation object.

Use the below code to copy the object from source to target.

    destbucket.copy(copy_source, file.key)

Now, during each iteration, the file object will be copied to the target bucket.

The full Python script to copy all S3 objects from one bucket to another is given below.

import boto3

# Create a session with Boto3 using your credentials.
session = boto3.Session(
    aws_access_key_id='Your Access Key ID',
    aws_secret_access_key='Your Secret access key'
)

# Create an S3 resource from the session.
s3 = session.resource('s3')

srcbucket = s3.Bucket('your_source_bucket_name')

destbucket = s3.Bucket('your_target_bucket_name')

# Iterate over all objects in the source bucket.
for file in srcbucket.objects.all():

    # Create a source dictionary that specifies the bucket name and the key of the current object.
    copy_source = {
        'Bucket': 'your_source_bucket_name',
        'Key': file.key
    }

    destbucket.copy(copy_source, file.key)

    print(file.key + ' - File Copied')

Update the placeholder variables based on your bucket names and object keys. Then you’ll be able to copy all files to another S3 bucket using Boto3.

Next, you’ll learn how to move objects between S3 buckets.

Moving S3 Object From One Bucket to Another Using Boto3

In this section, you’ll learn how to move an S3 object from one bucket to another.

In principle, there is no native method for moving S3 objects between buckets. However, the move operation can be achieved by copying the file to the target bucket and then deleting the object from the source bucket.

Copying an object to another bucket can be done as shown in the Copy section of this tutorial.

Then, to delete the file from the source bucket, you can use the s3.Object(...).delete() function.

s3.Object('your_source_bucket_name', 'Object_Key_with_file_extension').delete()

  • s3 – Resource created using the Boto3 session
  • Object() – Function to create a resource representing the object key in your source bucket
  • delete() – Function to delete the object from your S3 bucket

The full Python script to move an S3 object from one bucket to another is given below. It copies the object to the target bucket and then deletes the object from the source bucket.

import boto3

# Create a session with Boto3 using your credentials.
session = boto3.Session(
    aws_access_key_id='Your Access Key ID',
    aws_secret_access_key='Your Secret access key'
)

# Create an S3 resource from the session.
s3 = session.resource('s3')

# Create a source dictionary that specifies the bucket name and the key of the object to be copied.
copy_source = {
    'Bucket': 'your_source_bucket_name',
    'Key': 'Object_Key_with_file_extension'
}

# Create the destination bucket representation.
destbucket = s3.Bucket('your_target_bucket_name')

# Copy the object to the target bucket.
destbucket.copy(copy_source, 'Object_Key_with_file_extension')

# Delete the object from the source bucket after copying.
s3.Object('your_source_bucket_name', 'Object_Key_with_file_extension').delete()

# Print a confirmation that the file was moved.
print('Single File is moved')

Update the placeholder variables based on your bucket names and object keys. Then you’ll be able to move S3 objects to another S3 bucket using Boto3.

Next, you’ll learn how to move all objects to another S3 bucket.

Moving All Files From One S3 Bucket to Another Using Boto3

In this section, you’ll move all files from one S3 bucket to another using Boto3.

As noted in the previous section, there is no native method for moving S3 objects between buckets. The move operation can be achieved by copying all the files to the target bucket and deleting the objects from the source bucket.

Copying all objects to another bucket can be done as shown in the Copy All Files section of this tutorial.

Then, to delete each file from the source bucket, you can use the object’s delete() function. You already have the object during the copy iteration, so once a file is copied you can call delete() on it directly.

The full Python script to move all S3 objects from one bucket to another is given below. It copies all the objects to the target bucket and deletes each object from the source bucket once it has been copied.

import boto3

# Create a session with Boto3 using your credentials.
session = boto3.Session(
    aws_access_key_id='Your Access Key ID',
    aws_secret_access_key='Your Secret access key'
)

# Create an S3 resource from the session.
s3 = session.resource('s3')

srcbucket = s3.Bucket('your_source_bucket_name')

destbucket = s3.Bucket('your_target_bucket_name')

# Iterate over all objects in the source bucket.
for file in srcbucket.objects.all():

    # Create a source dictionary that specifies the bucket name and the key of the current object.
    copy_source = {
        'Bucket': 'your_source_bucket_name',
        'Key': file.key
    }

    destbucket.copy(copy_source, file.key)

    # Delete the file from the source bucket after copying.
    file.delete()

    print(file.key + ' - File Moved')

Copy All Files From One S3 Bucket to Another Using S3cmd Sync

In this section, you’ll learn how to copy all files from one S3 bucket to another using s3cmd.

All the files can be copied to another S3 bucket by running a single command in the terminal. Because it is a sync, only the files that don’t already exist in the target bucket are copied.

You can also check which files would be copied by using the --dry-run option along with the sync command. It’ll show the list of files that would be copied to the target bucket without actually copying them.

Use the below command to copy all files from your source bucket to the target bucket.

s3cmd sync s3://your-source-bucket/ s3://your-target-bucket/
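For example, to preview the transfer first (the bucket names are the same placeholders as above), run:

s3cmd sync --dry-run s3://your-source-bucket/ s3://your-target-bucket/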

Setting ACL for Copied Files

Object ACLs (access control lists) enable you to manage access rights to buckets and the objects in them.

After copying or moving a file to a new bucket, you may need to make the file public to allow public access.

You can do this by creating an ObjectAcl resource for the object and applying the ACL='public-read' option with its put() method, as shown below.

s3 = boto3.resource('s3')

# Reference the ACL of the copied object in the target bucket.
object_acl = s3.ObjectAcl('your_target_bucket_name', 'Object_Key_with_file_extension')

# Grant public read access to the object.
response = object_acl.put(ACL='public-read')

Now, the newly copied object will be accessible to the public and it can be accessed by anyone who has the object URI.
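For reference, a publicly readable object can typically be fetched with a virtual-hosted-style URL of the following form (bucket and key are the placeholders used above; the exact hostname also depends on the bucket’s region):

https://your_target_bucket_name.s3.amazonaws.com/Object_Key_with_file_extension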

Running Boto3 Script in Command Line

You can run a Boto3 script in the command line using the python3 command. You must have Python 3 and the Boto3 package installed on your machine (for example, on an EC2 instance) before you can run the script.

For example, assume your Python script to copy all files from one S3 bucket to another is saved as copy_all_objects.py.

You can run this file by using the below command.

python3 copy_all_objects.py

For more detailed information on running a Python script in the command line, refer to How to Run Python File in Terminal [Beginners Guide]?

Conclusion

In this tutorial, you’ve learned how to copy a single S3 object to another bucket, copy all files from one S3 bucket to another, move a single object to another bucket, and move all files to another bucket using Boto3.


How to specify credentials when connecting to S3 with Boto3?

You can specify credentials by creating a Boto3 session, as shown below.

session = boto3.Session(
    aws_access_key_id='Your Access Key ID',
    aws_secret_access_key='Your Secret access key'
)

Where does Boto3 look for credentials?

Boto3 looks for credentials in the shared credentials file located at ~/.aws/credentials. This file contains the access key ID and secret access key; an optional default region can be set in ~/.aws/config. The same files are also used by the AWS CLI to perform CLI operations.
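A minimal ~/.aws/credentials file looks like this (the key values shown are placeholders):

[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY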
