How to Write a File or Data to an S3 Object using Boto3

S3 is an object storage service provided by AWS. You may need to upload data or a file to S3 when working with an AWS SageMaker notebook or a normal Jupyter notebook in Python.

You can write a file or data to S3 using Boto3 with the Object.put() method.

The methods available to write a file to S3 are:

  • Object.put()
  • Bucket.upload_file()
  • Client.put_object()

If You’re in a Hurry…

You can use the below code snippet to write a file to S3.

import boto3

#Creating Session With Boto3.
session = boto3.Session(
    aws_access_key_id='<your_access_key_id>',
    aws_secret_access_key='<your_secret_access_key>'
)

#Creating S3 Resource From the Session.
s3 = session.resource('s3')

txt_data = b'This is the content of the file uploaded from python boto3'

object = s3.Object('<bucket_name>', 'file_name.txt')

result = object.put(Body=txt_data)

This is how you can upload files to S3 from a Jupyter notebook or a Python script using Boto3.

If You Want to Understand Details, Read on…

In this tutorial, you’ll learn how to write a file or data to S3 using Boto3.

Prerequisites

  • Generate the security credentials by clicking Your Profile Name -> My Security Credentials -> Access keys (access key ID and secret access key). This is necessary to create a session to your S3 bucket.
  • Understand the difference between the boto3 resource and the boto3 client. Object.put() and the upload_file() methods belong to the boto3 resource, whereas put_object() belongs to the boto3 client.

Using Object.put()

You can use the Object.put() method available in the S3 object.

It allows two ways of writing to S3:

  • Writing a text content to an S3 object
  • Writing contents from the local file to the S3 object

Write Text Data To S3 Object

In this section, you’ll learn how to write normal text data to an S3 object.

Follow the below steps to write text data to an S3 object.

  1. Create a Boto3 session using the security credentials
  2. With the session, create a resource object for the S3 service
  3. Create an S3 object using the Object() method. It accepts two parameters: the bucket name and the object key. The key is the name you want to give the S3 object. If you would like to create sub-folders inside the bucket, prefix the folder path to the key, for example subfolder/file_name.txt
  4. Create a text object that holds the text to be uploaded to the S3 object
  5. Use the put() action available in the S3 object and set the body to the text data, e.g. Body=txt_data
  6. put() returns response metadata in JSON form. This metadata contains the HTTPStatusCode, which shows whether the file upload was successful. If the status code is 200, the upload succeeded; otherwise, it failed.

Note: Using this method will replace any existing S3 object with the same name. Hence ensure you’re using a unique name for this object.
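The sub-folder prefixing from step 3 is plain string handling on the key. As a sketch, a small helper (build_key is a hypothetical name) can keep keys consistent and strip a leading slash, which would otherwise create the object under an empty-named folder:

```python
def build_key(folder, file_name):
    """Join an optional folder prefix and a file name into an S3 key.

    S3 has no real folders: a key like 'subfolder/file_name.txt' merely
    displays as a folder in the console. A leading '/' is stripped so the
    object doesn't land under an empty-named folder.
    """
    key = f"{folder}/{file_name}" if folder else file_name
    return key.lstrip('/')
```

For example, s3.Object('<bucket_name>', build_key('subfolder', 'file_name.txt')) would create the object subfolder/file_name.txt.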

Snippet

import boto3

#Creating Session With Boto3.
session = boto3.Session(
    aws_access_key_id='<your_access_key_id>',
    aws_secret_access_key='<your_secret_access_key>'
)

#Creating S3 Resource From the Session.
s3 = session.resource('s3')

object = s3.Object('<bucket_name>', 'file_name.txt')

txt_data = b'This is the content of the file uploaded from python boto3'

result = object.put(Body=txt_data)

res = result.get('ResponseMetadata')

if res.get('HTTPStatusCode') == 200:
    print('File Uploaded Successfully')
else:
    print('File Not Uploaded')

Output

    File Uploaded Successfully

This is how you can write text data to an S3 object using Boto3.

Reading a File from Local and Uploading It to S3

In this section, you’ll learn how to read a file from the local system and upload it to an S3 object.

The steps are similar to those explained in the previous section, except for one step.

You just need to open the file in binary mode and send its contents to the put() method using the below snippet.

Use forward slashes in the file path; backslashes in an ordinary Python string are treated as escape characters and can corrupt the path.

Note: Using this method will replace any existing S3 object with the same name. Hence ensure you’re using a unique name for this object.

with open('E:/temp/testfile.txt', 'rb') as file_data:
    result = object.put(Body=file_data)

Snippet

import boto3

#Creating Session With Boto3.
session = boto3.Session(
    aws_access_key_id='<your_access_key_id>',
    aws_secret_access_key='<your_secret_access_key>'
)

#Creating S3 Resource From the Session.
s3 = session.resource('s3')

object = s3.Object('<bucket_name>', 'file_name.txt')

with open('E:/temp/testfile.txt', 'rb') as file_data:
    result = object.put(Body=file_data)

res = result.get('ResponseMetadata')

if res.get('HTTPStatusCode') == 200:
    print('File Uploaded Successfully')
else:
    print('File Not Uploaded')

You can check whether the file was uploaded successfully using the HTTPStatusCode available in the ResponseMetadata.

Output

    File Uploaded Successfully

This is how you can write the data from a local text file to an S3 object using Boto3.
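put() also accepts in-memory data directly, which is handy in notebooks where results live in Python objects rather than files. A sketch for JSON, where upload_json is a hypothetical helper and s3_object is anything exposing a put(Body=...) method (such as a boto3 S3 Object):

```python
import json

def upload_json(s3_object, payload):
    # Serialize the dict to JSON bytes and upload via the object's put().
    body = json.dumps(payload).encode('utf-8')
    return s3_object.put(Body=body)
```

For example, upload_json(s3.Object('<bucket_name>', 'results.json'), {'accuracy': 0.93}) would write the dictionary as a JSON object.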

Using upload_file()

In this section, you’ll learn how to use the upload_file() method to upload a file to an S3 bucket. It is available on the boto3 resource (via the Bucket object).

Follow the below steps to use the upload_file() action to upload a file to an S3 bucket.

  1. Create a boto3 session
  2. Create an object for S3 object
  3. Access the bucket in the S3 resource using the s3.Bucket() method and invoke the upload_file() method to upload the files
  4. The upload_file() method accepts two required parameters.
    • Filename – path of the local file that needs to be uploaded. Use forward slashes when you mention the path
    • Key – name of the S3 object that will be created by uploading this file

Unlike the other methods, the upload_file() method doesn’t return response metadata to check the result: it returns None on success and raises an exception on failure. You can use other calls to check whether the object exists in the bucket.

Note: Using this method will replace any existing S3 object with the same name. Hence ensure you’re using a unique name for this object.

Snippet

import boto3

#Creating Session With Boto3.
session = boto3.Session(
    aws_access_key_id='<your_access_key_id>',
    aws_secret_access_key='<your_secret_access_key>'
)

#Creating S3 Resource From the Session.
s3 = session.resource('s3')

result = s3.Bucket('<bucket_name>').upload_file('E:/temp/testfile.txt', 'file_name.txt')

print(result)

The file is uploaded successfully, but you’ll only see the status as None, because upload_file() returns nothing.

Output

    None

This is how you can use the upload_file() method to upload a file to an S3 bucket.

Using Client.put_object()

In this section, you’ll learn how to use the put_object method from the boto3 client.

Follow the below steps to use the client.put_object() method to upload a file as an S3 object.

  1. Create a boto3 session using your AWS security credentials
  2. Create a resource object for S3
  3. Get the client from the S3 resource using s3.meta.client
  4. Invoke the put_object() method from the client. It accepts these parameters.
    • Body – the content for the S3 object. You can pass text or bytes directly, or a file object opened in binary mode, e.g. open('E:/temp/testfile.txt', 'rb')
    • Bucket – name of the bucket the object will be created in
    • Key – name for the new object that will be created

put_object() also returns ResponseMetadata, which includes the HTTPStatusCode denoting whether the upload was successful.

Snippet

import boto3

#Creating Session With Boto3.
session = boto3.Session(
    aws_access_key_id='<your_access_key_id>',
    aws_secret_access_key='<your_secret_access_key>'
)

#Creating S3 Resource From the Session.
s3 = session.resource('s3')

result = s3.meta.client.put_object(Body='Text Contents', Bucket='<bucket_name>', Key='filename.txt')

res = result.get('ResponseMetadata')

if res.get('HTTPStatusCode') == 200:
    print('File Uploaded Successfully')
else:
    print('File Not Uploaded')

A new S3 object will be created and the contents of the file will be uploaded.

Output

    File Uploaded Successfully

This is how you can use the put_object() method available in boto3 S3 client to upload files to the S3 bucket.
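To confirm the content round-trips, you can read the object back with the client’s get_object() call. A sketch (read_object_text is a hypothetical helper):

```python
def read_object_text(client, bucket, key):
    # get_object() returns the content as a streaming Body;
    # read it fully and decode it as UTF-8 text.
    response = client.get_object(Bucket=bucket, Key=key)
    return response['Body'].read().decode('utf-8')
```

After the snippet above runs, read_object_text(s3.meta.client, '<bucket_name>', 'filename.txt') should return 'Text Contents'.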

Conclusion

To summarize, you’ve learned the difference between the boto3 client and the boto3 resource in the prerequisites, and the different methods available in each to upload files or data to S3 buckets.
