Upload File to S3 Boto3 Python Example
In this tutorial, we will learn about four different ways to upload a file to S3 using Python. This is a continuation of the series where we are writing scripts to work with AWS S3 in Python.
Setting up permissions for S3
For this tutorial to work, we will need an IAM user who has access to upload a file to S3. We can configure this user on our local machine using the AWS CLI, or we can use its credentials directly in the Python script. We have already covered how to create an IAM user with S3 access. If you do not have this user set up, please follow that blog first and then continue with this one.
Upload a file to S3 using the S3 client
One of the most common ways to upload files from your local machine to S3 is using the client class for S3. You need to provide the bucket name, the file which you want to upload, and the object name in S3.
import boto3
from pprint import pprint
import pathlib
import os


def upload_file_using_client():
    """
    Uploads file to S3 bucket using S3 client object
    :return: None
    """
    s3 = boto3.client("s3")
    bucket_name = "binary-guy-frompython-1"
    object_name = "sample1.txt"
    file_name = os.path.join(pathlib.Path(__file__).parent.resolve(), "sample_file.txt")

    response = s3.upload_file(file_name, bucket_name, object_name)
    pprint(response)  # prints None
When you run this function, it will upload "sample_file.txt" to S3 and it will have the name "sample1.txt" in S3. We can verify this in the console.
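If you prefer to verify the upload from code instead of the console, the sketch below shows the shape of a `list_objects_v2` response. Note this is a sketch: the response dict here is illustrative (not live output), and the commented lines show the real call you would make with credentials configured.

```python
# With AWS credentials configured, the real check would be:
# s3 = boto3.client("s3")
# response = s3.list_objects_v2(Bucket="binary-guy-frompython-1")

# Illustrative response of the shape list_objects_v2 returns:
response = {"Contents": [{"Key": "sample1.txt", "Size": 11}]}

# Collect the object keys; "Contents" is absent when the bucket is empty.
keys = [obj["Key"] for obj in response.get("Contents", [])]
```

If "sample1.txt" appears in `keys`, the upload from the previous snippet succeeded.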

In the above code, we have not specified any user credentials. In such cases, boto3 uses the default AWS CLI profile set up on your local machine. You can also specify which profile boto3 should use if you have multiple profiles on your machine. All you need to do is add the below line to your code.
boto3.setup_default_session(profile_name='PROFILE_NAME_FROM_YOUR_MACHINE')
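A related option (my addition, not from the original post) is to create a dedicated `boto3.Session` for a profile instead of changing the default session; this keeps multiple profiles isolated within one script. A minimal sketch, assuming the standard `AWS_PROFILE` environment variable:

```python
import os

# Resolve the profile name from the standard AWS_PROFILE environment
# variable, falling back to "default" when it is not set.
profile_name = os.environ.get("AWS_PROFILE") or "default"

# With boto3 installed and that profile configured, a dedicated session
# (instead of the default one) would be created like this:
# session = boto3.Session(profile_name=profile_name)
# s3 = session.client("s3")
```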
Another option is to specify the access key ID and secret access key in the code itself. This is not a recommended approach, and I strongly believe using IAM credentials directly in code should be avoided in most cases. In case you have to do this, you can use the access key ID and secret access key in code as shown below.
s3 = boto3.client("s3", aws_access_key_id=ACCESS_KEY, aws_secret_access_key=SECRET_KEY)
Upload a file to S3 using the S3 resource class
Another option to upload files to S3 using Python is to use the S3 resource class.
def upload_file_using_resource():
    """
    Uploads file to S3 bucket using S3 resource object.
    This is useful when you are dealing with multiple buckets at the same time.
    :return: None
    """
    s3 = boto3.resource("s3")
    bucket_name = "binary-guy-frompython-2"
    object_name = "sample2.txt"
    file_name = os.path.join(pathlib.Path(__file__).parent.resolve(), "sample_file.txt")

    bucket = s3.Bucket(bucket_name)
    response = bucket.upload_file(file_name, object_name)
    print(response)  # Prints None
The above code will also upload files to S3. This approach is especially useful when you are dealing with multiple buckets. You can create different bucket objects and use them to upload files.
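The multi-bucket idea can be sketched as below. The bucket-to-key mapping is a made-up example (reusing this tutorial's bucket names); the actual upload loop is commented out because it needs real credentials.

```python
# Example mapping of destination bucket -> object key (illustrative names).
uploads = {
    "binary-guy-frompython-1": "sample1.txt",
    "binary-guy-frompython-2": "sample2.txt",
}

def plan_uploads(file_name, uploads):
    """Return (file, bucket, key) tuples describing each pending upload."""
    return [(file_name, bucket, key) for bucket, key in uploads.items()]

plan = plan_uploads("sample_file.txt", uploads)

# With credentials configured, the plan would be executed like this:
# s3 = boto3.resource("s3")
# for file_name, bucket_name, key in plan:
#     s3.Bucket(bucket_name).upload_file(file_name, key)
```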

Uploading a file to S3 using put_object
Till now we have seen two ways to upload files to S3. Both of them are easy, but we do not have much control over the files we are uploading. What if we want to add encryption when we upload files to S3, or decide which kind of access level our file has (we will dive deep into file/object access levels in another blog)?
When we need such fine-grained control while uploading files to S3, we can use the put_object function as shown in the below code.
def upload_file_to_s3_using_put_object():
    """
    Uploads file to S3 using put_object function of resource object.
    The same function is available for the S3 client object as well.
    put_object gives us many more options and we can set the object
    access policy, tags, encryption, etc.
    :return: None
    """
    s3 = boto3.resource("s3")
    bucket_name = "binary-guy-frompython-2"
    object_name = "sample_using_put_object.txt"
    file_name = os.path.join(pathlib.Path(__file__).parent.resolve(), "sample_file.txt")

    bucket = s3.Bucket(bucket_name)
    # Body must be the file's bytes (or a file-like object), not its path.
    with open(file_name, "rb") as data:
        response = bucket.put_object(
            ACL="private",
            Body=data,
            ServerSideEncryption="AES256",
            Key=object_name,
            Metadata={"env": "dev", "owner": "binary guy"},
        )
    print(response)  # prints s3.Object(bucket_name='binary-guy-frompython-2', key='sample_using_put_object.txt')

When we run the above code, we can see that our file has been uploaded to S3. But we also need to check that our file has the other properties mentioned in our code. In S3, to check object details, click on that object. When we click on "sample_using_put_object.txt" we will see the below details.

We can see that our object is encrypted and our tags are showing in the object metadata. There are many other options that you can set for objects using the put_object function. You can find those details in the boto3 documentation for put_object.
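These properties can also be checked programmatically with `head_object`. The sketch below uses an illustrative response dict whose fields mirror what S3 returns for the object uploaded above (the values are examples, not live output); the commented lines show the real call.

```python
# With credentials configured, the real metadata check would be:
# s3 = boto3.client("s3")
# head = s3.head_object(
#     Bucket="binary-guy-frompython-2",
#     Key="sample_using_put_object.txt",
# )

# Illustrative response matching the put_object parameters we set:
head = {
    "ServerSideEncryption": "AES256",
    "Metadata": {"env": "dev", "owner": "binary guy"},
}

encrypted = head["ServerSideEncryption"] == "AES256"
```

If `encrypted` is True and the Metadata dict carries our tags, the put_object options took effect.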
Uploading byte data to S3
In some cases, you may have byte data as the output of some process and you want to upload that to S3. You might think that it's easy: we write that data to a file and upload that file to S3. But what if there is a simple way where you do not have to write the byte data to a file at all?
Of course, there is. We use the upload_fileobj function to directly upload byte data to S3. In the below code, I am reading a file in binary format and then using that data to create an object in S3. But you can have any binary data written to S3 using the below code.
def upload_file_to_s3_using_file_object():
    """
    Uploads file to S3 using upload_fileobj function of S3 client object.
    A similar function is available for the S3 resource object as well.
    In this case, instead of copying the file, we open that file and
    stream its data to S3. This can be useful when you have binary data
    already created as the output of some process. We do not have to
    write this binary data to a local file and then upload that file.
    We can use the upload_fileobj function.
    :return: None
    """
    s3 = boto3.client("s3")
    bucket_name = "binary-guy-frompython-1"
    object_name = "sample_file_object.txt"
    file_name = os.path.join(pathlib.Path(__file__).parent.resolve(), "sample_file.txt")

    with open(file_name, "rb") as data:
        s3.upload_fileobj(data, bucket_name, object_name)
Let us check if this has created an object in S3 or not.

As we can see, it has successfully created an S3 object using our byte data.
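To make the "no local file" point concrete: upload_fileobj only needs a readable file-like object, which io.BytesIO provides for in-memory bytes. A minimal sketch (the payload bytes and object name are made up; the upload itself is commented out because it needs credentials):

```python
import io

# Hypothetical byte data produced in memory by some process.
payload = b"hello from memory"

# Wrap the bytes in a file-like object; nothing touches the disk.
data = io.BytesIO(payload)

# With credentials configured, the upload would then be:
# s3 = boto3.client("s3")
# s3.upload_fileobj(data, "binary-guy-frompython-1", "from_memory.txt")
```

Reading from `data` yields exactly the bytes that would be sent to S3.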
Conclusion
In this blog, we have learned four different ways to upload files and binary data to S3 using Python. You can get all the code in this blog at GitHub. I hope you found this useful. In the next blog, we will learn different ways to list objects in an S3 bucket. See you soon.
Source: https://binaryguy.tech/aws/s3/how-to-upload-a-file-to-s3-using-python/