In this article we will use the AWS Lambda service to copy objects from one S3 bucket to another. Below are the steps we will follow:
- Create two buckets in S3 for source and destination.
- Create an IAM role and policy which can read and write to buckets.
- Create a Lambda function to copy the objects between buckets.
- Assign IAM role to the Lambda function.
- Create an S3 event trigger to execute the Lambda function.
1. Create S3 Buckets:
Create two buckets in S3 for source and destination. You can refer to my previous post for the steps to create an S3 bucket. In this example I will be using two buckets named "source-bucket104" and "dest-bucket104".
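If you prefer to script this step, here is a minimal boto3 sketch, assuming the bucket names used in this article (bucket names are globally unique, so substitute your own):

import boto3

s3 = boto3.client("s3")

# Create the source and destination buckets used in this article.
# Outside us-east-1 you must also pass
# CreateBucketConfiguration={"LocationConstraint": "<your-region>"}.
for bucket in ("source-bucket104", "dest-bucket104"):
    s3.create_bucket(Bucket=bucket)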
2. Create IAM Policy and Role:
Now go to Services -> Security, Identity, & Compliance -> IAM (Identity and Access Management).
- Click on Policies -> Create policy
- Click on the JSON tab and enter the lines below. You will need to modify the two "Resource" ARNs with the names of the source and destination buckets that you have created.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::source-bucket104/*"
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::dest-bucket104/*"
        }
    ]
}
The above policy grants read permission ("s3:GetObject") on the source bucket "source-bucket104" and write permission ("s3:PutObject") on the destination bucket "dest-bucket104".
- Click on Review policy.
- Provide a name for your policy and click "Create Policy". (A scripted alternative is sketched below.)
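If you would rather script the policy creation, here is a boto3 sketch, assuming the JSON above has been saved to a local file; the file and policy names ("s3-copy-policy") are just illustrative:

import boto3

iam = boto3.client("iam")

# Load the policy document shown above from a local file.
with open("s3-copy-policy.json") as f:
    policy_document = f.read()

# Create a customer-managed policy; note the returned ARN,
# it is needed later to attach the policy to the role.
response = iam.create_policy(
    PolicyName="s3-copy-policy",
    PolicyDocument=policy_document,
)
print(response["Policy"]["Arn"])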
- Now click on Roles -> Create role
- Under "Select type of trusted entity", select "AWS Service"
- Select "Lambda" as the service that will use this role. The execution role is assumed by the Lambda function, not by S3, so the trusted service must be Lambda.
- Click "Next: Permissions"
- Now attach the policy that you created in the previous step by selecting the checkbox next to the policy name, and click Next.
- Click Next on the add tags screen.
- Provide a name for the role and click "Create role". (A scripted version of this step is sketched below.)
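For reference, the same role can also be created with boto3. This is a sketch; the role name is illustrative, and the policy ARN must be replaced with the one printed by the previous snippet:

import json
import boto3

iam = boto3.client("iam")

# Trust policy: allow the Lambda service to assume this role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

iam.create_role(
    RoleName="s3-copy-role",
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# Attach the bucket policy created earlier (replace with your policy ARN).
iam.attach_role_policy(
    RoleName="s3-copy-role",
    PolicyArn="arn:aws:iam::123456789012:policy/s3-copy-policy",
)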
3. Create Lambda Function:
- Go to Services -> Compute -> Lambda
- Click "Create function"
- Provide a name for the function.
- Select runtime as "Python 3.8"
- Under "Permissions", click on "Choose or create an execution role".
- From the drop down list choose the role that was created in previous step.
- Click "Use an existing role".
- Click "Create function"
- Under "Function code", enter the following code:
import json
import urllib.parse
import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    dest_bucket = 'dest-bucket104'
    # The S3 event tells us which bucket and object key triggered the function.
    src_bucket = event['Records'][0]['s3']['bucket']['name']
    # Object keys arrive URL-encoded (e.g. spaces become '+'), so decode them.
    filename = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])
    copy_source = {'Bucket': src_bucket, 'Key': filename}
    # Server-side copy; the object data never passes through the function.
    s3.copy_object(CopySource=copy_source, Bucket=dest_bucket, Key=filename)
    return {
        'statusCode': 200,
        'body': json.dumps('Files Copied')
    }
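To sanity-check the handler before wiring up the trigger, you can invoke it with a hand-built S3 put event, either in the console's Test tab or locally. A sketch, to be pasted below the handler; the key "hello.txt" is just an example, and the call performs a real copy, so it needs credentials with access to both buckets:

# Minimal S3 put event carrying only the fields the handler reads.
test_event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "source-bucket104"},
                "object": {"key": "hello.txt"},
            }
        }
    ]
}

print(lambda_handler(test_event, None))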
4. Create Trigger:
- Click "Add Trigger"
- In Trigger configuration, select S3.
- Under "Bucket", select the source bucket.
- Under "Event Type", select "Put"
- Click "Add"
Test your Lambda function:
Do a test by uploading a file to the source S3 bucket. If all the configuration is correct, the file should appear in the destination bucket.
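A quick end-to-end check with boto3, assuming a local file named test.txt; the copy happens asynchronously, so give it a moment before listing the destination:

import boto3

s3 = boto3.client("s3")

# Uploading to the source bucket fires the trigger.
s3.upload_file("test.txt", "source-bucket104", "test.txt")

# Shortly afterwards, the same key should appear in the destination bucket.
for obj in s3.list_objects_v2(Bucket="dest-bucket104").get("Contents", []):
    print(obj["Key"])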