How to Deny Access to External Buckets from Private Instances Using Lambda and a VPC S3 Endpoint

Herve Khg
3 min read · Feb 19, 2020


Last week a client asked me for a solution so that instances in private subnets could access the S3 buckets of his account.

I told him naively, “It’s simple: you just need to deploy a VPC S3 Endpoint and attach it to the default route table of the private subnets”. A few days ago he came back to me about the solution I had suggested and said, “Your solution with the VPC Endpoint was good, and it considerably improved download and upload transfers from and to our buckets. Nevertheless, it is still possible to reach external buckets from the instances, and for security reasons we cannot allow that”.

I answered, “No problem, you just need to change the default policy of the VPC S3 Endpoint and allow access only to the buckets of your account”. He said, “No, we cannot do that, because in the future we will create and destroy a lot of buckets whose names we cannot know in advance, and it is not possible to have a generic pattern for the bucket naming”. I told him, “OK, give me a few hours, I will check it out and come back with a solution that suits your needs”.

After hours of documentation and testing, I finally found that the AWS VPC S3 Endpoint policy does not offer a way to deny or allow access based on the account ID. The only way to deny access to external buckets is to explicitly allow or deny buckets in the Endpoint policy. Therefore I needed to implement a custom solution that automatically updates the Endpoint policy to allow each new bucket as soon as it is created. My custom solution is based on CloudWatch Event rules and a Lambda function.
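To wire the trigger, the CloudWatch Event rule only has to match the CreateBucket API call recorded by CloudTrail. A minimal sketch of the event pattern, assuming a CloudTrail trail that captures S3 management events exists in the account:

```json
{
  "source": ["aws.s3"],
  "detail-type": ["AWS API Call via CloudTrail"],
  "detail": {
    "eventSource": ["s3.amazonaws.com"],
    "eventName": ["CreateBucket"]
  }
}
```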

In this post, I focus on the Python code of the Lambda and the policy of the role used by the Lambda. For the implementation of the underlying infrastructure in Terraform, you can read this post (only the Lambda code changes).

Step 1: Python Lambda code

The Lambda function is triggered by a CloudWatch Event rule when a new bucket is created. The function performs the following actions:

  • Get the list of all bucket names in the account
  • Get all the route table ids
  • Delete the current S3 Endpoint (if it exists)
  • Create an S3 Endpoint for each route table and generate a policy that contains all the buckets of the account
  • Put the tag “Name” on the Endpoint
Lambda function that adds new buckets to the VPC S3 Endpoint policy
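A minimal sketch of such a function, assuming the VPC ID and the endpoint name are passed in through the VPC_ID and ENDPOINT_NAME environment variables (names chosen here for illustration), and that the function runs in the same region as the VPC:

```python
import json
import os

import boto3

ec2 = boto3.client("ec2")
s3 = boto3.client("s3")

VPC_ID = os.environ["VPC_ID"]
ENDPOINT_NAME = os.environ.get("ENDPOINT_NAME", "s3-endpoint-managed-by-lambda")
SERVICE_NAME = f"com.amazonaws.{os.environ['AWS_REGION']}.s3"


def lambda_handler(event, context):
    # 1. Get the list of all bucket names in the account
    buckets = [b["Name"] for b in s3.list_buckets()["Buckets"]]

    # 2. Get all the route table ids of the VPC
    route_tables = ec2.describe_route_tables(
        Filters=[{"Name": "vpc-id", "Values": [VPC_ID]}]
    )["RouteTables"]
    route_table_ids = [rt["RouteTableId"] for rt in route_tables]

    # 3. Delete the current S3 Endpoint (if it exists)
    existing = ec2.describe_vpc_endpoints(
        Filters=[
            {"Name": "vpc-id", "Values": [VPC_ID]},
            {"Name": "service-name", "Values": [SERVICE_NAME]},
        ]
    )["VpcEndpoints"]
    if existing:
        ec2.delete_vpc_endpoints(
            VpcEndpointIds=[e["VpcEndpointId"] for e in existing]
        )

    # 4. Re-create the endpoint, attached to every route table, with a policy
    #    that only allows the buckets of the account
    resources = []
    for name in buckets:
        resources.append(f"arn:aws:s3:::{name}")
        resources.append(f"arn:aws:s3:::{name}/*")
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:*",
                "Resource": resources,
            }
        ],
    }
    endpoint = ec2.create_vpc_endpoint(
        VpcEndpointType="Gateway",
        VpcId=VPC_ID,
        ServiceName=SERVICE_NAME,
        RouteTableIds=route_table_ids,
        PolicyDocument=json.dumps(policy),
    )["VpcEndpoint"]

    # 5. Put the tag "Name" on the Endpoint
    ec2.create_tags(
        Resources=[endpoint["VpcEndpointId"]],
        Tags=[{"Key": "Name", "Value": ENDPOINT_NAME}],
    )

    return {"endpoint_id": endpoint["VpcEndpointId"], "buckets": len(buckets)}
```

Rebuilding the endpoint rather than editing the policy in place keeps the function idempotent: every run produces an endpoint whose policy reflects exactly the buckets that exist at that moment.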

Step 2: Policy for the Lambda role

The Lambda needs a role with the policy below.

Role for Lambda Function
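A minimal sketch of such a policy, matching the API calls made by the Lambda above; resources are left as "*" for brevity, so tighten them for production:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ManageS3Endpoint",
      "Effect": "Allow",
      "Action": [
        "s3:ListAllMyBuckets",
        "ec2:DescribeRouteTables",
        "ec2:DescribeVpcEndpoints",
        "ec2:DeleteVpcEndpoints",
        "ec2:CreateVpcEndpoint",
        "ec2:CreateTags"
      ],
      "Resource": "*"
    },
    {
      "Sid": "LambdaLogs",
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "*"
    }
  ]
}
```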

Step 3: Go back to the AWS Console and check

In our case, I created a bucket named to-delete-as-soon. The Endpoint policy now contains this new bucket along with the other buckets of the account.
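Besides the console, a quick check with boto3 shows the refreshed policy. A minimal sketch, assuming the endpoint lives in eu-west-1:

```python
import boto3

# Look up the gateway endpoints for S3 in the region and print their policies
ec2 = boto3.client("ec2", region_name="eu-west-1")
endpoints = ec2.describe_vpc_endpoints(
    Filters=[{"Name": "service-name", "Values": ["com.amazonaws.eu-west-1.s3"]}]
)["VpcEndpoints"]
for endpoint in endpoints:
    print(endpoint["VpcEndpointId"])
    print(endpoint["PolicyDocument"])  # should now list the new bucket's ARN
```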

Enjoy!

Written by Herve Khg

CTO at HK-TECH. We build web and mobile applications, with extensive hands-on experience in cloud infrastructure. I published my first tech book: https://amzn.eu/d/4R3gf5j
