At our company, we frequently need to enable cross-account access in AWS. One such requirement came from our in-house Ingestion Service, which reads data from S3. Briefly, the Ingestion Service is part of our Data Management ecosystem: it ingests data from different source platforms and lands it in S3, and those source platforms include AWS S3 itself.
When our existing service, running on AWS EC2, needs access to S3 files stored in a different AWS account, keeping access/secret keys up to date is painful, since several companies require IAM user credentials to be rotated on a defined schedule.
We adopted the cross-account IAM role strategy, in which one role assumes another via temporary credentials obtained on every query.
How to set it up?
- Create an IAM Role in Account A which has access to the Account A S3 bucket, something like arn:aws:iam::<account-id>:role/clients/our-client
- Create an IAM Role in Account B with a policy that allows it to call sts:AssumeRole, and edit the trust relationship so the role can be assumed.
- Assume the roles: there are a couple of ways to switch roles and access the resources; here I will cover programmatic access to fetch S3 files.
To switch roles programmatically, we looked for the best way to let our clients access our services, and drew inspiration from the AWS documentation on Switching to an IAM Role (AWS API). That document showed us how to leverage a script to access the data in S3.
- Add this policy to your Account B IAM Role:
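The original policy embed is not reproduced here; a policy of roughly this shape grants the Account B role permission to assume the Account A role (the account ID is a placeholder; the role path follows the example above):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::<account-a-id>:role/clients/our-client"
    }
  ]
}
```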
- Edit the trust relationship of your IAM Role in Account B as follows:
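Again, the original embed is missing; a trust policy generally looks like the sketch below, where the Principal names the account (or role) that should be allowed to assume this role — the account ID here is a placeholder for your setup:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<trusted-account-id>:root"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```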
The script to the rescue
To give the gist of it, here is the script (we modified it to fetch and download files wherever needed):
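The embedded gist did not survive here, so the following is a minimal sketch of such a script, assuming boto3 and a role ARN of the form arn:aws:iam::<account-id>:role/<role-name>; the session name s3-download-session is an arbitrary label, and the flags mirror the usage shown below:

```python
import argparse
import os
import sys


def build_parser():
    # CLI flags match the usage example: -a, -r, -b, -p
    parser = argparse.ArgumentParser(
        description="Download S3 files cross-account by assuming an IAM role."
    )
    parser.add_argument("-a", "--account-id", required=True, help="AWS account ID that owns the role")
    parser.add_argument("-r", "--role-name", required=True, help="name of the role to assume")
    parser.add_argument("-b", "--bucket", required=True, help="S3 bucket to download from")
    parser.add_argument("-p", "--prefix", required=True, help="key prefix to download")
    return parser


def download_via_assumed_role(account_id, role_name, bucket, prefix, dest="."):
    import boto3  # imported lazily so the module loads even without boto3 installed

    # Ask STS for temporary credentials for the cross-account role.
    creds = boto3.client("sts").assume_role(
        RoleArn=f"arn:aws:iam::{account_id}:role/{role_name}",
        RoleSessionName="s3-download-session",
    )["Credentials"]

    # Build an S3 client from the temporary credentials.
    s3 = boto3.client(
        "s3",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )

    # Page through every object under the prefix and download each one.
    for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            local_path = os.path.join(dest, key)
            if os.path.dirname(local_path):
                os.makedirs(os.path.dirname(local_path), exist_ok=True)
            s3.download_file(bucket, key, local_path)
            print(f"downloaded s3://{bucket}/{key} -> {local_path}")


if __name__ == "__main__" and len(sys.argv) > 1:
    args = build_parser().parse_args()
    download_via_assumed_role(args.account_id, args.role_name, args.bucket, args.prefix)
```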
How to Execute:
```shell
python download_from_s3_via_assume_role.py -a <aws_account_id> -r <aws_role_name> -b <bucket-name> -p <prefix-path>
python download_from_s3_via_assume_role.py -a 111101111111 -r client -b client-bucket -p "some-prefix/which-has-files/"
```
Thank you for reading :)
Example using the AWS CLI
- First, get the temporary credentials:

```shell
aws sts assume-role --role-arn arn:aws:iam::716444400266:role/MiQReadWriteS3Bucket --role-session-name s3-access-example
```
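The assume-role call returns the temporary credentials in a Credentials block, roughly as below (the values here are placeholders):

```json
{
  "Credentials": {
    "AccessKeyId": "ASIA...",
    "SecretAccessKey": "...",
    "SessionToken": "...",
    "Expiration": "2021-01-01T12:00:00Z"
  },
  "AssumedRoleUser": {
    "AssumedRoleId": "AROA...:s3-access-example",
    "Arn": "arn:aws:sts::716444400266:assumed-role/MiQReadWriteS3Bucket/s3-access-example"
  }
}
```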
- Use the temporary AccessKeyId, SecretAccessKey and SessionToken:

```shell
AWS_ACCESS_KEY_ID=<AccessKeyID> AWS_SECRET_ACCESS_KEY=<SecretKey> AWS_SESSION_TOKEN=<SessionToken> aws s3 cp s3://client-bucket/some-prefix/which-has-files/ . --recursive
```