Cross Account Scripting in an AWS Multi-Account Strategy

John Byrd
3 min read · Apr 15, 2019

Previously, in Enable VPC Flow Logs Across All Regions in All Accounts, I described accomplishing a specific task using a version of the method and script described here, but I didn’t go into detail on configuring the underlying script.

Since that article, I’ve updated the script with assistance from colleagues and would like to provide a better explanation of the newest revision.

The Architecture

As an AWS footprint grows, management across multiple AWS accounts requires more consideration and more tooling. A typical implementation of AWS Organizations is all that is necessary to execute commands at this scale with little effort.

The first consideration is access. If new accounts are being deployed using the Organizations CLI (which they should be), each account is created with a role that has a uniform name (OrganizationAccountAccessRole by default) and trusts the management account, which can serve as a break-glass role or as the entity used to execute these commands. Another option is to deploy roles to designated accounts with trust relationships that allow a centralized account to assume them and run commands remotely against that subset of accounts.
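If you go the second route, the trust policy on each target-account role could look something like this sketch (the management account ID and role name are placeholders, not values from this article):

# Sketch: create a role in a target account that the management account can assume
cat > trust-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::<ManagementAccountId>:root" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF
aws iam create-role --role-name <RoleName> --assume-role-policy-document file://trust-policy.json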

Assuming this exists in some capacity, the next requirement is that the centralized management account have a Linux EC2 instance (since this version of the script is written in bash) with an attached IAM role. This IAM role only needs sts:AssumeRole, but if you want to add a condition or scope the resource to the target role(s) to enforce least privilege, you may get called a try-hard, but at least you’re doing security right.
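As a sketch, scoping that permission down could look like the following (the policy name and instance role name are placeholders, not from the original article):

# Sketch: attach a least-privilege inline policy to the EC2 instance role
cat > assume-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::*:role/<RoleName>"
    }
  ]
}
EOF
aws iam put-role-policy --role-name <InstanceRoleName> --policy-name AllowAssumeTargetRole --policy-document file://assume-policy.json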

Additionally, the EC2 instance needs jq installed. jq allows dissection and manipulation of JSON responses and is very useful well beyond this use case.
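On Amazon Linux, for example, jq can be installed from the package manager:

# Install jq (Amazon Linux; other distributions use their own package manager)
sudo yum install -y jq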

The Script

The script can be found below for your convenience, but can also be located at this link for easier use.

#!/bin/bash
# Update with list for accounts to be targeted
file="/home/ec2-user/list-all.txt"
YELLOW='\033[1;33m'
NC='\033[0m'
while IFS= read -r acctid
do
  # Clear all affected env variables from any previous iteration
  unset AWS_ACCESS_KEY_ID
  unset AWS_SECRET_ACCESS_KEY
  unset AWS_SESSION_TOKEN

  # Assume the role in the target account and capture the JSON response in the idp variable
  idp=$(aws sts assume-role --role-arn "arn:aws:iam::${acctid}:role/<RoleName>" --role-session-name IDP)

  # Set env variables from the assume-role credentials
  export AWS_ACCESS_KEY_ID=$(echo "$idp" | jq -r .Credentials.AccessKeyId)
  export AWS_SECRET_ACCESS_KEY=$(echo "$idp" | jq -r .Credentials.SecretAccessKey)
  export AWS_SESSION_TOKEN=$(echo "$idp" | jq -r .Credentials.SessionToken)

  # Populate per-account commands below this line
  AcctNumber=$(aws sts get-caller-identity | jq -r .Account)
  echo -e "Attempting commands against ${YELLOW}${AcctNumber}${NC}"
  aws s3 ls
done < "$file"

The file variable currently points to the home directory of the default user (ec2-user) on an Amazon Linux EC2 instance. Within that location, there should be a text file with the name given in this variable, containing the list of target AWS accounts. The list should have each target account ID on its own line with no delimiting characters such as commas.
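For example, a list-all.txt with placeholder account IDs would look like this:

111111111111
222222222222
333333333333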

This version includes a splash of color with some yellow account numbers for aesthetic consideration.

The environment variables used to authenticate to AWS are cleared for later repopulation.

Identify the idp variable in the script and replace <RoleName> with the name of the role described in the first section of this article.

The next few lines extract the credentials from the AssumeRole response and export them as environment variables. Clearing these variables at the start of each iteration ensures commands are never accidentally executed against a previously assumed role.
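For context, echo "$idp" | jq . prints output shaped roughly like the following (the values here are placeholders), which is why the filters reference .Credentials.*:

{
  "Credentials": {
    "AccessKeyId": "ASIA...",
    "SecretAccessKey": "...",
    "SessionToken": "...",
    "Expiration": "2019-04-15T12:00:00Z"
  },
  "AssumedRoleUser": {
    "AssumedRoleId": "AROA...:IDP",
    "Arn": "arn:aws:sts::111111111111:assumed-role/<RoleName>/IDP"
  }
}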

The next-to-last line, which currently executes aws s3 ls, can be replaced with whatever command needs to be executed against each account listed in the text file.
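For instance, a hypothetical substitution (not part of the original script) that lists the enabled regions in each account:

# Example substitution: list the regions enabled in each account
aws ec2 describe-regions --query "Regions[].RegionName" --output text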

Use Case

I reached for this script this week after needing to clean up some resources that were stuck as the result of an inoperable stackset. After deploying some stacks using the default AWS StackSet execution role, we decided to move to a nonstandard naming convention.

In a future article I’ll go into more detail on how I used a JMESPath query in the AWS CLI with the above script to delete stacks that started with a certain value and had been sequestered to solitude by a stackset snafu.
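As a rough preview, and with a hypothetical stack name prefix standing in for the real one, the per-account portion looked something like this sketch:

# Hypothetical sketch: delete stacks whose names start with a given prefix
for stack in $(aws cloudformation describe-stacks \
  --query "Stacks[?starts_with(StackName, 'StackSet-')].StackName" --output text)
do
  aws cloudformation delete-stack --stack-name "$stack"
done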


Written by John Byrd

Modernizing companies’ AWS security and governance programs at scale.
