I built this tool in 2019 for a pentest involving Azure, as no other enumeration tools supported it at the time. It grew from there, and I learned a lot while adding features.

Building tools is fun, but maintaining tools is hard. I haven’t actively used this tool myself in a while, but I’ve done my best to fix bugs and review pull requests.

Moving forward, it makes sense to consolidate this functionality into a well-maintained project that handles the essentials (web/DNS requests, threading, I/O, logging, etc.). Nuclei is really well suited for this. You can see my first PR to migrate cloud_enum functionality to Nuclei here.

I encourage others to contribute templates to Nuclei, allowing us to focus on detecting cloud resources while leaving the groundwork to Nuclei.

I’ll still try to review PRs here to address bugs as time permits, but likely won’t have time for major changes.

Thanks to all the great contributors. Good luck with your recon!

Overview

Multi-cloud OSINT tool. Enumerate public resources in AWS, Azure, and Google Cloud.

Currently enumerates the following:

Amazon Web Services:

  • Open / Protected S3 Buckets
  • awsapps (WorkMail, WorkDocs, Connect, etc.)

Microsoft Azure:

  • Storage Accounts
  • Open Blob Storage Containers
  • Hosted Databases
  • Virtual Machines
  • Web Apps

Google Cloud Platform:

  • Open / Protected GCP Buckets
  • Open / Protected Firebase Realtime Databases
  • Google App Engine sites
  • Cloud Functions (enumerates projects/regions with existing functions, then brute-forces actual function names)
  • Open Firebase Apps

See it in action in Codingo's video demo here.

Usage

Setup

Several non-standard libraries are required to support threaded HTTP requests and DNS lookups. You’ll need to install the requirements as follows:

pip3 install -r ./requirements.txt
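
If you prefer to keep these dependencies isolated from your system Python, installing into a virtual environment works just as well (a minimal sketch; the venv directory name is arbitrary):

python3 -m venv venv
source venv/bin/activate
pip3 install -r ./requirements.txt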

Running

The only required argument is at least one keyword. You can use the built-in fuzzing strings, but you will get better results if you supply your own with -m and/or -b.

You can provide multiple keywords by specifying the -k argument multiple times.

Keywords are mutated automatically using strings from enum_tools/fuzz.txt or a file you provide with the -m flag.

Services that require a second level of brute-forcing (Azure Containers and GCP Functions) will also use fuzz.txt by default or a file you provide with the -b flag.

Let’s say you were researching “somecompany”, whose website is “somecompany.io” and which makes a product called “blockchaindoohickey”. You could run the tool like this:

./cloud_enum.py -k somecompany -k somecompany.io -k blockchaindoohickey
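
If you have built your own wordlists, you could extend that same run with custom mutation and brute-force files via the -m and -b flags described above (mutations.txt and brute.txt here are hypothetical file names):

./cloud_enum.py -k somecompany -k somecompany.io -k blockchaindoohickey -m mutations.txt -b brute.txt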

