HTTPLoot : An Automated Tool Which Can Simultaneously Crawl, Fill Forms, Trigger Error/Debug Pages

HTTPLoot is an automated tool which can simultaneously crawl, fill forms, trigger error/debug pages, and “loot” secrets out of the client-facing code of websites.

Usage

To use the tool, you can grab any one of the pre-built binaries from the Releases section of the repository. If you want to build from source, you will need Go > 1.16; simply running go build will output a usable binary.

Additionally, you will need two JSON files (lootdb.json and regexes.json) along with the binary, both of which you can get from the repository itself. Once you have all three files in the same folder, you can go ahead and fire up the tool.
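As an illustrative example, building from source and running a first scan might look like this (the repository URL and the targets.txt file name are assumptions made for the sake of the example):

$ git clone https://github.com/redhuntlabs/HTTPLoot && cd HTTPLoot
$ go build -o httploot
$ ./httploot -input-file targets.txt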

Here is the help output of the tool:

$ ./httploot --help
      _____
       )=(
      /   \     H T T P L O O T
     (  $  )                  v0.1
      \___/

[+] HTTPLoot by RedHunt Labs - A Modern Attack Surface (ASM) Management Company
[+] Author: Pinaki Mondal (RHL Research Team)
[+] Continuously Track Your Attack Surface using https://redhuntlabs.com/nvadr.

Usage of ./httploot:
  -concurrency int
        Maximum number of sites to process concurrently (default 100)
  -depth int
        Maximum depth limit to traverse while crawling (default 3)
  -form-length int
        Length of the string to be randomly generated for filling form fields (default 5)
  -form-string string
        Value with which the tool will auto-fill forms, strings will be randomly generated if no value is supplied
  -input-file string
        Path of the input file containing domains to process
  -output-file string
        CSV output file path to write the results to (default "httploot-results.csv")
  -parallelism int
        Number of URLs per site to crawl parallely (default 15)
  -submit-forms
        Whether to auto-submit forms to trigger debug pages
  -timeout int
        The default timeout for HTTP requests (default 10)
  -user-agent string
        User agent to use during HTTP requests (default "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:98.0) Gecko/20100101 Firefox/98.0")
  -verify-ssl
        Verify SSL certificates while making HTTP requests
  -wildcard-crawl
        Allow crawling of links outside of the domain being scanned

Concurrent Scanning

There are two flags which help with the concurrent scanning:

  • -concurrency: Specifies the maximum number of sites to process concurrently.
  • -parallelism: Specifies the number of links per site to crawl in parallel.

Both -concurrency and -parallelism are crucial to the performance and reliability of the tool's results.
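For example, a broad scan of a large target list might raise both limits (the values and the targets.txt file name are illustrative, not recommendations):

$ ./httploot -input-file targets.txt -concurrency 200 -parallelism 25

Higher values speed up large scans, but they also increase load on the scanning host and on the targets, which can affect the reliability of the results.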

Crawling

The crawl depth can be specified using the -depth flag. The integer value supplied is the maximum depth of the chain of links the crawler will follow from a site.

An important flag, -wildcard-crawl, can be used to specify whether to crawl URLs outside the domain in scope.

NOTE: Using this flag might lead to infinite crawling in worst-case scenarios if the crawler continuously finds links to other domains.
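As an example, the following restricts crawling to a depth of 5 while allowing out-of-scope links to be followed (the values and file name are illustrative):

$ ./httploot -input-file targets.txt -depth 5 -wildcard-crawl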

Filling Forms

If you want the tool to scan for debug pages, you need to specify the -submit-forms argument. This directs the tool to auto-submit forms and try to trigger error/debug pages once a tech stack has been successfully identified.

If the -submit-forms flag is enabled, you can control the string submitted in the form fields. The -form-string flag specifies a fixed value to submit, while -form-length controls the length of the randomly generated string used when no fixed value is supplied.
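For instance, either of the following hypothetical invocations enables form submission, one with a fixed string and one with randomly generated 8-character values:

$ ./httploot -input-file targets.txt -submit-forms -form-string "httploottest"
$ ./httploot -input-file targets.txt -submit-forms -form-length 8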

Network Tuning

The following flags help with network tuning:

  • -timeout – specifies the timeout for HTTP requests.
  • -user-agent – specifies the User-Agent string to use in HTTP requests.
  • -verify-ssl – specifies whether or not to verify SSL certificates.
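For example, a scan against slow hosts with certificate verification enabled and a custom user agent might look like this (all values are illustrative):

$ ./httploot -input-file targets.txt -timeout 30 -verify-ssl -user-agent "Mozilla/5.0 (X11; Linux x86_64)"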

Input/Output

The input file to read can be specified using the -input-file argument; it should contain the list of URLs to scan. The -output-file flag can be used to specify the path of the results file, which by default is httploot-results.csv.
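A minimal end-to-end example, assuming a domains.txt file with one target per line and a custom output path:

$ ./httploot -input-file domains.txt -output-file acme-results.csv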

Further Details

Further details about the research which led to the development of the tool can be found on the RedHunt Labs blog.

License & Version

The tool is licensed under the MIT license. See LICENSE.

Currently the tool is at v0.1.

Credits

The RedHunt Labs Research Team would like to credit the creators & maintainers of shhgit for the regular expressions provided in their repository.

To know more about our Attack Surface Management platform, check out NVADR.
