Cariddi: Take a List of Domains, Crawl URLs and Scan for Endpoints, Secrets, API Keys, File Extensions, Tokens and More

Cariddi is a tool that takes a list of domains, crawls URLs, and scans for endpoints, secrets, API keys, file extensions, tokens, and more.

Installation

You need Go.

Go is an open source programming language that makes it easy to build simple, reliable, and efficient software.

  • Linux
    • git clone https://github.com/edoardottt/cariddi.git
    • cd cariddi
    • go get
    • make linux (to install)
    • make unlinux (to uninstall)

    Or in one line: git clone https://github.com/edoardottt/cariddi.git; cd cariddi; go get; make linux
  • Windows (the executable works only in the cariddi folder)
    • git clone https://github.com/edoardottt/cariddi.git
    • cd cariddi
    • go get
    • .\make.bat windows (to install)
    • .\make.bat unwindows (to uninstall)

Get Started

cariddi -h prints the help on the command line.

Usage of cariddi:
  -c int
        Concurrency level. (default 20)
  -cache
        Use the .cariddi_cache folder as cache.
  -d int
        Delay between a page crawled and another.
  -e    Hunt for juicy endpoints.
  -ef string
        Use an external file (txt, one per line) to use custom parameters for endpoints hunting.
  -examples
        Print the examples.
  -ext int
        Hunt for juicy file extensions. Integer from 1(juicy) to 7(not juicy).
  -h    Print the help.
  -i string
        Ignore the URL containing at least one of the elements of this array.
  -it string
        Ignore the URL containing at least one of the lines of this file.
  -oh string
        Write the output into an HTML file.
  -ot string
        Write the output into a TXT file.
  -plain
        Print only the results.
  -s    Hunt for secrets.
  -sf string
        Use an external file (txt, one per line) to use custom regexes for secrets hunting.
  -t int
        Set timeout for the requests. (default 10)
  -version
        Print the version.
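The -ef, -sf, and -it options all take plain text files with one entry per line, as described in the help output above. A minimal sketch of preparing such files (the file names and entries here are hypothetical examples):

```shell
# Hypothetical input files; per the help output, each is a plain
# txt file with one entry per line.
printf 'admin\ndebug\nbackup\n' > endpoints_file    # custom parameters for -ef
printf 'forum\nblog\ncommunity\n' > ignore_file     # URL substrings to skip for -it

# Sketch of a run using both files (assumes cariddi is on PATH):
#   cat urls | cariddi -e -ef endpoints_file -it ignore_file
```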

Examples

  • cariddi -version (Print the version)
  • cariddi -h (Print the help)
  • cariddi -examples (Print the examples)
  • cat urls | cariddi -s (Hunt for secrets)
  • cat urls | cariddi -d 2 (2 seconds between a page crawled and another)
  • cat urls | cariddi -c 200 (Set the concurrency level to 200)
  • cat urls | cariddi -e (Hunt for juicy endpoints)
  • cat urls | cariddi -plain (Print only useful things)
  • cat urls | cariddi -ot target_name (Results in txt file)
  • cat urls | cariddi -oh target_name (Results in html file)
  • cat urls | cariddi -ext 2 (Hunt for juicy (level 2 of 7) files)
  • cat urls | cariddi -e -ef endpoints_file (Hunt for custom endpoints)
  • cat urls | cariddi -s -sf secrets_file (Hunt for custom secrets)
  • cat urls | cariddi -i forum,blog,community,open (Ignore URLs containing these words)
  • cat urls | cariddi -it ignore_file (Ignore URLs containing at least one line of the input file)
  • cat urls | cariddi -cache (Use the .cariddi_cache folder as cache)
  • cat urls | cariddi -t 5 (Set the timeout for the requests)
  • For Windows use powershell.exe -Command "cat urls | .\cariddi.exe"
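The single-flag examples above can be combined in one run. The sketch below uses hypothetical targets and output names, and assumes cariddi is installed and on PATH:

```shell
# Build a URL list (hypothetical targets, one URL per line).
printf 'https://example.com\nhttps://example.org\n' > urls

# Sketch: crawl the list, hunt for secrets and endpoints, skip noisy
# sections, raise concurrency, and save txt + html reports:
#   cat urls | cariddi -s -e -i forum,blog -c 50 -t 5 -ot target_name -oh target_name

# Show the list just written.
cat urls
```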
