
Uscrapper 2.0 – Unleashing The Power Of OSINT Web Scraping For Data Extraction


Introducing Uscrapper 2.0, a powerful OSINT web scraper that allows users to extract various kinds of personal information from a website.

It leverages web scraping techniques and regular expressions to extract email addresses, social media links, author names, geolocations, phone numbers, and usernames from both hyperlinked and non-hyperlinked sources on a webpage, and it supports multithreading to make this process faster. Uscrapper 2.0 is also equipped with advanced modules for bypassing anti-web-scraping measures and supports web crawling to scrape various sublinks within the same domain.
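
To illustrate the general technique (a minimal sketch, not Uscrapper's actual code; the URL, patterns, and function names are assumptions for illustration), the snippet below fetches a page and pulls email addresses and phone numbers out of the raw HTML with regular expressions:

import re
import requests

# Illustrative patterns; real-world extraction needs broader, more careful regexes.
EMAIL_RE = re.compile(r"[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def extract_details(url):
    # A browser-like User-Agent helps avoid being rejected as an obvious bot.
    headers = {"User-Agent": "Mozilla/5.0 (X11; Linux x86_64)"}
    html = requests.get(url, headers=headers, timeout=10).text
    emails = set(EMAIL_RE.findall(html))
    phones = set(PHONE_RE.findall(html))
    return emails, phones

if __name__ == "__main__":
    emails, phones = extract_details("https://example.com")  # placeholder target
    print("Emails:", emails)
    print("Phones:", phones)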

The tool also provides an option to generate a report containing the extracted details.

Extracted Details:

Uscrapper extracts the following details from the provided website:

  • Email Addresses: Displays email addresses found on the website.
  • Social Media Links: Displays links to various social media platforms found on the website.
  • Author Names: Displays the names of authors associated with the website.
  • Geolocations: Displays geolocation information associated with the website.
  • Non-Hyperlinked Details: Displays non-hyperlinked details found on the website, including email addresses, phone numbers, and usernames.
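
As a hedged illustration of how hyperlinked details such as social media links might be gathered (the platform list and regex are assumptions, not Uscrapper's internals):

import re
import requests

# Match links to a few common social platforms (illustrative list only).
SOCIAL_RE = re.compile(
    r'href="(https?://(?:www\.)?(?:twitter|facebook|linkedin|instagram|github)\.com[^"]*)"'
)

def social_links(url):
    html = requests.get(url, timeout=10).text
    return set(SOCIAL_RE.findall(html))

print(social_links("https://example.com"))  # placeholder target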

What's New?

Uscrapper 2.0:

  • Introduced multiple modules to bypass anti-web-scraping techniques.
  • Introduced Crawl and Scrape: an advanced module that crawls a website and scrapes its internal pages from within (see the sketch after this list).
  • Implemented multithreading to make these processes faster.
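
The following is a minimal sketch of how multithreaded crawl-and-scrape can be put together, not Uscrapper's actual implementation; the start URL, link limit, and scrape placeholder are assumptions for illustration:

import re
from concurrent.futures import ThreadPoolExecutor
from urllib.parse import urljoin, urlparse

import requests

LINK_RE = re.compile(r'href="([^"]+)"')

def same_domain_links(url, limit=10):
    # Collect up to `limit` hyperlinks that stay within the start URL's domain.
    html = requests.get(url, timeout=10).text
    domain = urlparse(url).netloc
    links = set()
    for href in LINK_RE.findall(html):
        full = urljoin(url, href)
        if urlparse(full).netloc == domain:
            links.add(full)
        if len(links) >= limit:
            break
    return links

def scrape(url):
    # Placeholder for per-page extraction (emails, phones, usernames, ...).
    return url, len(requests.get(url, timeout=10).text)

if __name__ == "__main__":
    start = "https://example.com"  # placeholder target
    with ThreadPoolExecutor(max_workers=4) as pool:
        for url, size in pool.map(scrape, same_domain_links(start)):
            print(url, size)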

Installation Steps:

git clone https://github.com/z0m31en7/Uscrapper.git
cd Uscrapper/install/ 
chmod +x ./install.sh && ./install.sh      #For Unix/Linux systems

Usage:

To run Uscrapper, use the following command-line syntax:

python Uscrapper-v2.0.py [-h] [-u URL] [-c INT] [-t INT] [-O] [-ns]

Arguments:

  • -h, --help: Show the help message and exit.
  • -u URL, --url URL: Specify the URL of the website to extract details from.
  • -c INT, --crawl INT: Specify the number of links to crawl.
  • -t INT, --threads INT: Specify the number of threads to use while crawling and scraping.
  • -O, --generate-report: Generate a report file containing the extracted details.
  • -ns, --nonstrict: Display non-strict usernames during extraction.
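
For example, the following command crawls 10 links with 4 threads, includes non-strict usernames, and generates a report (https://example.com is a placeholder target):

python Uscrapper-v2.0.py -u https://example.com -c 10 -t 4 -O -ns
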
Tamil S

Tamil has a keen interest in cyber security, OSINT, and CTF projects. He is currently focused on researching and publishing security tools with Kali Linux Tutorials.
