CrossLinked is a LinkedIn enumeration tool that uses search engine scraping to collect valid employee names from an organization. This technique provides accurate results without API keys, credentials, or direct access to LinkedIn!
Install the last stable release from PyPi:
pip3 install crosslinked
Or, install the most recent code from GitHub:
git clone https://github.com/m8sec/crosslinked
cd crosslinked
python3 setup.py install
CrossLinked assumes the organization's account naming convention has already been identified. This is required for execution and should be added to the CMD args based on your expected output. See the Naming Format and Example Usage sections below:
{f}.{last} = j.smith
{first}.{last} = john.smith
CMP\{first}{l} = CMP\johns
{f}{last}@company.com = jsmith@company.com
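The placeholder expansion above can be sketched in a few lines of Python. `format_name` is a hypothetical helper for illustration, not CrossLinked's actual implementation:

```python
def format_name(template: str, first: str, last: str) -> str:
    """Expand a CrossLinked-style naming template for one employee.

    Full placeholders ({first}, {last}) are replaced before the
    single-letter ones ({f}, {l}) so substrings don't collide.
    """
    return (template
            .replace('{first}', first)
            .replace('{last}', last)
            .replace('{f}', first[:1])
            .replace('{l}', last[:1]))

# format_name('{f}.{last}', 'john', 'smith')            -> 'j.smith'
# format_name('{f}{last}@company.com', 'john', 'smith') -> 'jsmith@company.com'
```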
By default, CrossLinked uses the google and bing search engines to identify employees of the target organization. After execution, two files (names.txt & names.csv) will appear in the current directory, unless modified in the CMD args. See the Parse section below for more.
python3 crosslinked.py -f '{first}.{last}@domain.com' company_name
python3 crosslinked.py -f 'domain{f}{last}' -t 15 -j 2 company_name
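The underlying scraping idea can be illustrated by parsing a LinkedIn search-result title into a first/last name pair. The title format and regex below are assumptions for illustration, not CrossLinked's actual parser:

```python
import re

# LinkedIn results in search engines commonly have titles like:
#   "John Smith - Security Engineer - ExampleCorp | LinkedIn"
TITLE_RE = re.compile(r"^(?P<first>[\w'-]+)\s+(?P<last>[\w'-]+)\s*[-|]")

def parse_title(title: str):
    """Return (first, last) lowercased, or None if the title doesn't match."""
    m = TITLE_RE.match(title)
    return (m['first'].lower(), m['last'].lower()) if m else None

# parse_title('John Smith - Security Engineer - ExampleCorp | LinkedIn')
#   -> ('john', 'smith')
```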
Account naming convention changed after execution and now you're hitting CAPTCHA requests? No problem!
CrossLinked v0.2.0 now includes a names.csv output file, which stores all scraping data, including first name, last name, job title, and url. This can be ingested and parsed to reformat user accounts as needed.
python3 crosslinked.py -f '{f}{last}@domain.com' names.csv
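That re-parsing step can be sketched as follows. The column names `first` and `last` are assumptions — check the actual names.csv header — and `reformat` is a hypothetical helper, not part of CrossLinked:

```python
import csv

def reformat(csv_path: str, template: str) -> list[str]:
    """Re-derive user accounts from a names.csv-style file in a new format.

    Assumes columns named 'first' and 'last'; adjust to the real header.
    """
    users = set()
    with open(csv_path, newline='') as fh:
        for row in csv.DictReader(fh):
            first = row['first'].strip().lower()
            last = row['last'].strip().lower()
            users.add(template
                      .replace('{first}', first)
                      .replace('{last}', last)
                      .replace('{f}', first[:1])
                      .replace('{l}', last[:1]))
    return sorted(users)
```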
The latest version of CrossLinked provides proxy support to rotate source addresses. Users can supply a single proxy with --proxy 127.0.0.1:8080 or multiple via --proxy-file proxies.txt.
cat proxies.txt
127.0.0.1:8080
socks4://111.111.111.111
socks5://222.222.222.222
python3 crosslinked.py --proxy-file proxies.txt -f '{first}.{last}@company.com' -t 10 "Company"
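Round-robin rotation over such a proxy file can be sketched as below. `next_proxy` and `fetch` are illustrative helpers, not CrossLinked's internals; note that the SOCKS entries would need extra support (e.g. PySocks), which the standard-library urllib lacks:

```python
import itertools
import urllib.request

# Entries mirror the proxies.txt example above (placeholder addresses).
PROXIES = ['127.0.0.1:8080', 'socks4://111.111.111.111', 'socks5://222.222.222.222']
_rotation = itertools.cycle(PROXIES)

def next_proxy() -> str:
    """Return the next proxy, normalising bare IP:port entries to http://."""
    proxy = next(_rotation)
    return proxy if '://' in proxy else 'http://' + proxy

def fetch(url: str) -> bytes:
    """Fetch a URL through the next proxy in the rotation."""
    proxy = next_proxy()
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({'http': proxy, 'https': proxy}))
    return opener.open(url, timeout=15).read()
```

Each request goes out through a different source address, which helps spread scraping traffic and delay CAPTCHA or rate-limit responses.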
positional arguments:
company_name Target company name
optional arguments:
-h, --help show help message and exit
-t TIMEOUT Max timeout per search (Default=15)
-j JITTER Jitter between requests (Default=1)
Search arguments:
--search ENGINE Search Engine (Default='google,bing')
Output arguments:
-f NFORMAT Format names, ex: 'domain{f}{last}', '{first}.{last}@domain.com'
-o OUTFILE Change name of output file (omit_extension)
Proxy arguments:
--proxy PROXY Proxy requests (IP:Port)
--proxy-file PROXY Load proxies from file for rotation