CrossLinked is a LinkedIn enumeration tool that uses search engine scraping to collect valid employee names from an organization. This technique provides accurate results without API keys, credentials, or direct access to LinkedIn itself!
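To illustrate the idea, here is a minimal sketch of how employee names can be pulled from search-engine result titles (e.g. "Jane Doe - Security Engineer - ACME Corp | LinkedIn"). The query string, sample title, and regex are illustrative assumptions, not CrossLinked's internal code:

import re

# Result titles typically follow "<Name> - <Job Title> - <Company> | LinkedIn"
TITLE_RE = re.compile(r'^(?P<name>[^|-]+?)\s*-\s*(?P<job>[^|-]+)')

def parse_result_title(title):
    match = TITLE_RE.match(title)
    if not match:
        return None
    return match.group('name').strip(), match.group('job').strip()

# Hypothetical search-engine query a scraper might issue:
query = 'site:linkedin.com/in "ACME Corp"'
sample_title = 'Jane Doe - Security Engineer - ACME Corp | LinkedIn'
print(parse_result_title(sample_title))   # ('Jane Doe', 'Security Engineer')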
Install the latest stable release from PyPI:
pip3 install crosslinked
Or, install the most recent code from GitHub:
git clone https://github.com/m8sec/crosslinked
cd crosslinked
python3 setup.py install
CrossLinked assumes the organization's account naming convention has already been identified. This is required for execution and should be added to the CMD args based on your expected output. See the Naming Format and Example Usage sections below:
{f}.{last} = j.smith
{first}.{last} = john.smith
CMP\{first}{l} = CMP\johns
{f}{last}@company.com = jsmith@company.com
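The tokens above are simply placeholders that get swapped in for each scraped name. A minimal sketch of that substitution (an illustration, not CrossLinked's implementation):

def apply_format(fmt, first, last):
    # Expand a naming-format template for one scraped employee
    return fmt.format(first=first.lower(), last=last.lower(),
                      f=first[0].lower(), l=last[0].lower())

print(apply_format('{f}.{last}', 'John', 'Smith'))              # j.smith
print(apply_format('{f}{last}@company.com', 'John', 'Smith'))   # jsmith@company.com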
By default, CrossLinked will use the google and bing search engines to identify employees of the target organization. After execution, two files (names.txt & names.csv) will appear in the current directory, unless modified in the CMD args.
See the Parse section below for more.

python3 crosslinked.py -f '{first}.{last}@domain.com' company_name
python3 crosslinked.py -f 'domain{f}{last}' -t 15 -j 2 company_name
Account naming convention changed after execution, and now you're hitting CAPTCHA requests? No problem!
CrossLinked v0.2.0 now includes a names.csv output file, which stores all scraping data including: first name, last name, job title, and url. This can be ingested and parsed to reformat user accounts as needed.
python3 crosslinked.py -f '{f}{last}@domain.com' names.csv
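If you would rather reformat the data yourself, a short standalone sketch is below; the column names ("first", "last") and the output format are assumptions about the CSV layout rather than guaranteed headers:

import csv

# Rebuild accounts from names.csv without re-scraping (column names are assumed)
with open('names.csv', newline='') as src, open('emails.txt', 'w') as dst:
    for row in csv.DictReader(src):
        first, last = row['first'].strip().lower(), row['last'].strip().lower()
        if first and last:
            dst.write(f"{first[0]}{last}@domain.com\n")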
The latest version of CrossLinked provides proxy support to rotate source addresses. Users can input a single proxy with --proxy 127.0.0.1:8080 or use multiple via --proxy-file proxies.txt.
cat proxies.txt
127.0.0.1:8080
socks4://111.111.111.111
socks5://222.222.222.222
python3 crosslinked.py --proxy-file proxies.txt -f '{first}.{last}@company.com' -t 10 "Company"
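Conceptually, the rotation just cycles through the list on each outbound request. A minimal sketch with the requests library (not CrossLinked's own code; SOCKS entries additionally require the requests[socks] extra):

from itertools import cycle
import requests

# Load proxies once and rotate the source address on every request
with open('proxies.txt') as f:
    proxies = cycle([line.strip() for line in f if line.strip()])

def fetch(url):
    proxy = next(proxies)
    return requests.get(url, proxies={'http': proxy, 'https': proxy}, timeout=15)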
positional arguments:
  company_name        Target company name

optional arguments:
  -h, --help          show help message and exit
  -t TIMEOUT          Max timeout per search (Default=15)
  -j JITTER           Jitter between requests (Default=1)

Search arguments:
  --search ENGINE     Search Engine (Default='google,bing')

Output arguments:
  -f NFORMAT          Format names, ex: 'domain{f}{last}', '{first}.{last}@domain.com'
  -o OUTFILE          Change name of output file (omit extension)

Proxy arguments:
  --proxy PROXY       Proxy requests (IP:Port)
  --proxy-file PROXY  Load proxies from file for rotation