lk_scraper is a fully configurable LinkedIn scraper: scrape anything within LinkedIn
Installation
$ pip install git+git://github.com/jqueguiner/lk_scraper
Setup
$ docker-compose up -d
$ docker-compose run lk_scraper python3
First, you need to run a Selenium server:
$ docker run -d -p 4444:4444 --shm-size 2g selenium/standalone-firefox:3.141.59-20200326
After running this command, navigate in your browser to your IP address followed by the port number and /grid/console, e.g. http://localhost:4444/grid/console.
Retrieving Cookie
Setting Up The Cookie
You can add your LinkedIn li_at cookie to the config file located in your home directory (~/.lk_scraper/config.yml).
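A minimal sketch of that config file, assuming the cookie is stored under a li_at key (the exact schema is not shown here, so verify it against the sample config in the repository):

```yaml
# ~/.lk_scraper/config.yml -- assumed layout, check the repo's sample config
li_at: "My_super_linkedin_cookie"
```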
from lk_scraper import Scraper

li_at = "My_super_linkedin_cookie"
scraper = Scraper(li_at=li_at)
(Not implemented yet)

$ export LI_AT="My_super_linkedin_cookie"
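Since lk_scraper does not read the LI_AT environment variable automatically yet, a hypothetical workaround is to fetch it yourself and pass it to Scraper explicitly (the variable name follows the export above; assigning it in Python here is for illustration only):

```python
import os

# Illustration only: pretend the cookie was exported in the shell.
os.environ["LI_AT"] = "My_super_linkedin_cookie"

# Read it back; an empty default avoids a KeyError when the variable is unset.
li_at = os.environ.get("LI_AT", "")
print(li_at)  # → My_super_linkedin_cookie

# Then pass it explicitly, as in the snippet above:
# from lk_scraper import Scraper
# scraper = Scraper(li_at=li_at)
```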
Example
Run the Jupyter notebook linkedin-example.ipynb:
>>> from lk_scraper import Scraper
>>> scraper = Scraper()

>>> from lk_scraper import Scraper
>>> scraper = Scraper()
>>> company = scraper.get_object(object_name='company', object_id='apple')

>>> from lk_scraper import Scraper
>>> scraper = Scraper()
>>> profil = scraper.get_object(object_name='profil', object_id='jlqueguiner')