Lk_scraper is a fully configurable LinkedIn scraper: scrape anything within LinkedIn.
Installation
$ pip install git+git://github.com/jqueguiner/lk_scraper
Setup
$ docker-compose up -d
$ docker-compose run lk_scraper python3
First, you need to run a Selenium server:
$ docker run -d -p 4444:4444 --shm-size 2g selenium/standalone-firefox:3.141.59-20200326
After running this command, navigate in your browser to your IP address followed by the port number and /grid/console; locally, the URL will be http://localhost:4444/grid/console.
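If you prefer to verify this from code rather than the browser, a minimal Python sketch (assuming the default localhost:4444 mapping from the docker command above) can confirm the server answers:

# Minimal reachability check for the Selenium container started above.
# Assumes the default port mapping of localhost:4444.
import urllib.request

with urllib.request.urlopen("http://localhost:4444/grid/console") as resp:
    print(resp.status)  # 200 means the standalone Firefox server is reachable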
Retrieving Cookie
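Log in to LinkedIn in your browser and copy the value of the li_at cookie stored for linkedin.com (for example via the developer tools). If you want to grab it programmatically, a minimal sketch using the third-party browser_cookie3 package (not part of lk_scraper, and assuming you are already logged in to LinkedIn in Firefox) could look like this:

# Sketch only: read the li_at cookie from the local Firefox profile with the
# third-party browser_cookie3 package.
import browser_cookie3

cookies = browser_cookie3.firefox()  # returns a standard CookieJar
li_at = next(
    (c.value for c in cookies if c.name == "li_at" and "linkedin.com" in c.domain),
    None,
)
print(li_at)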
Setting Up The Cookie
You can add your LinkedIn li_at cookie to the config file located in your home directory (~/.lk_scraper/config.yml), or pass it directly to the Scraper:
from lk_scraper import Scraper
li_at = "My_super_linkedin_cookie"
scraper = Scraper(li_at=li_at)
Passing the cookie through an environment variable (not implemented yet):
$ export LI_AT="My_super_linkedin_cookie"
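Until that lands, a minimal workaround (assuming you exported LI_AT as above) is to read the variable yourself and hand it to the constructor shown earlier:

# Sketch: lk_scraper does not read LI_AT itself yet, so fetch the exported
# value with the standard library and pass it to Scraper explicitly.
import os
from lk_scraper import Scraper

li_at = os.environ["LI_AT"]  # raises KeyError if the variable is unset
scraper = Scraper(li_at=li_at)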
Example
Run the Jupyter notebook linkedin-example.ipynb, or use a Python shell directly:
>>> from lk_scraper import Scraper
>>> scraper = Scraper()

Get a company:

>>> company = scraper.get_object(object_name='company', object_id='apple')

Get a profile:

>>> profil = scraper.get_object(object_name='profil', object_id='jlqueguiner')
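get_object takes the object name and its public identifier (the slug that appears in the LinkedIn URL), so one Scraper instance can be reused for several objects; a small sketch with hypothetical identifiers:

# Sketch: reuse a single Scraper session for several company identifiers.
# The identifiers below are illustrative; the return type is not documented here.
from lk_scraper import Scraper

scraper = Scraper()
for company_id in ["apple", "microsoft"]:
    company = scraper.get_object(object_name='company', object_id=company_id)
    print(company_id, type(company))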