linkedin_scraper_2

prerequisites

Node.js version 14
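
If you manage Node versions with nvm (just one option; any way of installing Node 14 works):

nvm install 14
nvm use 14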

installation

Clone the repository, install dependencies with npm, and build the project:

git clone https://github.com/railflow/linkedin_scraper_2.git
cd linkedin_scraper_2/
npm i
npm run build

usage

Run the scraper with the following command:

node dist/index.js --session_id=<session id of Sales Navigator account> --sesssion_id_profiles=<session id of LinkedIn account used to scrape profiles> --search_keyword=<keyword to search for in profiles> --saved_search=<url of saved search of Sales Navigator account> --output=<name of output csv file>
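
For example (all values below are placeholders; the saved-search URL is only illustrative, substitute your own cookie values, keyword, URL, and output name):

node dist/index.js --session_id=YOUR_SALES_NAV_LI_AT --sesssion_id_profiles=YOUR_SCRAPING_ACCOUNT_LI_AT --search_keyword=testrail --saved_search="https://www.linkedin.com/sales/search/people?savedSearchId=123456" --output=results.csv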

arguments

Required arguments:

 --session_id               value of the li_at cookie of the Sales Navigator account.
                            Steps to get the value:
                            1. Log in to LinkedIn using the account of your choice.
                            2. Open Chrome developer tools.
                            3. Go to the Application tab, then in the left panel select Storage -> Cookies -> https://www.linkedin.com. In the main view, locate the row named li_at and copy the content of the Value column.
                            (A sketch of how such a cookie is typically applied in a headless browser follows this list.)

 --search_keyword           keyword you want to search for in profiles, e.g. testrail

 --sesssion_id_profiles     li_at cookie of the LinkedIn account used to visit profiles (preferably a throwaway account, since it may get banned)
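
As a rough sketch of how an li_at cookie is typically applied before scraping, here is a minimal example assuming a Puppeteer-style headless browser (this repo's actual browser library and helper names may differ; openLinkedInSession is a hypothetical helper):

import puppeteer from "puppeteer";

// Minimal sketch, not this repo's code: inject the li_at session cookie and
// open LinkedIn as the authenticated account.
async function openLinkedInSession(liAtCookie: string) {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();

  // Requests made by this page will now carry the account's session.
  await page.setCookie({
    name: "li_at",
    value: liAtCookie,
    domain: ".linkedin.com",
    path: "/",
    httpOnly: true,
    secure: true,
  });

  await page.goto("https://www.linkedin.com/feed/", { waitUntil: "networkidle2" });
  // ...scraping logic would run here...
  await browser.close();
}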

Note: you must provide either the --input or the --saved_search argument. Without at least one of them, the scraper has no profile links to visit. If both are provided, --input takes priority.

Optional arguments:

 argument         Default         Description
 --saved_search   no default      URL of a saved search from the Sales Navigator account
 --input          no default      name of a text file containing profile links; if you don't provide an input file, the scraper will collect profile links from Sales Navigator
 --output         profiles.csv    name of the file where you want to store the scraper's output, e.g. output.csv
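
As a rough sketch of how the --input / --saved_search precedence described above could be resolved (the helper name scrapeLinksFromSavedSearch is hypothetical, and the one-profile-URL-per-line input format is an assumption, not confirmed from this repo's code):

import { readFileSync } from "fs";

// Hypothetical helper; the real implementation in this repo may differ.
async function scrapeLinksFromSavedSearch(savedSearchUrl: string): Promise<string[]> {
  // ...would walk the Sales Navigator saved search and collect profile URLs...
  return [];
}

// --input takes priority over --saved_search; at least one must be given.
async function resolveProfileLinks(input?: string, savedSearch?: string): Promise<string[]> {
  if (input) {
    // Assumed input-file format: one profile URL per line.
    return readFileSync(input, "utf8")
      .split("\n")
      .map((line) => line.trim())
      .filter((line) => line.length > 0);
  }
  if (savedSearch) {
    return scrapeLinksFromSavedSearch(savedSearch);
  }
  throw new Error("Provide either --input or --saved_search");
}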

Extra note: you may sometimes encounter an error saying that a certain selector was not found. Don't worry, multiple fallback selectors are implemented: if one selector is not found, the scraper tries another one.

There may be cases I haven't encountered yet. These will not throw an error; the only indication that something went wrong will be in your data: either jobTitle will be an empty string, or the jobTitle value will be replaced with companyName. I have scraped over 200 profiles and haven't found any problems yet. If you find an issue, please let me know.

