linkedin-sourcer
npx skills add https://github.com/kylemclaren/linkedin-sourcer --skill linkedin-sourcer
LinkedIn Sourcer
Source candidates from LinkedIn, analyze their profiles, and evaluate fit against role requirements using the linkedin_scraper library (v3.0+, Playwright-based, async).
Prerequisites
Ensure dependencies are installed before any scraping:
pip install linkedin-scraper
playwright install chromium
An authenticated session file (session.json) is required. If one does not exist, create one:
Programmatic login (using credentials):
python3 scripts/create_session.py --email USER@EXAMPLE.COM --password PASS
Or via environment variables:
export LINKEDIN_EMAIL=user@example.com
export LINKEDIN_PASSWORD=mypassword
python3 scripts/create_session.py
Manual login (opens a browser window; use when programmatic login fails due to CAPTCHA/2FA):
python3 scripts/create_session.py
The session file is reusable until LinkedIn expires it. See references/linkedin_scraper_api.md for browser configuration options.
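As an illustration, a session-creation flow like the one in scripts/create_session.py can be sketched with Playwright's storage-state API. This is a hedged sketch, not the actual script: the login-page selectors (#username, #password) and the post-login URL pattern are assumptions, and the real script may handle CAPTCHA/2FA differently.

```python
import os

def create_session(email: str, password: str, path: str = "session.json") -> None:
    # Imported lazily so this sketch can be inspected without Playwright installed.
    from playwright.sync_api import sync_playwright

    with sync_playwright() as p:
        # headless=False lets you solve a CAPTCHA or 2FA prompt by hand if one appears
        browser = p.chromium.launch(headless=False)
        context = browser.new_context()
        page = context.new_page()
        page.goto("https://www.linkedin.com/login")
        page.fill("#username", email)      # selector is an assumption
        page.fill("#password", password)   # selector is an assumption
        page.click("button[type=submit]")
        page.wait_for_url("https://www.linkedin.com/feed/**")
        # storage_state persists cookies + localStorage so later runs can skip login
        context.storage_state(path=path)
        browser.close()

if __name__ == "__main__" and os.getenv("LINKEDIN_EMAIL"):
    create_session(os.environ["LINKEDIN_EMAIL"], os.environ["LINKEDIN_PASSWORD"])
```

The key idea is context.storage_state(path=...), which writes the authenticated cookies to session.json for reuse by the scraping scripts.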
Workflow Decision Tree
Determine the task type:
- "Scrape this profile / these profiles" → Profile Scraping
- "Find candidates for this role" → Candidate Search
- "Evaluate this candidate for this role" → Candidate Evaluation
- "Compare these candidates" → Candidate Comparison
1. Profile Scraping
Run scripts/scrape_profile.py to extract structured profile data:
python3 scripts/scrape_profile.py "https://linkedin.com/in/username" --session session.json
For multiple profiles:
python3 scripts/scrape_profile.py URL1 URL2 URL3 --delay 2 --output profiles.json
Output is JSON with: name, headline, location, about, experiences, educations, skills.
For inline scraping within custom code, see references/linkedin_scraper_api.md → PersonScraper.
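Downstream steps consume the JSON emitted by scrape_profile.py. As a sketch of reading that output, assuming the field names listed above (the nesting of individual experience entries is an assumption):

```python
import json

def summarize(profiles: list[dict]) -> list[str]:
    """One line per scraped profile: name, headline, experience count."""
    lines = []
    for p in profiles:
        n_roles = len(p.get("experiences", []))
        lines.append(f"{p.get('name', '?')} - {p.get('headline', '')} ({n_roles} roles)")
    return lines

# Stand-in for json.load(open("profiles.json")); data here is illustrative only.
sample = [{"name": "Jane Doe", "headline": "Staff Engineer", "location": "Berlin",
           "experiences": [{"title": "Staff Engineer"}, {"title": "Senior Engineer"}]}]
print("\n".join(summarize(sample)))  # Jane Doe - Staff Engineer (2 roles)
```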
2. Candidate Search
Generate boolean search queries the user can paste into LinkedIn or Google to find candidates. See references/sourcing_workflows.md → Boolean Search String Patterns for templates and examples. Tailor the boolean string to the specific role requirements provided.
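A boolean string can be assembled mechanically from role requirements. The grouping convention below (job titles OR'd together, skills AND'd, exclusions via NOT) is an assumption; adapt it to the patterns in references/sourcing_workflows.md:

```python
def boolean_search(titles: list[str], skills: list[str], exclude: tuple = ()) -> str:
    """Build a boolean search string: (title1 OR title2) AND skill1 AND skill2 NOT term."""
    title_clause = "(" + " OR ".join(f'"{t}"' for t in titles) + ")"
    skill_clause = " AND ".join(f'"{s}"' for s in skills)
    query = f"{title_clause} AND {skill_clause}"
    for term in exclude:
        query += f' NOT "{term}"'
    return query

print(boolean_search(["Backend Engineer", "Software Engineer"],
                     ["Go", "Kubernetes"], exclude=("recruiter",)))
# ("Backend Engineer" OR "Software Engineer") AND "Go" AND "Kubernetes" NOT "recruiter"
```

Note that Google and LinkedIn differ in which operators they honor, so the generated string may need manual adjustment per search engine.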
3. Candidate Evaluation
After scraping profile(s), evaluate fit against a job description:
- Scrape the candidate’s profile
- Apply the scorecard template from references/sourcing_workflows.md → Candidate Scorecard Template
- Rate each criterion (1-5) with notes based on the scraped data
- Assign an overall fit rating: STRONG_FIT, GOOD_FIT, PARTIAL_FIT, or WEAK_FIT
- Identify strengths, concerns, and key questions for outreach
Use the evaluation heuristics in references/sourcing_workflows.md → Evaluation Heuristics to guide ratings.
For quick single-candidate output, use the Candidate Summary Template instead.
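The rating step can be sketched as a simple mapping from criterion scores to the overall fit label. The averaging thresholds below are assumptions for illustration, not values taken from the scorecard template:

```python
def overall_fit(scores: dict[str, int]) -> str:
    """Map 1-5 criterion scores to an overall fit label (thresholds are illustrative)."""
    avg = sum(scores.values()) / len(scores)
    if avg >= 4.5:
        return "STRONG_FIT"
    if avg >= 3.5:
        return "GOOD_FIT"
    if avg >= 2.5:
        return "PARTIAL_FIT"
    return "WEAK_FIT"

print(overall_fit({"domain": 5, "seniority": 4, "skills": 4}))  # GOOD_FIT
```

In practice the heuristics in references/sourcing_workflows.md may weight criteria unevenly (e.g. a hard requirement scoring 1 could cap the rating at WEAK_FIT regardless of the average).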
4. Candidate Comparison
When evaluating multiple candidates for the same role:
- Scrape all candidate profiles
- Apply the comparison table from references/sourcing_workflows.md → Candidate Comparison Table
- Rank candidates with rationale
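The ranking step above can be sketched as sorting candidates by average scorecard score (a simplifying assumption; the comparison table may rank on other dimensions):

```python
def rank(candidates: dict[str, dict[str, int]]) -> list[tuple[str, float]]:
    """Rank candidate names by average criterion score, highest first."""
    averages = {name: sum(s.values()) / len(s) for name, s in candidates.items()}
    return sorted(averages.items(), key=lambda kv: kv[1], reverse=True)

ranked = rank({"alice": {"skills": 5, "domain": 4},
               "bob": {"skills": 3, "domain": 4}})
print(ranked)  # [('alice', 4.5), ('bob', 3.5)]
```

The rationale column of the comparison table should still be written per candidate; the numeric ranking only orders the discussion.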
Error Handling
- AuthenticationError → Session expired. Re-run scripts/create_session.py with credentials or manual login
- RateLimitError → Wait and retry. Increase --delay between requests
- ProfileNotFoundError → Profile is private or URL is invalid
See references/linkedin_scraper_api.md → Error Handling for try/except patterns.
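The "wait and retry" advice for RateLimitError can be sketched as a backoff wrapper. The exception class below is a local placeholder standing in for the library's real class (import the actual one per references/linkedin_scraper_api.md):

```python
import time

class RateLimitError(Exception):
    """Placeholder for the library's rate-limit exception."""

def with_retry(fn, retries: int = 3, base_delay: float = 2.0):
    """Call fn(), retrying on RateLimitError with exponential backoff (2s, 4s, 8s...)."""
    for attempt in range(retries):
        try:
            return fn()
        except RateLimitError:
            if attempt == retries - 1:
                raise  # out of retries: surface the error
            time.sleep(base_delay * 2 ** attempt)

# Demo: a call that rate-limits once, then succeeds on the retry.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 2:
        raise RateLimitError()
    return "ok"

print(with_retry(flaky, base_delay=0.01))  # ok
```

AuthenticationError and ProfileNotFoundError should not be retried this way: the first needs a fresh session, and the second will never succeed.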
Rate Limiting
Always use delays between requests (default 2s in scripts). For large batches, increase to 3-5s. Never scrape aggressively; respect LinkedIn's rate limits.