Commit d2cf79d (parent fae3b64)

fix:rename the linkedin_scraper to linkedin_spider for imports

39 files changed: +282 additions, −159 deletions
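A rename like this touches 39 files, and a single surviving `linkedin_scraper` reference would break imports at runtime. One way to double-check the tree after such a rename is a small scanner for the old name. The helper below is a hypothetical sketch, not part of this repo; it is demonstrated on a throwaway directory:

```python
import pathlib
import tempfile

def find_stale_references(root, old_name="linkedin_scraper"):
    """Return (filename, line_no, line) for lines still mentioning the old package name."""
    hits = []
    for path in sorted(pathlib.Path(root).rglob("*")):
        if not path.is_file() or path.suffix not in {".py", ".toml", ".md"}:
            continue
        for no, line in enumerate(path.read_text().splitlines(), start=1):
            if old_name in line:
                hits.append((path.name, no, line.strip()))
    return hits

# Demo on a temporary tree: one file still uses the old import.
with tempfile.TemporaryDirectory() as d:
    (pathlib.Path(d) / "old.py").write_text("from linkedin_scraper import LinkedinSpider\n")
    (pathlib.Path(d) / "new.py").write_text("from linkedin_spider import LinkedinSpider\n")
    print(find_stale_references(d))
    # → [('old.py', 1, 'from linkedin_scraper import LinkedinSpider')]
```

Note that `linkedin_spider` does not contain the substring `linkedin_scraper`, so the already-renamed file correctly produces no hit.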

.gitignore

Lines changed: 2 additions & 2 deletions

```diff
@@ -17,9 +17,9 @@ __pycache__/
 *.py[cod]
 *$py.class

-.linkedin_scraper_profiles/*
+.linkedin_spider_profiles/*
 demo/*
-examples/.linkedin_scraper_profiles/*
+examples/.linkedin_spider_profiles/*
 # C extensions
 *.so

```

README.md

Lines changed: 44 additions & 7 deletions

````diff
@@ -17,9 +17,27 @@ Effortless Linkedin scraping with zero detection. Extract, export, and automate

 ### Installation

+Choose your preferred installation method:
+
+#### Option 1: pip (Recommended for general use)
+```bash
+# For Python library only
+pip install linkedin-spider
+
+# For CLI usage
+pip install linkedin-spider[cli]
+
+# For MCP server usage
+pip install linkedin-spider[mcp]
+
+# For all features (CLI + MCP + library)
+pip install linkedin-spider[all]
+```
+
+#### Option 2: Development setup with uv
 ```bash
 # Clone the repo
-github.com/vertexcover-io/linkedin-spider
+git clone https://github.com/vertexcover-io/linkedin-spider
 cd linkedin-spider
 # Install with uv
 uv sync
@@ -35,7 +53,7 @@ uv sync
 Perfect for integration into your existing Python applications:

 ```python
-from linkedin_scraper import LinkedinSpider, ScraperConfig
+from linkedin_spider import LinkedinSpider, ScraperConfig

 config = ScraperConfig(headless=True, page_load_timeout=30)
 ```
@@ -138,6 +156,12 @@ For more examples : [examples](./examples)
 Great for quick data extraction and scripting:

 ```bash
+# If installed via pip
+linkedin-spider-cli search -q "product manager" -n 10 -o results.json
+linkedin-spider-cli profile -u "https://linkedin.com/in/johndoe" -o profile.json
+linkedin-spider-cli company -u "https://linkedin.com/company/openai" -o company.json
+
+# If using development setup
 uv run linkedin-spider-cli search -q "product manager" -n 10 -o results.json
 uv run linkedin-spider-cli profile -u "https://linkedin.com/in/johndoe" -o profile.json
 uv run linkedin-spider-cli company -u "https://linkedin.com/company/openai" -o company.json
@@ -166,16 +190,29 @@ PORT=8000
 Start the MCP server:

 ```bash
+# If installed via pip
+# Show available transport options
+linkedin-spider-mcp
+
+# Start with specific transport
+linkedin-spider-mcp serve sse
+linkedin-spider-mcp serve http --host 0.0.0.0 --port 9000
+linkedin-spider-mcp serve stdio
+
+# Or use environment variables
+TRANSPORT=sse linkedin-spider-mcp serve
+
+# If using development setup
 # Show available transport options
-uv run linkedin_mcp
+uv run linkedin-spider-mcp

 # Start with specific transport
-uv run linkedin_mcp sse
-uv run linkedin_mcp http --host 0.0.0.0 --port 9000
-uv run linkedin_mcp stdio
+uv run linkedin-spider-mcp serve sse
+uv run linkedin-spider-mcp serve http --host 0.0.0.0 --port 9000
+uv run linkedin-spider-mcp serve stdio

 # Or use environment variables
-TRANSPORT=sse uv run linkedin_mcp
+TRANSPORT=sse uv run linkedin-spider-mcp serve
 ```

 #### Claude Code Integration
````

examples/basic_usage.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -2,7 +2,7 @@
 Example: Basic Usage
 """

-from linkedin_scraper import LinkedinSpider, ScraperConfig
+from linkedin_spider import LinkedinSpider, ScraperConfig


 def basic_example():
```

examples/company_scraper.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -2,7 +2,7 @@
 Example: Company Scraping
 """

-from linkedin_scraper import LinkedinSpider, ScraperConfig
+from linkedin_spider import LinkedinSpider, ScraperConfig


 def scrape_company_example():
```

examples/connections_scraper.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -4,7 +4,7 @@

 import asyncio

-from linkedin_scraper import LinkedinSpider, ScraperConfig
+from linkedin_spider import LinkedinSpider, ScraperConfig


 async def scrape_connections_example():
```

examples/conversations_scraper.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -4,7 +4,7 @@

 import asyncio

-from linkedin_scraper import LinkedinSpider, ScraperConfig
+from linkedin_spider import LinkedinSpider, ScraperConfig


 async def scrape_conversations_list_example():
```

examples/profile_scraper.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -2,7 +2,7 @@
 Example: Profile Scraping
 """

-from linkedin_scraper import LinkedinSpider, ScraperConfig
+from linkedin_spider import LinkedinSpider, ScraperConfig


 def scrape_profile_example():
```

examples/profile_search.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -2,7 +2,7 @@
 Example: Profile Search
 """

-from linkedin_scraper import LinkedinSpider, ScraperConfig
+from linkedin_spider import LinkedinSpider, ScraperConfig


 def search_profiles_example():
```
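Every example file above switches its import in lockstep with the rename; the commit ships no backward-compatibility alias, so existing `linkedin_scraper` imports break outright. If a deprecation period were wanted instead, a shim could alias the old module name to the new one through `sys.modules`. This is a hypothetical sketch using a toy stand-in module, not code from this repo:

```python
import sys
import types

# Toy stand-in for the renamed package (the real one is linkedin_spider).
new_pkg = types.ModuleType("linkedin_spider")
new_pkg.LinkedinSpider = type("LinkedinSpider", (), {})
sys.modules["linkedin_spider"] = new_pkg

# Compatibility shim: the old import path resolves to the same module object.
sys.modules["linkedin_scraper"] = sys.modules["linkedin_spider"]

from linkedin_scraper import LinkedinSpider  # old name keeps working

print(LinkedinSpider is new_pkg.LinkedinSpider)  # → True
```

In a real package the shim would be a tiny `linkedin_scraper` distribution (or module) that performs the `sys.modules` aliasing and emits a `DeprecationWarning`.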

pyproject.toml

Lines changed: 9 additions & 9 deletions

```diff
@@ -41,7 +41,7 @@ cli = [
 ]
 mcp = [
     "fastmcp>=0.1.0",
-    "mcp>=1.0.0"
+    "cyclopts>=2.0.0"
 ]
 dev = [
     "pytest>=7.2.0",
@@ -62,19 +62,19 @@ Issues = "https://github.com/vertexcover-io/linkedin-spider/issues"
 Documentation = "https://github.com/vertexcover-io/linkedin-spider#readme"

 [project.scripts]
-linkedin-spider-cli = "linkedin_scraper.cli.__main__:main"
-linkedin-spider-mcp = "linkedin_scraper.mcp.server:main"
+linkedin-spider-cli = "linkedin_spider.cli.__main__:main"
+linkedin-spider-mcp = "linkedin_spider.mcp.server:cli_main"

 [tool.hatch.build.targets.wheel]
-packages = ["src/linkedin_scraper"]
+packages = ["src/linkedin_spider"]

 [tool.setuptools.packages.find]
 where = ["src"]
-include = ["linkedin_scraper*"]
+include = ["linkedin_spider*"]
 exclude = ["tests*", "venv*", ".venv*", "__pycache__*"]

 [tool.setuptools.package-data]
-linkedin_scraper = ["py.typed"]
+linkedin_spider = ["py.typed"]
 "*" = ["*.md", "*.txt", "*.yml", "*.yaml"]

 [tool.pytest.ini_options]
@@ -104,7 +104,7 @@ warn_unused_ignores = true
 warn_no_return = true
 warn_unreachable = true
 strict_equality = true
-files = ["src/linkedin_scraper", "cli", "mcp"]
+files = ["src/linkedin_spider", "cli", "mcp"]

 [[tool.mypy.overrides]]
 module = [
@@ -153,7 +153,7 @@ ignore = [
 "tests/**/*" = ["S101", "S106", "S108", "S311", "A001", "A002", "A003"]

 [tool.ruff.isort]
-known-first-party = ["linkedin_scraper", "cli", "mcp"]
+known-first-party = ["linkedin_spider", "cli", "mcp"]

 [tool.black]
 line-length = 100
@@ -175,7 +175,7 @@ extend-exclude = '''
 '''

 [tool.coverage.run]
-source = ["src/linkedin_scraper", "cli", "mcp"]
+source = ["src/linkedin_spider", "cli", "mcp"]
 omit = [
     "*/tests/*",
     "*/test_*",
```

src/linkedin_scraper/__init__.py

Lines changed: 0 additions & 15 deletions
This file was deleted.
