Commit 5be21c1 (1 parent 645a302)

kgritesh and claude committed
docs: add MCP e2e test planning documents

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
File tree: 3 files changed, +224 −0 lines
Lines changed: 98 additions & 0 deletions
# Phase 1: MCP Protocol Tests

> **Status:** pending
> **Depends on:** none

## Overview

Tests that verify the MCP server's tool registration, parameter schemas, input validation, and error handling when the scraper is not initialized. These run without LinkedIn credentials.

## Implementation

**Files:**

- Create: `tests/test_mcp_server.py`
- Modify: `tests/conftest.py` — add `mcp_client` async fixture
- Modify: `pyproject.toml` — add `asyncio_mode = "auto"`

**What to test:**

### 1. Tool Registration (`TestToolRegistration`)

- Exactly 11 tools registered, names match expected set
- Each tool's required params and defaults verified:
  - `scrape_profile`: requires `profile_url`
  - `search_profiles`: requires `query`, `max_results` defaults to 5, 8 total params
  - `get_session_status`: no params
  - `reset_session`: no params
  - `scrape_company`: requires `company_url`
  - `search_posts`: requires `keywords`, `scroll_pause` defaults to 2.0, `max_comments` defaults to 10
  - `scrape_incoming_connections`: `max_results` defaults to 10
  - `scrape_outgoing_connections`: `max_results` defaults to 10
  - `scrape_conversations_list`: `max_results` defaults to 10
  - `scrape_conversation`: `participant_name` optional/nullable
  - `send_connection_request`: requires `profile_url`, `note` optional

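These per-tool assertions reduce to checks against each tool's generated JSON Schema (exposed as `tool.inputSchema` when listing tools via `fastmcp.Client`). A self-contained sketch of the check logic, run against a hand-written schema rather than the live server — the exact schema fastmcp/pydantic emits may differ, and the field list here is an assumption:

```python
# Hypothetical inputSchema, shaped like what fastmcp/pydantic might emit
# for search_profiles (the property list is illustrative, not the real schema).
search_profiles_schema = {
    "type": "object",
    "properties": {
        "query": {"type": "string"},
        "max_results": {"type": "integer", "default": 5},
    },
    "required": ["query"],
}


def assert_tool_schema(schema: dict, required: list[str], defaults: dict) -> None:
    # Required params must appear in the schema's "required" array.
    for name in required:
        assert name in schema.get("required", []), f"{name} should be required"
    # Defaults land on the individual property entries.
    for name, value in defaults.items():
        assert schema["properties"][name].get("default") == value


assert_tool_schema(search_profiles_schema, required=["query"], defaults={"max_results": 5})
```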
### 2. Parameter Validation (`TestParameterValidation`)

- Empty `profile_url` on `scrape_profile` → `ToolError`
- Empty `query` on `search_profiles` → `ToolError`
- Empty `company_url` on `scrape_company` → `ToolError`
- Empty `keywords` on `search_posts` → `ToolError`
- Empty `profile_url` on `send_connection_request` → `ToolError`

### 3. Uninitialized Scraper (`TestUninitializedScraper`)

- Direct `get_scraper()` call raises `RuntimeError`
- Each of 10 tools (except `reset_session`) returns error string containing "Scraper not initialized"
- `reset_session` succeeds even when uninitialized (returns "reset successfully")

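The behavior under test is the server's global-state pattern: a minimal stand-in (names mirror the description above; this is not the actual `server.py` source) shows why tools return error strings instead of crashing:

```python
# Stand-in for the module-level state described above (illustrative only).
_scraper_instance = None


def get_scraper():
    # Direct calls raise — this is what the RuntimeError test checks.
    if _scraper_instance is None:
        raise RuntimeError("Scraper not initialized")
    return _scraper_instance


def scrape_profile_tool(profile_url: str) -> str:
    # Tools catch internally and return an error string, so the MCP
    # response is a normal (non-crashing) text result.
    try:
        get_scraper()
    except Exception as exc:
        return f"Error: {exc}"
    return "{}"


result = scrape_profile_tool("https://linkedin.com/in/example")
assert "Scraper not initialized" in result
```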
**What to build:**

Key fixture — `mcp_client` in conftest.py:

```python
from collections.abc import AsyncGenerator

import pytest
from fastmcp import Client

from linkedin_spider.mcp.server import mcp_app


@pytest.fixture
async def mcp_client() -> AsyncGenerator[Client, None]:
    async with Client(mcp_app) as client:
        yield client
```

Key fixture — ensuring no scraper (defined inside `TestUninitializedScraper`):

```python
@pytest.fixture(autouse=True)
def _ensure_no_scraper() -> Generator[None, None, None]:
    original = mcp_server._scraper_instance
    mcp_server._scraper_instance = None
    yield
    mcp_server._scraper_instance = original
```

Helper for extracting text from `CallToolResult`:

```python
def _result_text(result: Any) -> str:
    assert result.content
    return result.content[0].text
```

Helper for tool schema inspection:

```python
def _find_tool(tools: list[Any], name: str) -> Any:
    for t in tools:
        if t.name == name:
            return t
    raise AssertionError(f"Tool '{name}' not found")
```

**Commit:** `test(mcp): add protocol, validation, and uninitialized state tests`

## Done When

- [ ] 11+ tool registration tests pass
- [ ] 5 parameter validation tests pass
- [ ] 12 uninitialized scraper tests pass
- [ ] All run without LINKEDIN_COOKIE: `uv run python -m pytest tests/test_mcp_server.py -m "not integration"`
- [ ] `make check` passes
Lines changed: 76 additions & 0 deletions
# Phase 2: MCP E2E Integration Tests

> **Status:** pending
> **Depends on:** Phase 1

## Overview

Full-stack tests that call MCP tools through `fastmcp.Client`, with a real `LinkedinSpider` injected into the MCP server's global state. These mirror `test_e2e.py` assertions but exercise the MCP protocol layer.

## Implementation

**Files:**

- Modify: `tests/test_mcp_server.py` — add integration test classes
- Modify: `tests/conftest.py` — add `mcp_scraper` session-scoped fixture

**Pattern to follow:** `tests/test_e2e.py` — structural assertions, `pytest.skip` for empty data

**What to test (all `@pytest.mark.integration`):**

| Class | Tests |
| --- | --- |
| `TestMcpScrapeProfile` | JSON has expected keys, name populated, experience is list, invalid URL returns failure message |
| `TestMcpScrapeCompany` | JSON has expected keys (name, company_url), name populated, invalid URL returns failure |
| `TestMcpSearchProfiles` | Returns list with 1+ results respecting max_results, result has expected keys |
| `TestMcpSearchPosts` | Returns list of posts, post has expected keys, engagement metrics are ints |
| `TestMcpConversations` | Conversations list returns JSON or "No conversations", conversation has expected keys |
| `TestMcpConnections` | Incoming/outgoing return JSON or "No ... found", connections have expected keys |
| `TestMcpSessionStatus` | Status contains "Active", reset returns success (with state restoration) |

**What to build:**

Key fixture — `mcp_scraper` in conftest.py:

```python
@pytest.fixture(scope="session")
def mcp_scraper(spider: LinkedinSpider) -> Generator[LinkedinSpider, None, None]:
    mcp_server._scraper_instance = spider
    yield spider
    mcp_server._scraper_instance = None
```

Session-scoped, depends on the existing `spider` fixture. Phase 2 tests declare `mcp_scraper` as a parameter for its side effect (setting `_scraper_instance`).
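The structural-assertion style these classes borrow from `test_e2e.py` asserts on shape rather than exact values. A minimal stdlib-only illustration on a parsed profile payload (the field names here are assumptions for illustration, not the real response schema):

```python
import json

# Hypothetical payload shaped like a scrape_profile response (keys assumed).
raw = '{"name": "Ada Lovelace", "headline": "Engineer", "experience": []}'
profile = json.loads(raw)

# Structural assertions: presence and types, not exact values.
assert {"name", "headline", "experience"} <= profile.keys()
assert isinstance(profile["experience"], list)
assert profile["name"]  # populated, but the exact string is not asserted
```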

Key fixture — state restoration for `reset_session` test:

```python
@pytest.fixture
def _restore_scraper_after() -> Generator[None, None, None]:
    original = mcp_server._scraper_instance
    yield
    mcp_server._scraper_instance = original
```

Helper — JSON parsing for prefixed responses:

Several tools return `f"label:\n{json.dumps(data)}"`. Helper needed:

```python
def _parse_prefixed_json(text: str) -> Any:
    if "\n" in text:
        _, json_str = text.split("\n", 1)
        return json.loads(json_str)
    return json.loads(text)
```

Note: `scrape_profile` returns raw JSON (no prefix), while `scrape_company` returns `"company_profile:\n{json}"`, `search_profiles` returns `"profiles:\n{json}"`, etc.

**Commit:** `test(mcp): add e2e integration tests through MCP protocol`

## Done When

- [ ] All integration tests pass with LINKEDIN_COOKIE set
- [ ] Tests skip gracefully without LINKEDIN_COOKIE
- [ ] `make check` passes
- [ ] `make test` passes

docs/plans/mcp-e2e-tests/plan.md

Lines changed: 50 additions & 0 deletions
# Plan: MCP Server E2E Tests

> **Source:** Feature request
> **Created:** 2026-03-07
> **Status:** planning

## Goal

Add comprehensive MCP-level tests that verify tool registration, parameter validation, error handling, and full-stack e2e behavior through the FastMCP protocol.

## Acceptance Criteria

- [ ] All 11 MCP tools verified as registered with correct names and parameter schemas
- [ ] Parameter validation tested (empty required fields raise ToolError)
- [ ] Uninitialized scraper behavior tested (error strings returned, not crashes)
- [ ] Full e2e tests through MCP protocol for all scraper tools (integration, needs LINKEDIN_COOKIE)
- [ ] `make check` and `make test` pass
- [ ] Phase 1 tests run without credentials; Phase 2 tests skip without LINKEDIN_COOKIE

## Codebase Context

### Existing Patterns to Follow

- **Test structure**: `tests/test_e2e.py` — class-based tests with structural assertions, `pytest.skip` for missing data
- **Fixtures**: `tests/conftest.py` — session-scoped `spider` fixture with LINKEDIN_COOKIE guard
- **MCP server**: `src/linkedin_spider/mcp/server.py` — `mcp_app = FastMCP("linkedin-spider")`, global `_scraper_instance`, `get_scraper()` accessor

### Test Infrastructure

- pytest + pytest-asyncio, run via `uv run python -m pytest`
- Markers: `slow`, `integration`
- `fastmcp.Client` available for in-memory MCP testing (no subprocess needed)

### Key Details

- Tools return JSON strings (some with `"label:\n"` prefix; `scrape_profile` returns raw JSON)
- Tools with validation raise `ValueError` on empty required params → FastMCP converts to `ToolError`
- Tools catch `Exception` internally and return error message strings
- `reset_session` modifies global `_scraper_instance` — needs state restoration in tests
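Taken together, those details suggest a tool body shaped roughly like this (an illustrative sketch of the pattern, not the actual `server.py` code; the data dict is a stand-in):

```python
import json


def scrape_company(company_url: str) -> str:
    # Validation errors propagate; FastMCP surfaces them to clients as ToolError.
    if not company_url:
        raise ValueError("company_url must not be empty")
    try:
        # Stand-in for the real scraping call.
        data = {"name": "Acme", "company_url": company_url}
    except Exception as exc:
        # Runtime failures come back as plain error strings, not crashes.
        return f"Error: {exc}"
    # Prefixed JSON shape described above.
    return f"company_profile:\n{json.dumps(data)}"


result = scrape_company("https://linkedin.com/company/acme")
assert result.startswith("company_profile:\n")
```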

## Phases

| # | Phase | Status | Depends On |
| --- | --- | --- | --- |
| 1 | MCP protocol tests (registration, validation, uninitialized state) | pending | — |
| 2 | MCP e2e integration tests (full stack through MCP protocol) | pending | Phase 1 |

## Phase Dependency Graph

Phase 1 --> Phase 2
