
Commit f8309a4

feature: add test infrastructure and developer testing guide (PR 1) (#448)
- Add developer testing guide in docs/testing.md
- Implement pytest fixtures in tests/conftest.py:
  - mock_session_factory for sequential response mocking
  - Response fixtures for login, errors, device info, hosts, xpath
  - Pre-configured client fixtures for all encryption methods
- Add API response fixtures (6 JSON files) with realistic router responses
- Add 4 example tests demonstrating key patterns:
  - Successful login with session state validation
  - Authentication error handling
  - XPath operations with sequential responses
  - Pre-configured fixture usage
- Configure pytest in pyproject.toml:
  - Add pytest-asyncio, pytest-aiohttp, pytest-cov dependencies
  - Set asyncio_mode, testpaths, and markers

Part of #447: Add unit and integration tests to have test coverage

---------

Co-authored-by: Copilot <[email protected]>
1 parent aa320a4 commit f8309a4

11 files changed: +2055 −12 lines

11 files changed

+2055
-12
lines changed

docs/testing.md

Lines changed: 161 additions & 0 deletions
# Testing Guide for Developers

This guide covers testing strategies, patterns, and workflows for the Sagemcom API client.

## Overview

The test suite uses `pytest` with async support to validate the client's behavior against mocked Sagemcom router API responses. Tests are split into:

- **Unit tests** (`tests/unit/`) - Mock-based tests for individual methods, error handling, and encryption
- **Integration tests** (`tests/integration/`) - Tests against real routers or comprehensively simulated router APIs (requires device access)
- **Fixtures** (`tests/fixtures/`) - Sample API response payloads from different router models

## Running Tests (from Dev Container)

```bash
# Run all tests
poetry run pytest

# Run only unit tests
poetry run pytest tests/unit/

# Run with coverage report
poetry run pytest --cov=sagemcom_api

# Run with coverage HTML report
poetry run pytest --cov=sagemcom_api --cov-report=html

# Run specific test file
poetry run pytest tests/unit/test_client_basic.py

# Run specific test
poetry run pytest tests/unit/test_client_basic.py::test_login_success
```

## Test Structure

### Mocking Strategy

We mock at the **`aiohttp.ClientSession.post`** level (see the sketch below) to:

- Simulate realistic HTTP interactions
- Test the full request/response cycle, including JSON encoding/decoding
- Validate request payload structure
- Control response status codes and payloads
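
The `mock_session_factory` fixture builds on this idea. The following is a minimal sketch of such a factory using `unittest.mock`; the real fixture in `tests/conftest.py` may differ in detail (for example, if the client uses the session as an async context manager):

```python
# Illustrative sketch only - see tests/conftest.py for the actual fixture.
from unittest.mock import AsyncMock, MagicMock

import pytest


@pytest.fixture
def mock_session_factory():
    """Factory returning a mock aiohttp session that serves responses sequentially."""

    def _factory(responses):
        response_iter = iter(responses)

        def _next_response(*args, **kwargs):
            # Each post() call consumes the next prepared payload;
            # StopIteration signals that more requests were made than expected.
            payload = next(response_iter)
            mock_response = MagicMock()
            mock_response.status = 200
            mock_response.json = AsyncMock(return_value=payload)
            return mock_response

        mock_session = MagicMock()
        mock_session.post = AsyncMock(side_effect=_next_response)
        return mock_session

    return _factory
```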

### Fixture Patterns

All fixtures are defined in `tests/conftest.py` with **function scope** for test isolation:

- **`mock_session_factory`** - Factory for creating mock aiohttp sessions with custom responses
- **`login_success_response`** - Mock response for a successful login
- **`login_auth_error_response`** - Mock response for an authentication failure
- **`mock_client_...`** - Pre-configured `SagemcomClient` with a mocked session

Example usage:

```python
@pytest.mark.asyncio
async def test_example(mock_session_factory, login_success_response):
    mock_session = mock_session_factory([login_success_response])
    client = SagemcomClient("192.168.1.1", "admin", "password",
                            EncryptionMethod.MD5, session=mock_session)
    await client.login()  # consumes the mocked login response
    # ...assert on client state or returned values here
```

### API Response Fixtures

Realistic API responses are stored in `tests/fixtures/` as JSON files mirroring actual router responses:

- `login_success.json` - Successful login with session_id and nonce
- `login_auth_error.json` - Authentication failure (XMO_AUTHENTICATION_ERR)
- `device_info.json` - Device information response
- `hosts.json` - Connected devices list
- `xpath_value.json` - Generic XPath query response

These fixtures preserve the actual JSON-RPC structure from Sagemcom routers:

```json
{
  "reply": {
    "error": {"description": "XMO_REQUEST_NO_ERR"},
    "actions": [{
      "callbacks": [{
        "parameters": {"id": 12345, "nonce": "abcdef123456"}
      }]
    }]
  }
}
```
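
Loading one of these payloads into a fixture is straightforward. A sketch with illustrative names follows; the actual `tests/conftest.py` may organize this differently:

```python
# Illustrative sketch of a JSON response fixture in tests/conftest.py.
import json
from pathlib import Path

import pytest

FIXTURES_DIR = Path(__file__).parent / "fixtures"


@pytest.fixture
def login_success_response():
    """Parsed payload of tests/fixtures/login_success.json."""
    with open(FIXTURES_DIR / "login_success.json", encoding="utf-8") as file:
        return json.load(file)
```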

## Testing Patterns

### 1. Testing Authentication

All three encryption methods (MD5, SHA512, MD5_NONCE) must be tested. See `test_client_basic.py` for examples:

```python
@pytest.mark.asyncio
async def test_login_success(mock_session_factory, login_success_response):
    """Test successful login flow."""
    # Demonstrates mocking login with session_id/nonce exchange
```
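
One way to cover every encryption method without duplicating the test body is parametrization. A sketch, assuming the fixtures above and the usual `sagemcom_api.client` / `sagemcom_api.enums` module layout:

```python
# Sketch: run the same mocked login flow once per encryption method.
import pytest

from sagemcom_api.client import SagemcomClient
from sagemcom_api.enums import EncryptionMethod


@pytest.mark.asyncio
@pytest.mark.parametrize(
    "encryption_method",
    [EncryptionMethod.MD5, EncryptionMethod.SHA512, EncryptionMethod.MD5_NONCE],
)
async def test_login_all_encryption_methods(
    mock_session_factory, login_success_response, encryption_method
):
    mock_session = mock_session_factory([login_success_response])
    client = SagemcomClient(
        "192.168.1.1", "admin", "password", encryption_method, session=mock_session
    )
    await client.login()  # Should succeed for every method against the mocked reply
```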

### 2. Testing Error Handling

Each `XMO_*_ERR` constant should have corresponding test cases:

```python
@pytest.mark.asyncio
async def test_authentication_error(mock_session_factory, login_auth_error_response):
    """Test AuthenticationException is raised on XMO_AUTHENTICATION_ERR."""
    # Demonstrates error response mocking
```
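
The typical shape is `pytest.raises` around the call that should fail. A sketch, assuming the exception is importable from `sagemcom_api.exceptions`:

```python
# Sketch: a mocked XMO_AUTHENTICATION_ERR reply should surface as AuthenticationException.
import pytest

from sagemcom_api.client import SagemcomClient
from sagemcom_api.enums import EncryptionMethod
from sagemcom_api.exceptions import AuthenticationException


@pytest.mark.asyncio
async def test_authentication_error(mock_session_factory, login_auth_error_response):
    mock_session = mock_session_factory([login_auth_error_response])
    client = SagemcomClient(
        "192.168.1.1", "admin", "wrong-password", EncryptionMethod.MD5, session=mock_session
    )

    with pytest.raises(AuthenticationException):
        await client.login()
```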

### 3. Testing XPath Operations

Validate that XPath values are URL-encoded while the safe characters `/=[]'` are preserved:

```python
@pytest.mark.asyncio
async def test_xpath_url_encoding(mock_session_factory):
    """Test XPath values are URL-encoded with /=[]' preserved."""
    # Demonstrates XPath encoding validation
```
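
The expected encoding can be pinned down with `urllib.parse.quote` and the same safe-character set. A standalone sketch (the XPaths are illustrative; a real test would also inspect the request captured by the mock session):

```python
# Sketch: what "URL-encoded with /=[]' preserved" means in practice.
from urllib.parse import quote

SAFE_CHARS = "/=[]'"

# Structural XPath characters survive encoding unchanged...
xpath = "Device/WiFi/SSIDs/SSID[Alias='MAIN']"
assert quote(xpath, safe=SAFE_CHARS) == xpath

# ...while anything outside the safe set is still percent-encoded.
assert quote("Device/Managed Devices", safe=SAFE_CHARS) == "Device/Managed%20Devices"
```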

### 4. Testing Sequential Responses

Most API operations require multiple HTTP requests (login → operation). Use `mock_session_factory` with a list of responses; each call consumes the next one in order:

```python
# Two sequential responses
mock_session = mock_session_factory([login_success_response, xpath_value_response])

await client.login()               # Consumes login_success_response (1st call)
await client.get_value_by_xpath()  # Consumes xpath_value_response (2nd call)
await client.logout()              # Would raise StopIteration (no 3rd response)
```

## Adding New Tests

### For a new unit test:

1. Determine what you're testing (method, error case, encryption variant)
2. Create or reuse a fixture for the API response in `tests/fixtures/`
3. Create a test file at `tests/unit/test_<module>.py`
4. Use `mock_session_factory` to inject responses
5. Write assertions for both success and error paths
6. Run the test with coverage to verify the new lines are covered (a skeleton following these steps is sketched below)
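
Putting those steps together, a new test module might start from a skeleton like this (names are placeholders):

```python
# Skeleton for tests/unit/test_<module>.py - fill in the method under test.
import pytest

from sagemcom_api.client import SagemcomClient
from sagemcom_api.enums import EncryptionMethod


@pytest.mark.asyncio
async def test_new_feature_success(mock_session_factory, login_success_response):
    """Success path: inject the responses the feature needs, assert on the result."""
    mock_session = mock_session_factory([login_success_response])  # add feature fixture(s) here
    client = SagemcomClient("192.168.1.1", "admin", "password",
                            EncryptionMethod.MD5, session=mock_session)
    await client.login()
    # ...call the method under test and assert on its return value


@pytest.mark.asyncio
async def test_new_feature_error(mock_session_factory):
    """Error path: inject an XMO_*_ERR fixture and assert the mapped exception is raised."""
    # ...mirror the success test with an error fixture and pytest.raises(...)
```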

### For a new integration test:

1. Document the router model and firmware version in the test docstring
2. Create the test in `tests/integration/test_<feature>.py`
3. Add a conditional skip if the router is not available: `@pytest.mark.skipif(...)`
4. Use real credentials from environment variables, never hardcoded (see the sketch below)
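
A sketch of the skip-and-credentials pattern, with illustrative environment variable names (`SAGEMCOM_HOST`, `SAGEMCOM_USERNAME`, `SAGEMCOM_PASSWORD`):

```python
# Sketch for tests/integration/: skip unless a real router is configured via the environment.
import os

import pytest

from sagemcom_api.client import SagemcomClient
from sagemcom_api.enums import EncryptionMethod

ROUTER_HOST = os.environ.get("SAGEMCOM_HOST")

pytestmark = pytest.mark.skipif(
    not ROUTER_HOST, reason="SAGEMCOM_HOST not set; no router available"
)


@pytest.mark.asyncio
async def test_device_info_live():
    """Tested against <router model>, firmware <version>."""
    client = SagemcomClient(
        ROUTER_HOST,
        os.environ["SAGEMCOM_USERNAME"],
        os.environ["SAGEMCOM_PASSWORD"],
        EncryptionMethod.SHA512,
    )
    await client.login()
    assert await client.get_device_info() is not None
    await client.logout()
```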

## Test Coverage

Run coverage reports regularly:

```bash
poetry run pytest --cov=sagemcom_api --cov-report=term-missing
```

The `--cov-report=term-missing` option shows which lines lack coverage.
