Redfish Developer's Guide
Prerequisites:

- Operating System: Ubuntu 22.04 or later
- Go: Version 1.24 or higher
- Python: 3.x with PyYAML package
- Node.js: Version 16 or higher
- npm: Latest version (included with Node.js)
- Newman: Postman CLI test runner (`npm install -g newman`)
- curl: Command-line HTTP client (pre-installed on most systems)

Optional:

- newman-reporter-htmlextra: For enhanced HTML test reports (`npm install -g newman-reporter-htmlextra`)

All commands must be run from the redfish/openapi/infra/ directory:

cd redfish/openapi/infra/

Check your environment:

make rf-check-tools        # Verify all required tools are installed
make rf-deps               # Install any missing dependencies

Run the complete workflow:

make rf-all                # Merge schemas → Validate → Generate code → Add auth

Run integration tests:

make rf-integration-test

Available Makefile targets:

- `make help` - Display all available targets with descriptions
- `make rf-check-tools` - Verify required tools are installed
- `make rf-deps` - Install all required dependencies automatically
- `make rf-install-missing-tools` - Install only missing tools
- `make rf-merge` - Merge DMTF YAML files into a single OpenAPI spec
- `make rf-validate` - Validate the merged OpenAPI specification
- `make rf-generate` - Generate Go server code from the OpenAPI spec
- `make rf-auth` - Add Basic Auth security to the OpenAPI spec
- `make rf-metadata-generate` - Generate metadata.xml from DMTF schemas
- `make rf-all` - Complete pipeline: merge → validate → generate → auth → metadata
- `make rf-clean` - Remove all generated files
- `make rf-integration-test` - Run Newman integration tests against the mock server

Typical development workflow:

- Verify your environment: `make rf-check-tools`
- Install dependencies (if needed): `make rf-deps`
- Make changes to OpenAPI specs (if adding/modifying endpoints):
  - Edit YAML files in the `dmtf/` directory
  - Update `dmtf/openapi-reduced.yaml` to reference new schemas
  - Follow the DMTF Redfish schema format
- Regenerate code: `make rf-all`
- Run tests: `make rf-integration-test`

For granular control, run individual steps:

make rf-merge      # Step 1: Merge YAML files only
make rf-validate   # Step 2: Validate merged spec only
make rf-generate   # Step 3: Generate Go code only
make rf-auth       # Step 4: Add authentication only

The Redfish integration tests perform automated API-level testing using Newman (Postman CLI) to validate DMTF Redfish specification compliance and prevent regressions.

- Type: Black-box HTTP API testing
- Framework: Newman (Postman CLI)
- Test Data: Mock WSMAN repository
- Port: 8181 (configurable via the `HTTP_PORT` environment variable)

Run the tests:

cd redfish/openapi/infra
make rf-integration-test

Custom port:

HTTP_PORT=9090 make rf-integration-test

Test coverage:

- ✅ Service Root (`/redfish/v1/`)
- ✅ OData Service Document (`/redfish/v1/odata`)
- ✅ Metadata Document (`/redfish/v1/$metadata`)
- ✅ Systems Collection (`/redfish/v1/Systems`)
- ✅ Individual Systems (`/redfish/v1/Systems/{id}`)
- ✅ Power Control Actions
- ✅ Authentication: Public vs. protected endpoints with Basic Auth
- ✅ Power Actions: On, ForceOff, ForceRestart, GracefulShutdown, PowerCycle
- ✅ Error Handling: 400 (Bad Request), 401 (Unauthorized), 404 (Not Found), 405 (Method Not Allowed)
- ✅ DMTF Compliance: OData headers, JSON structure, response formats
- ✅ Edge Cases: Invalid IDs, malformed JSON, concurrent requests, caching
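The public endpoints above can also be spot-checked by hand with curl once a test server is listening on port 8181 (for example, the one started by the test runner). This is a minimal sketch; it assumes the service root, OData service document, and metadata document are served without authentication, which is what the public-endpoint tests expect.

```bash
# Spot-check the public endpoints against a locally running test server (port 8181)
curl -i http://localhost:8181/redfish/v1/
curl -i http://localhost:8181/redfish/v1/odata
curl -i 'http://localhost:8181/redfish/v1/$metadata'
```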
The test runner automatically:
- Builds the application from source
- Starts a test server with mock data (port 8181)
- Waits for server readiness (polls `/redfish/v1/`; see the sketch below)
- Executes Newman test collection
- Generates test reports (CLI + JSON)
- Cleans up server process
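The readiness wait is conceptually a poll loop like the one below. This is illustrative only, not the actual contents of `redfish/tests/run_tests.sh`; the retry count and sleep interval are assumed values.

```bash
# Illustrative readiness poll (the real logic lives in redfish/tests/run_tests.sh)
for _ in $(seq 1 30); do
    if curl -fsS http://localhost:8181/redfish/v1/ > /dev/null; then
        echo "Redfish test server is ready"
        break
    fi
    sleep 1   # assumed interval; adjust as needed
done
```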
Results are saved to:
- Console output: Real-time test results
- JSON report: `redfish/tests/postman/results/newman-report.json`
- Server logs: `/tmp/redfish_test_server.log`
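For a quick pass/fail summary, the JSON report can be inspected with jq. This assumes jq is installed and relies on Newman's default JSON reporter layout, which records run statistics under `run.stats`.

```bash
# Summarize assertion counts from the Newman JSON report (assumes jq is installed)
jq '.run.stats.assertions' redfish/tests/postman/results/newman-report.json
```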
Server fails to start:

cat /tmp/redfish_test_server.log
lsof -i :8181

Tests fail:

- Check server logs at `/tmp/redfish_test_server.log`
- Verify Newman is installed: `newman --version`
- Ensure port 8181 is not in use

Verify Installation:
go version # Should be 1.24+
python3 --version # Should be 3.x
node --version # Should be 16+
newman --version     # Should show version number

Install Newman:

npm install -g newman
npm install -g newman-reporter-htmlextra   # Optional: HTML reports

The Redfish integration tests use Newman (Postman's CLI) to validate the Redfish API implementation against the DMTF Redfish standard. Tests run against a mock server to ensure consistent, hardware-independent validation.

Key components:

- Postman Collection (`redfish/tests/postman/redfish-collection.json`)
  - Contains all test requests and assertions
  - Organized into folders by endpoint category
  - Validates HTTP responses, headers, and JSON schemas
- Environment File (`redfish/tests/postman/test-environment.json`)
  - Defines variables (base_url, credentials, system_id)
  - Keeps tests portable across environments
- Test Runner (`redfish/tests/run_tests.sh`)
  - Builds and starts the mock server
  - Executes Newman tests
  - Reports results and cleans up
- Mock Repository (`redfish/internal/mocks/mock_repo.go`)
  - Provides test data without hardware dependencies
  - Simulates power state transitions
  - Validates operation correctness

Using the test script:

cd /path/to/console
bash redfish/tests/run_tests.sh

Using the Makefile target:

cd redfish/openapi/infra
make rf-integration-test

Running Newman directly:

newman run redfish/tests/postman/redfish-collection.json \
  --environment redfish/tests/postman/test-environment.json

Determine what you need to test:
- New Redfish endpoint
- Error handling case
- Authentication behavior
- Data validation
- DMTF compliance requirement
Edit redfish/tests/postman/redfish-collection.json:
{
"name": "Your Test Name",
"event": [
{
"listen": "test",
"script": {
"exec": [
"pm.test('Status code is 200', function () {",
" pm.response.to.have.status(200);",
"});",
"",
"pm.test('Response has required property', function () {",
" var jsonData = pm.response.json();",
" pm.expect(jsonData).to.have.property('PropertyName');",
"});"
],
"type": "text/javascript"
}
}
],
"request": {
"auth": {
"type": "basic" // or "noauth" for public endpoints
},
"method": "GET", // or POST, PATCH, DELETE
"header": [
{
"key": "Accept",
"value": "application/json"
}
],
"body": {
"mode": "raw",
"raw": "{\n \"Key\": \"Value\"\n}",
"options": {
"raw": {
"language": "json"
}
}
},
"url": {
"raw": "{{base_url}}/redfish/v1/YourEndpoint",
"host": ["{{base_url}}"],
"path": ["redfish", "v1", "YourEndpoint"]
}
},
"response": []
}

Use Postman's test syntax in the `event.test.script.exec` array:
Status Code:
pm.test('Status code is 200', function () {
pm.response.to.have.status(200);
});

Header Validation:
pm.test('OData-Version header is 4.0', function () {
var odataVersion = pm.response.headers.get('OData-Version');
pm.expect(odataVersion).to.equal('4.0');
});
pm.test('Content-Type is application/json', function () {
var contentType = pm.response.headers.get('Content-Type');
pm.expect(contentType).to.include('application/json');
});

JSON Response Structure:
pm.test('Response has required properties', function () {
var jsonData = pm.response.json();
pm.expect(jsonData).to.have.property('@odata.context');
pm.expect(jsonData).to.have.property('@odata.id');
pm.expect(jsonData).to.have.property('@odata.type');
});

Property Value Checks:
pm.test('PowerState has valid value', function () {
var jsonData = pm.response.json();
pm.expect(jsonData.PowerState).to.be.oneOf(['On', 'Off', 'PoweringOn', 'PoweringOff']);
});

Error Response Validation:
pm.test('Response contains Redfish error format', function () {
var jsonData = pm.response.json();
pm.expect(jsonData).to.have.property('error');
pm.expect(jsonData.error).to.have.property('code');
pm.expect(jsonData.error).to.have.property('message');
});

Multiple Status Codes:
pm.test('Status code is 202 Accepted or 409 Conflict', function () {
pm.expect(pm.response.code).to.be.oneOf([202, 409]);
});

Place related tests in logical folders within the collection:
- Public Endpoints - No authentication required
- Protected Endpoints - Basic Auth required
- Power Control Actions - ComputerSystem.Reset operations
- Authentication Tests - Auth validation
- Error Handling Tests - Error scenarios
If testing new endpoints or data, update redfish/internal/mocks/mock_repo.go:
func NewMockComputerSystemRepo() *MockComputerSystemRepo {
return &MockComputerSystemRepo{
systems: map[string]*v1.ComputerSystem{
"test-system-1": {
OdataContext: "/redfish/v1/$metadata#ComputerSystem.ComputerSystem",
OdataID: "/redfish/v1/Systems/test-system-1",
OdataType: "#ComputerSystem.v1_0_0.ComputerSystem",
ID: "test-system-1",
Name: "Test Computer System 1",
SystemType: "Physical",
PowerState: "Off",
// Add new properties here
},
},
}
}

Run tests to verify your changes:
# Kill any running servers on port 8181
lsof -ti:8181 | xargs -r kill -9
sleep 1
# Run the test suite
cd redfish/openapi/infra
make rf-integration-test

Check the output for:
- ✓ All assertions passing
- Correct status codes
- Expected response format
- No server errors in logs
If your test needs new variables, update test-environment.json:
{
"key": "new_variable",
"value": "test-value",
"enabled": true
}

Then reference in tests with `{{new_variable}}`.

The collection is organized as follows:
Redfish API Tests
├── Public Endpoints (no auth)
├── Protected Endpoints (Basic Auth)
├── Power Control Actions
├── Authentication Tests
└── Error Handling Tests
- Folders: Descriptive category names
- Tests: Action + Resource (e.g., "Get Specific System", "Reset System - ForceOff")
- Assertions: Clear, readable descriptions
For each endpoint, verify:
- ✓ Success case (200/202)
- ✓ Required headers (OData-Version, Content-Type)
- ✓ Response structure (@odata properties)
- ✓ Authentication (401 without auth)
- ✓ Invalid input (400 Bad Request)
- ✓ Not found (404)
- ✓ Method not allowed (405)
- ✓ Error response format
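Some of these cases can also be reproduced manually with curl against a running test server. The sketch below is illustrative: `user:password` is a placeholder for the credentials in config.yml, and `test-system-1` is the system ID from the mock repository; the expected status codes are the ones the suite asserts.

```bash
# 404 Not Found: unknown system ID (mirrors the error-handling example later in this guide)
curl -i -u user:password http://localhost:8181/redfish/v1/Systems/invalid-id-12345

# 400 Bad Request: malformed JSON body on a reset action (placeholder credentials)
curl -i -u user:password -X POST \
    -H 'Content-Type: application/json' \
    -d '{"ResetType": ' \
    http://localhost:8181/redfish/v1/Systems/test-system-1/Actions/ComputerSystem.Reset
```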
Set variables during test execution:
// Save response data for later tests
var jsonData = pm.response.json();
pm.collectionVariables.set('extracted_id', jsonData.Id);
// Use in subsequent requests
// URL: {{base_url}}/redfish/v1/Systems/{{extracted_id}}

Execute code before sending a request:
{
"listen": "prerequest",
"script": {
"exec": [
"// Generate dynamic data",
"var timestamp = new Date().getTime();",
"pm.collectionVariables.set('timestamp', timestamp);"
]
}
}

Skip assertions based on conditions:
pm.test('Optional property exists if condition is met', function () {
var jsonData = pm.response.json();
if (jsonData.PowerState === 'On') {
pm.expect(jsonData).to.have.property('ProcessorSummary');
}
});

Response Time:

pm.test('Response time is less than 200ms', function () {
pm.expect(pm.response.responseTime).to.be.below(200);
});

Check the server logs:

cat /tmp/redfish_test_server.log

Use the Postman GUI, or run Newman against a specific folder:
newman run redfish-collection.json \
--environment test-environment.json \
--folder "Power Control Actions"newman run redfish-collection.json \
--environment test-environment.json \
--verboseAdd console logging in test scripts:
console.log('Request URL:', pm.request.url.toString());
console.log('Response Body:', pm.response.text());
console.log('Response Headers:', pm.response.headers);

Tests run automatically in GitHub Actions via `.github/workflows/redfish-api-test.yml`:
- Checkout code
- Setup Go environment
- Run test script
- Upload test results
- Display summary
The workflow fails if any assertions fail.
Authentication issues:

- Public endpoints should use `"auth": { "type": "noauth" }`
- Protected endpoints inherit collection-level Basic Auth
- Verify credentials match `config.yml`
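To confirm the credential setup outside Newman, a pair of curl calls like the following can help. `user:password` is a placeholder for whatever config.yml defines, and the Systems collection is assumed here to be a protected endpoint.

```bash
# Expected to fail with 401 when no credentials are supplied
curl -i http://localhost:8181/redfish/v1/Systems

# Expected to succeed with Basic Auth (placeholder credentials - must match config.yml)
curl -i -u user:password http://localhost:8181/redfish/v1/Systems
```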
Timing issues:

- Server startup takes time (~1-2 seconds)
- First request may be slower
- Mock transitions are instant, real hardware is not
Response parsing:

- Always validate that the response is JSON before parsing
- Check for null/undefined properties
- Use optional chaining for nested properties
Port conflicts:

- Kill existing servers: `lsof -ti:8181 | xargs -r kill -9`
- Check the PORT environment variable
- Verify no other services on 8181
Test data dependencies:

- Mock repo resets between test runs
- Power state transitions affect subsequent tests
- Order matters for stateful operations
When to add or update tests:

- New Redfish Endpoint: Add test coverage
- API Changes: Update assertions
- Schema Updates: Validate new properties
- Bug Fixes: Add regression tests
- DMTF Compliance: Match spec updates
Before merging:

- Verify endpoint paths match the OpenAPI definition
- Check authentication requirements (`security: [{}]` vs `security: [{"BasicAuth": []}]`)
- Validate response schemas match generated types
- Test all documented status codes
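One way to run the relevant checks locally before opening a pull request, using the Makefile targets described earlier:

```bash
cd redfish/openapi/infra
make rf-validate            # Re-validate the merged OpenAPI spec
make rf-integration-test    # Run the Newman suite against the mock server
```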
References:

- Newman Documentation
- Postman Test Scripts
- DMTF Redfish Specification
- Chai Assertion Library (used in Postman tests)
Example tests:

{
"name": "Get System by ID",
"event": [{
"listen": "test",
"script": {
"exec": [
"pm.test('Status code is 200', function () {",
" pm.response.to.have.status(200);",
"});",
"",
"pm.test('System has required properties', function () {",
" var system = pm.response.json();",
" pm.expect(system).to.have.property('Id');",
" pm.expect(system).to.have.property('Name');",
" pm.expect(system).to.have.property('PowerState');",
"});"
]
}
}],
"request": {
"method": "GET",
"url": "{{base_url}}/redfish/v1/Systems/{{system_id}}"
}
}

{
"name": "Reset System - ForceOff",
"event": [{
"listen": "test",
"script": {
"exec": [
"pm.test('Status code is 202 Accepted', function () {",
" pm.response.to.have.status(202);",
"});",
"",
"pm.test('OData-Version header is 4.0', function () {",
" pm.expect(pm.response.headers.get('OData-Version')).to.equal('4.0');",
"});"
]
}
}],
"request": {
"method": "POST",
"header": [{"key": "Content-Type", "value": "application/json"}],
"body": {
"mode": "raw",
"raw": "{\"ResetType\": \"ForceOff\"}"
},
"url": "{{base_url}}/redfish/v1/Systems/{{system_id}}/Actions/ComputerSystem.Reset"
}
}

{
"name": "Get System - Invalid ID",
"event": [{
"listen": "test",
"script": {
"exec": [
"pm.test('Status code is 404', function () {",
" pm.response.to.have.status(404);",
"});",
"",
"pm.test('Response contains Redfish error format', function () {",
" var jsonData = pm.response.json();",
" pm.expect(jsonData).to.have.property('error');",
" pm.expect(jsonData.error).to.have.property('code');",
" pm.expect(jsonData.error).to.have.property('message');",
"});"
]
}
}],
"request": {
"method": "GET",
"url": "{{base_url}}/redfish/v1/Systems/invalid-id-12345"
}
}

When contributing tests:
- Follow existing patterns and conventions
- Add tests for both success and failure cases
- Document any new environment variables
- Update mock data if needed
- Verify all tests pass locally
- Include test updates in the same PR as code changes
For questions or issues with integration tests, please:
- Check this guide first
- Review existing test examples
- Check server logs at `/tmp/redfish_test_server.log`
- Open an issue with test output and error details