Redfish Developer's Guide

Table of Contents
  1. System Requirements

  2. Quick Start

  3. Makefile Reference

  4. Development Workflow

  5. Integration Testing

  6. Additional Resources

  7. Developing New Redfish Integration Tests

System Requirements

Base Requirements

  • Operating System: Ubuntu 22.04 or later
  • Go: Version 1.24 or higher

Code Generation Requirements

  • Python: 3.x with PyYAML package

Integration Testing Requirements

  • Node.js: Version 16 or higher
  • npm: Latest version (included with Node.js)
  • Newman: Postman CLI test runner (npm install -g newman)
  • curl: Command-line HTTP client (pre-installed on most systems)

Optional:

  • newman-reporter-htmlextra: For enhanced HTML test reports (npm install -g newman-reporter-htmlextra)

Quick Start

All commands must be run from the redfish/openapi/infra/ directory:

cd redfish/openapi/infra/

Check your environment:

make rf-check-tools    # Verify all required tools are installed
make rf-deps           # Install any missing dependencies

Run the complete workflow:

make rf-all            # Merge schemas → Validate → Generate code → Add auth → Generate metadata

Run integration tests:

make rf-integration-test

Makefile Reference

Help & Diagnostics

  • make help - Display all available targets with descriptions
  • make rf-check-tools - Verify required tools are installed

Dependency Management

  • make rf-deps - Install all required dependencies automatically
  • make rf-install-missing-tools - Install only missing tools

OpenAPI Processing

  • make rf-merge - Merge DMTF YAML files into single OpenAPI spec
  • make rf-validate - Validate the merged OpenAPI specification
  • make rf-generate - Generate Go server code from OpenAPI spec
  • make rf-auth - Add Basic Auth security to OpenAPI spec
  • make rf-metadata-generate - Generate metadata.xml from DMTF schemas

Workflow Commands

  • make rf-all - Complete pipeline: merge → validate → generate → auth → metadata
  • make rf-clean - Remove all generated files

Testing

  • make rf-integration-test - Run Newman integration tests against mock server

Development Workflow

Standard Development Process

  1. Verify your environment:

make rf-check-tools

  2. Install dependencies (if needed):

make rf-deps

  3. Make changes to OpenAPI specs (if adding/modifying endpoints):

    • Edit YAML files in dmtf/ directory
    • Update dmtf/openapi-reduced.yaml to reference new schemas
    • Follow DMTF Redfish schema format

  4. Regenerate code:

make rf-all

  5. Run tests:

make rf-integration-test

Individual Steps (Advanced)

For granular control, run individual steps:

make rf-merge      # Step 1: Merge YAML files only
make rf-validate   # Step 2: Validate merged spec only
make rf-generate   # Step 3: Generate Go code only
make rf-auth       # Step 4: Add authentication only

Integration Testing

Overview

The Redfish integration tests perform automated API-level testing using Newman (Postman CLI) to validate DMTF Redfish specification compliance and prevent regressions.

Test Methodology

  • Type: Black-box HTTP API testing
  • Framework: Newman (Postman CLI)
  • Test Data: Mock WSMAN repository
  • Port: 8181 (configurable via HTTP_PORT environment variable)

Running Tests

cd redfish/openapi/infra
make rf-integration-test

Custom port:

HTTP_PORT=9090 make rf-integration-test

What Gets Tested

Endpoint Coverage

  • ✅ Service Root (/redfish/v1/)
  • ✅ OData Service Document (/redfish/v1/odata)
  • ✅ Metadata Document (/redfish/v1/$metadata)
  • ✅ Systems Collection (/redfish/v1/Systems)
  • ✅ Individual Systems (/redfish/v1/Systems/{id})
  • ✅ Power Control Actions

Functional Coverage

  • Authentication: Public vs. protected endpoints with Basic Auth
  • Power Actions: On, ForceOff, ForceRestart, GracefulShutdown, PowerCycle
  • Error Handling: 400 (Bad Request), 401 (Unauthorized), 404 (Not Found), 405 (Method Not Allowed)
  • DMTF Compliance: OData headers, JSON structure, response formats
  • Edge Cases: Invalid IDs, malformed JSON, concurrent requests, caching

Test Execution Flow

The test runner automatically:

  1. Builds the application from source
  2. Starts a test server with mock data (port 8181)
  3. Waits for server readiness (polls /redfish/v1/)
  4. Executes Newman test collection
  5. Generates test reports (CLI + JSON)
  6. Cleans up server process

Test Reports

Results are saved to:

  • Console output: Real-time test results
  • JSON report: redfish/tests/postman/results/newman-report.json
  • Server logs: /tmp/redfish_test_server.log

Troubleshooting

Server fails to start:

# Check logs
cat /tmp/redfish_test_server.log

# Verify port is available
lsof -i :8181

Tests fail:

  • Check server logs at /tmp/redfish_test_server.log
  • Verify Newman is installed: newman --version
  • Ensure port 8181 is not in use

Additional Resources

Verify Installation:

go version          # Should be 1.24+
python3 --version   # Should be 3.x
node --version      # Should be 16+
newman --version    # Should show version number

Install Newman:

npm install -g newman
npm install -g newman-reporter-htmlextra  # Optional: HTML reports

Developing New Redfish Integration Tests

Overview

The Redfish integration tests use Newman (Postman's CLI) to validate the Redfish API implementation against the DMTF Redfish standard. Tests run against a mock server to ensure consistent, hardware-independent validation.

Test Architecture

Components

  1. Postman Collection (redfish/tests/postman/redfish-collection.json)

    • Contains all test requests and assertions
    • Organized into folders by endpoint category
    • Validates HTTP responses, headers, and JSON schemas
  2. Environment File (redfish/tests/postman/test-environment.json)

    • Defines variables (base_url, credentials, system_id)
    • Keeps tests portable across environments
  3. Test Runner (redfish/tests/run_tests.sh)

    • Builds and starts the mock server
    • Executes Newman tests
    • Reports results and cleans up
  4. Mock Repository (redfish/internal/mocks/mock_repo.go)

    • Provides test data without hardware dependencies
    • Simulates power state transitions
    • Validates operation correctness

Running Tests

From Repository Root

cd /path/to/console
bash redfish/tests/run_tests.sh

Using Makefile (Recommended)

cd redfish/openapi/infra
make rf-integration-test

Direct Newman Execution

newman run redfish/tests/postman/redfish-collection.json \
    --environment redfish/tests/postman/test-environment.json

Creating New Tests

Step 1: Identify the Test Scenario

Determine what you need to test:

  • New Redfish endpoint
  • Error handling case
  • Authentication behavior
  • Data validation
  • DMTF compliance requirement

Step 2: Add Test Request to Collection

Edit redfish/tests/postman/redfish-collection.json:

{
  "name": "Your Test Name",
  "event": [
    {
      "listen": "test",
      "script": {
        "exec": [
          "pm.test('Status code is 200', function () {",
          "    pm.response.to.have.status(200);",
          "});",
          "",
          "pm.test('Response has required property', function () {",
          "    var jsonData = pm.response.json();",
          "    pm.expect(jsonData).to.have.property('PropertyName');",
          "});"
        ],
        "type": "text/javascript"
      }
    }
  ],
  "request": {
    "auth": {
      "type": "basic"  // or "noauth" for public endpoints
    },
    "method": "GET",  // or POST, PATCH, DELETE
    "header": [
      {
        "key": "Accept",
        "value": "application/json"
      }
    ],
    "body": {
      "mode": "raw",
      "raw": "{\n  \"Key\": \"Value\"\n}",
      "options": {
        "raw": {
          "language": "json"
        }
      }
    },
    "url": {
      "raw": "{{base_url}}/redfish/v1/YourEndpoint",
      "host": ["{{base_url}}"],
      "path": ["redfish", "v1", "YourEndpoint"]
    }
  },
  "response": []
}

Step 3: Write Test Assertions

Use Postman's test syntax in the event.test.script.exec array:

Common Assertions

Status Code:

pm.test('Status code is 200', function () {
    pm.response.to.have.status(200);
});

Header Validation:

pm.test('OData-Version header is 4.0', function () {
    var odataVersion = pm.response.headers.get('OData-Version');
    pm.expect(odataVersion).to.equal('4.0');
});

pm.test('Content-Type is application/json', function () {
    var contentType = pm.response.headers.get('Content-Type');
    pm.expect(contentType).to.include('application/json');
});

JSON Response Structure:

pm.test('Response has required properties', function () {
    var jsonData = pm.response.json();
    pm.expect(jsonData).to.have.property('@odata.context');
    pm.expect(jsonData).to.have.property('@odata.id');
    pm.expect(jsonData).to.have.property('@odata.type');
});

Property Value Checks:

pm.test('PowerState has valid value', function () {
    var jsonData = pm.response.json();
    pm.expect(jsonData.PowerState).to.be.oneOf(['On', 'Off', 'PoweringOn', 'PoweringOff']);
});

Error Response Validation:

pm.test('Response contains Redfish error format', function () {
    var jsonData = pm.response.json();
    pm.expect(jsonData).to.have.property('error');
    pm.expect(jsonData.error).to.have.property('code');
    pm.expect(jsonData.error).to.have.property('message');
});

Multiple Status Codes:

pm.test('Status code is 202 Accepted or 409 Conflict', function () {
    pm.expect(pm.response.code).to.be.oneOf([202, 409]);
});

Step 4: Organize Tests into Folders

Place related tests in logical folders within the collection:

  • Public Endpoints - No authentication required
  • Protected Endpoints - Basic Auth required
  • Power Control Actions - ComputerSystem.Reset operations
  • Authentication Tests - Auth validation
  • Error Handling Tests - Error scenarios

Step 5: Update Mock Data (if needed)

If testing new endpoints or data, update redfish/internal/mocks/mock_repo.go:

func NewMockComputerSystemRepo() *MockComputerSystemRepo {
    return &MockComputerSystemRepo{
        systems: map[string]*v1.ComputerSystem{
            "test-system-1": {
                OdataContext: "/redfish/v1/$metadata#ComputerSystem.ComputerSystem",
                OdataID:      "/redfish/v1/Systems/test-system-1",
                OdataType:    "#ComputerSystem.v1_0_0.ComputerSystem",
                ID:           "test-system-1",
                Name:         "Test Computer System 1",
                SystemType:   "Physical",
                PowerState:   "Off",
                // Add new properties here
            },
        },
    }
}

Step 6: Test Locally

Run tests to verify your changes:

# Kill any running servers on port 8181
lsof -ti:8181 | xargs -r kill -9
sleep 1

# Run the test suite
cd redfish/openapi/infra
make rf-integration-test

Check the output for:

  • ✓ All assertions passing
  • Correct status codes
  • Expected response format
  • No server errors in logs

Step 7: Update Environment Variables (Optional)

If your test needs new variables, add an entry to the values array in redfish/tests/postman/test-environment.json:

{
  "key": "new_variable",
  "value": "test-value",
  "enabled": true
}

Then reference in tests with {{new_variable}}.
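
The {{new_variable}} syntax is resolved in request URLs, headers, and bodies. Inside test or pre-request scripts the same value can be read programmatically; a minimal sketch is below (new_variable is just the placeholder from the snippet above):

// Read the variable from the active environment
var fromEnvironment = pm.environment.get('new_variable');

// pm.variables.get() resolves across scopes (local, environment, collection, global)
var resolved = pm.variables.get('new_variable');

pm.test('new_variable is available to scripts', function () {
    pm.expect(resolved).to.not.be.undefined;
});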

Test Organization Best Practices

1. Folder Structure

Redfish API Tests
├── Public Endpoints (no auth)
├── Protected Endpoints (Basic Auth)
├── Power Control Actions
├── Authentication Tests
└── Error Handling Tests

2. Naming Conventions

  • Folders: Descriptive category names
  • Tests: Action + Resource (e.g., "Get Specific System", "Reset System - ForceOff")
  • Assertions: Clear, readable descriptions

3. Test Coverage Checklist

For each endpoint, verify:

  • ✓ Success case (200/202)
  • ✓ Required headers (OData-Version, Content-Type)
  • ✓ Response structure (@odata properties)
  • ✓ Authentication (401 without auth)
  • ✓ Invalid input (400 Bad Request)
  • ✓ Not found (404)
  • ✓ Method not allowed (405)
  • ✓ Error response format
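
Several of these cases are illustrated in the Examples section below; the 405 case is not, so here is a minimal sketch (the Allow header check is an assumption about this implementation and can be dropped if the server does not set it):

pm.test('Status code is 405 Method Not Allowed', function () {
    pm.response.to.have.status(405);
});

// Assumption: the server advertises permitted methods via an Allow header
pm.test('Allow header is present', function () {
    pm.response.to.have.header('Allow');
});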

Advanced Testing Techniques

Using Collection Variables

Set variables during test execution:

// Save response data for later tests
var jsonData = pm.response.json();
pm.collectionVariables.set('extracted_id', jsonData.Id);

// Use in subsequent requests
// URL: {{base_url}}/redfish/v1/Systems/{{extracted_id}}

Pre-request Scripts

Execute code before sending a request:

{
  "listen": "prerequest",
  "script": {
    "exec": [
      "// Generate dynamic data",
      "var timestamp = new Date().getTime();",
      "pm.collectionVariables.set('timestamp', timestamp);"
    ]
  }
}

Conditional Tests

Skip assertions based on conditions:

pm.test('Optional property exists if condition is met', function () {
    var jsonData = pm.response.json();
    if (jsonData.PowerState === 'On') {
        pm.expect(jsonData).to.have.property('ProcessorSummary');
    }
});

Response Time Validation

pm.test('Response time is less than 200ms', function () {
    pm.expect(pm.response.responseTime).to.be.below(200);
});

Debugging Tests

View Server Logs

cat /tmp/redfish_test_server.log

Run Single Test

Use Postman GUI or Newman with specific folder:

newman run redfish-collection.json \
    --environment test-environment.json \
    --folder "Power Control Actions"

Verbose Newman Output

newman run redfish-collection.json \
    --environment test-environment.json \
    --verbose

Check Request/Response Details

Add console logging in test scripts:

console.log('Request URL:', pm.request.url.toString());
console.log('Response Body:', pm.response.text());
console.log('Response Headers:', pm.response.headers);

Continuous Integration

Tests run automatically in GitHub Actions via .github/workflows/redfish-api-test.yml:

  1. Checkout code
  2. Setup Go environment
  3. Run test script
  4. Upload test results
  5. Display summary

The workflow fails if any assertions fail.

Common Pitfalls

1. Authentication Issues

  • Public endpoints should use "auth": { "type": "noauth" }
  • Protected endpoints inherit collection-level Basic Auth
  • Verify credentials match config.yml
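
A quick way to guard against the first two points is a dedicated negative test: configure the request with "auth": { "type": "noauth" } against a protected endpoint and assert the rejection. A minimal sketch (whether the body also carries the Redfish error object on 401 depends on the implementation, so that check is left commented out):

pm.test('Protected endpoint rejects unauthenticated requests', function () {
    pm.response.to.have.status(401);
});

// Optional: enable if the implementation returns the Redfish error body on 401
// pm.test('401 response uses the Redfish error format', function () {
//     pm.expect(pm.response.json()).to.have.property('error');
// });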

2. Response Timing

  • Server startup takes time (~1-2 seconds)
  • First request may be slower
  • Mock transitions are instant; real hardware is not (a delay workaround is sketched below)
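
The runner's readiness poll covers startup, but if an individual request still needs extra settling time, a commonly used workaround is a short pause in its pre-request script (the 500 ms value below is arbitrary):

// Pre-request script: schedule an empty timer so the sandbox waits ~500 ms
// before sending the request (Postman/Newman wait for pending timers to fire)
setTimeout(function () {}, 500);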

3. JSON Structure

  • Always validate response is JSON before parsing
  • Check for null/undefined properties
  • Use optional chaining for nested properties
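
A defensive pattern for the points above, written with fallback-object guards rather than the ?. operator so it also runs on older Newman sandboxes (the PowerState and Status.Health properties are only illustrative):

pm.test('Response is JSON with the expected nesting', function () {
    var jsonData;

    // Fail with a clear message if the body is not JSON at all
    pm.expect(function () { jsonData = pm.response.json(); }).to.not.throw();

    // Check top-level properties before reaching deeper
    pm.expect(jsonData).to.have.property('PowerState');

    // Guard nested access instead of assuming Status is present
    var health = (jsonData.Status || {}).Health;
    pm.expect(health === undefined || typeof health === 'string').to.be.true;
});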

4. Port Conflicts

  • Kill existing servers: lsof -ti:8181 | xargs -r kill -9
  • Check the HTTP_PORT environment variable
  • Verify no other services on 8181

5. State Management

  • Mock repo resets between test runs
  • Power state transitions affect subsequent tests
  • Order matters for stateful operations
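
One way to keep ordered, stateful tests readable is to record the state a request is expected to produce and assert it in the follow-up request. A minimal sketch using collection variables (the expected_power_state name is illustrative, and the immediate transition relies on the mock's instant state changes):

// In the reset request's test script: remember the state we just requested
pm.test('Reset accepted', function () {
    pm.response.to.have.status(202);
});
pm.collectionVariables.set('expected_power_state', 'Off');

// In the follow-up GET request's test script: assert the transition happened
pm.test('PowerState matches the previous reset', function () {
    var jsonData = pm.response.json();
    pm.expect(jsonData.PowerState).to.equal(pm.collectionVariables.get('expected_power_state'));
});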

Test Maintenance

When to Update Tests

  1. New Redfish Endpoint: Add test coverage
  2. API Changes: Update assertions
  3. Schema Updates: Validate new properties
  4. Bug Fixes: Add regression tests
  5. DMTF Compliance: Match spec updates

Keeping Tests Aligned with OpenAPI Spec

  • Verify endpoint paths match OpenAPI definition
  • Check authentication requirements (security: [{}] vs [{"BasicAuth": []}])
  • Validate response schemas match generated types
  • Test all documented status codes
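
Beyond property-by-property checks, Postman can validate a whole response against a JSON Schema, which helps keep assertions in step with the OpenAPI-generated types. A minimal sketch (the schema below is a hand-written stand-in, not generated from the spec, so adjust the required list to match the real ComputerSystem definition):

var systemSchema = {
    type: 'object',
    required: ['@odata.id', '@odata.type', 'Id', 'Name'],
    properties: {
        PowerState: { type: 'string', enum: ['On', 'Off', 'PoweringOn', 'PoweringOff'] }
    }
};

pm.test('Response matches the expected ComputerSystem shape', function () {
    pm.response.to.have.jsonSchema(systemSchema);
});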


Examples

Example 1: Testing a GET Endpoint

{
  "name": "Get System by ID",
  "event": [{
    "listen": "test",
    "script": {
      "exec": [
        "pm.test('Status code is 200', function () {",
        "    pm.response.to.have.status(200);",
        "});",
        "",
        "pm.test('System has required properties', function () {",
        "    var system = pm.response.json();",
        "    pm.expect(system).to.have.property('Id');",
        "    pm.expect(system).to.have.property('Name');",
        "    pm.expect(system).to.have.property('PowerState');",
        "});"
      ]
    }
  }],
  "request": {
    "method": "GET",
    "url": "{{base_url}}/redfish/v1/Systems/{{system_id}}"
  }
}

Example 2: Testing a POST Action

{
  "name": "Reset System - ForceOff",
  "event": [{
    "listen": "test",
    "script": {
      "exec": [
        "pm.test('Status code is 202 Accepted', function () {",
        "    pm.response.to.have.status(202);",
        "});",
        "",
        "pm.test('OData-Version header is 4.0', function () {",
        "    pm.expect(pm.response.headers.get('OData-Version')).to.equal('4.0');",
        "});"
      ]
    }
  }],
  "request": {
    "method": "POST",
    "header": [{"key": "Content-Type", "value": "application/json"}],
    "body": {
      "mode": "raw",
      "raw": "{\"ResetType\": \"ForceOff\"}"
    },
    "url": "{{base_url}}/redfish/v1/Systems/{{system_id}}/Actions/ComputerSystem.Reset"
  }
}

Example 3: Testing Error Handling

{
  "name": "Get System - Invalid ID",
  "event": [{
    "listen": "test",
    "script": {
      "exec": [
        "pm.test('Status code is 404', function () {",
        "    pm.response.to.have.status(404);",
        "});",
        "",
        "pm.test('Response contains Redfish error format', function () {",
        "    var jsonData = pm.response.json();",
        "    pm.expect(jsonData).to.have.property('error');",
        "    pm.expect(jsonData.error).to.have.property('code');",
        "    pm.expect(jsonData.error).to.have.property('message');",
        "});"
      ]
    }
  }],
  "request": {
    "method": "GET",
    "url": "{{base_url}}/redfish/v1/Systems/invalid-id-12345"
  }
}

Contributing

When contributing tests:

  1. Follow existing patterns and conventions
  2. Add tests for both success and failure cases
  3. Document any new environment variables
  4. Update mock data if needed
  5. Verify all tests pass locally
  6. Include test updates in the same PR as code changes

Questions?

For questions or issues with integration tests, please:

  • Check this guide first
  • Review existing test examples
  • Check server logs at /tmp/redfish_test_server.log
  • Open an issue with test output and error details
