diff --git a/.gitignore b/.gitignore index ee0a9f2e0..34e4899b5 100644 --- a/.gitignore +++ b/.gitignore @@ -109,3 +109,5 @@ tools/readpcap/readpcap_* swig openh264* level-zero-* + +venv* \ No newline at end of file diff --git a/doc/validation_framework.md b/doc/validation_framework.md new file mode 100644 index 000000000..30c74b97e --- /dev/null +++ b/doc/validation_framework.md @@ -0,0 +1,552 @@ +# MTL Validation Framework + +The Media Transport Library (MTL) Validation Framework provides comprehensive testing capabilities for various aspects of the MTL, including protocol compliance, performance, and integration testing. + +## Documentation Navigation + +🚀 **Quick Setup**: [Validation Quick Start Guide](validation_quickstart.md) - Get running in 3 steps +📁 **Local README**: [tests/validation/README.md](../tests/validation/README.md) - Quick reference and test categories +🔧 **Build Guide**: [build.md](build.md) - MTL build instructions + +--- + +## Overview + +The validation framework uses pytest to organize and execute tests across various scenarios, protocols, and backend implementations. It supports both automated testing in CI/CD environments and manual testing for development and troubleshooting. 
+ +## Test Framework Structure + +The validation framework is organized into the following main components: + +- **common/**: Shared utilities for test functionality, including FFmpeg handlers, integrity verification tools, and network interface control +- **configs/**: Configuration files for test environment and network topology +- **mtl_engine/**: Core test framework components that manage test execution, application interfaces, and result reporting +- **tests/**: Test modules organized by scenario type: + - **single/**: Single-flow test scenarios for various protocols (ST2110-20/22/30/40), backends, and integrations + - **dual/**: Tests for multiple simultaneous flows + - **invalid/**: Error handling and negative test cases + +## Components Description + +### Common Utilities + +The `common/` directory contains shared utilities that provide fundamental functionality for test execution: + +- **FFmpeg Handler**: Manages FFmpeg operations for media processing and verification +- **Integrity Tools**: Provides functions for data integrity verification between source and received media +- **Network Interface Control**: Manages network interfaces required for testing + +#### gen_frames.sh + +A shell script for generating test frames for video testing: + +- Creates test patterns in various formats +- Supports different resolutions and frame rates +- Configurable color patterns and test signals +- Generates files like `ParkJoy_1080p.yuv`, test patterns, and various resolution formats + +**Prerequisites**: Requires FFmpeg with text filters enabled. + +**Usage**: +```bash +cd tests/validation/common # Must be in this directory +./gen_frames.sh # Generates test media files for validation +# Generated files will be available for test configuration +``` + +**Troubleshooting**: If you get "No such filter: 'drawtext'" errors, install a complete FFmpeg build or skip media generation. 
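After generating frames, a raw YUV file can be sanity-checked by comparing its size against the expected per-frame byte count. A minimal sketch (the helper functions are illustrative, not part of the framework; only the pixel formats come from the tool's supported list):

```python
import os

# Average bytes per pixel for the raw formats used by the framework:
# yuv422p carries two 8-bit samples per pixel (full-rate Y, half-rate U/V);
# yuv422p10le stores each 10-bit sample in a 16-bit little-endian word.
BYTES_PER_PIXEL = {"yuv422p": 2, "yuv422p10le": 4}

def frame_size(width: int, height: int, pix_fmt: str) -> int:
    """Size in bytes of a single raw frame."""
    return width * height * BYTES_PER_PIXEL[pix_fmt]

def count_frames(path: str, width: int, height: int, pix_fmt: str) -> int:
    """Frame count of a raw YUV file; raises if the size is not frame-aligned."""
    size = os.path.getsize(path)
    per_frame = frame_size(width, height, pix_fmt)
    if size % per_frame:
        raise ValueError(f"{path}: {size} bytes is not a multiple of {per_frame}")
    return size // per_frame
```

For example, a 10-frame 1920x1080 yuv422p file would be expected to occupy 1920 × 1080 × 2 × 10 = 41,472,000 bytes.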
+ +#### RxTxApp Test Tool + +**CRITICAL**: Tests require the RxTxApp tool which is not built by the main MTL build process. + +**Build Instructions** (required before running tests): +```bash +cd tests/tools/RxTxApp +meson setup build +meson compile -C build +cd ../../.. +``` + +**Location**: After building, RxTxApp is available at `tests/tools/RxTxApp/build/RxTxApp` + +**Supported Formats**: +- Resolutions: 3840x2160, 1920x1080, 1280x720, 640x360 +- Pixel formats: yuv422p, yuv422p10le +- Custom color patterns and test signals with timestamps +- Configurable frame rates and durations + +### Configuration Files + +The `configs/` directory contains YAML files that specify: + +- **Test Environment Settings**: Hardware specifications, media paths, and test parameters +- **Network Topology**: Interface configuration, IP addressing, and routing information + +#### [`test_config.yaml`](../tests/validation/configs/test_config.yaml) + +Location: `tests/validation/configs/test_config.yaml` + +Defines the test execution environment: + +**Key Parameters**: +- **build**: Path to MTL build directory +- **mtl_path**: Path to MTL installation directory +- **media_path**: Path to test media files directory +- **ramdisk.media.mountpoint**: Mount point for media RAM disk +- **ramdisk.media.size_gib**: Size of media RAM disk in GiB +- **ramdisk.pcap.mountpoint**: Mount point for packet capture RAM disk +- **ramdisk.pcap.size_gib**: Size of packet capture RAM disk in GiB + +#### [`topology_config.yaml`](../tests/validation/configs/topology_config.yaml) + +Location: `tests/validation/configs/topology_config.yaml` + +Defines the network topology and host configuration. 
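Before a run, the loaded configuration can be checked against the key parameters listed above. A hedged sketch (the validator itself is illustrative; only the key names come from `test_config.yaml`):

```python
import os

# Required top-level keys in test_config.yaml
REQUIRED_KEYS = ("build", "mtl_path", "media_path")

def validate_test_config(cfg: dict) -> list:
    """Return a list of human-readable problems found in a loaded test_config."""
    problems = [f"missing key: {key}" for key in REQUIRED_KEYS if key not in cfg]
    for key in REQUIRED_KEYS:
        value = cfg.get(key)
        if not isinstance(value, str):
            continue
        if "PLACEHOLDER" in value:
            problems.append(f"{key} still holds a placeholder value")
        elif not os.path.isdir(value):
            problems.append(f"{key} points to a missing directory: {value}")
    return problems
```

Load the YAML with `yaml.safe_load()` and fail fast if the returned list is non-empty.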
+ +### MTL Engine + +The `mtl_engine/` directory contains the core components of the framework: + +- **Execute Module**: Manages the execution flow of tests, including setup and teardown +- **Application Interfaces**: Provides interfaces to RX/TX, GStreamer, and FFmpeg applications +- **Reporting Tools**: Generates test reports and collects performance metrics + +### Test Modules + +The `tests/` directory contains test implementations organized by scenario type: + +- **Single Flow Tests**: Tests focusing on individual protocol implementations + - **ST2110-20**: Uncompressed video tests + - **ST2110-22**: Compressed video tests + - **ST2110-30**: Audio tests + - **ST2110-40**: Ancillary data tests + - Backend-specific tests (DMA, kernel socket, etc.) + - Integration tests (FFmpeg, GStreamer) + +- **Dual Flow Tests**: Tests involving multiple simultaneous flows +- **Invalid Tests**: Tests focusing on error handling and edge cases + +## Setup and Installation + +### Prerequisites + +#### 1. Build Media Transport Library First (CRITICAL) + +**⚠️ IMPORTANT**: The MTL library must be built before running validation tests! + +The tests require the RxTxApp binary and other MTL components. Follow these steps: + +```bash +# 1. Install build dependencies (see doc/build.md for your OS) +sudo apt-get update +sudo apt-get install git gcc meson python3 python3-pip pkg-config libnuma-dev libjson-c-dev libpcap-dev libgtest-dev libssl-dev +sudo pip install pyelftools ninja + +# 2. Build DPDK (required dependency) +git clone https://github.com/DPDK/dpdk.git +cd dpdk +git checkout v25.03 +git switch -c v25.03 +git am /path/to/Media-Transport-Library/patches/dpdk/25.03/*.patch +meson setup build +ninja -C build +sudo ninja install -C build +cd .. + +# 3. Build MTL +cd Media-Transport-Library +./build.sh + +# 4. Install MTL system-wide (REQUIRED for RxTxApp) +sudo ninja install -C build +sudo ldconfig + +# 5. 
Build required test tools (CRITICAL for validation) +cd tests/tools/RxTxApp +meson setup build +meson compile -C build +cd ../../.. +``` + +> **⚠️ CRITICAL**: +> - The RxTxApp tool is required for validation tests but not built by the main build process +> - RxTxApp requires MTL to be installed system-wide to build successfully +> - You must build it separately after installing MTL + +For complete build instructions, see [doc/build.md](build.md). + +#### 2. Other Prerequisites + +- **Python 3.9 or higher** +- **Test Media Files**: Input data files required for testing + - Test media files are necessary for running video, audio, and ancillary data tests + - These files are currently maintained on NFS in production environments + - For local testing, you can generate test frames using `tests/validation/common/gen_frames.sh` (see [gen_frames.sh section](#gen_framessh)) + - Configure the media file location in `configs/test_config.yaml` using the `media_path` parameter +- **Network Interfaces**: Configure interfaces according to MTL's [run.md](run.md) documentation + - Basic MTL network setup must be completed as described in run.md + - Virtual Functions (VFs) will be created automatically by the validation framework + - No manual VF creation is required +- **Root User Privileges**: MTL validation framework must run as root user + - Required for network management operations performed by `script/nicctl.sh` + - Direct network interface manipulation requires root access + - No alternative permission model is currently supported + - Use `sudo` with the full path to your virtual environment Python (e.g., `sudo ./venv/bin/python3`) +- **FFmpeg and GStreamer Plugins**: Required for integration tests + - Install FFmpeg: `sudo apt-get install ffmpeg` + - Install GStreamer and plugins: `sudo apt-get install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad` + - Some tests will fail if these are not installed + +### Environment Setup 
+ +> **🚀 Quick Setup**: See [Validation Quick Start Guide](validation_quickstart.md) for streamlined setup steps. + +For detailed setup: + +1. Create Python virtual environment in `tests/validation/`: + +```bash +cd tests/validation +python3 -m venv venv +source venv/bin/activate +pip install -r requirements.txt +pip install -r common/integrity/requirements.txt +``` + +### Configuration + +#### Critical Configuration Steps + +1. **Update [`configs/topology_config.yaml`](../tests/validation/configs/topology_config.yaml)** with your actual network interface details: + +```yaml +--- +metadata: + version: '2.4' +hosts: + - name: host + instantiate: true + role: sut + network_interfaces: + - pci_device: 8086:1592 # Update with your NIC's PCI device ID + interface_index: 0 + connections: + - ip_address: 127.0.0.1 # Use actual IP for remote hosts + connection_type: SSHConnection + connection_options: + port: 22 + username: root # ⚠️ MUST be root for MTL validation + password: None # Use key-based auth when possible + key_path: /root/.ssh/id_rsa # Update path to your SSH key +``` + +**Device Specification Options**: +You can specify network devices in multiple ways: +- **PCI device ID** (recommended): `"0000:18:00.0"` (find with `lspci | grep Ethernet`) +- **Interface name**: `"enp175s0f0np0"` (find with `ip link show`) +- **System name**: Use your actual system hostname in the `name` field for the host +- **Environment variables**: `"${TEST_PF_PORT_P}"` (if you set them) + +**To find your device options**: +```bash +# Find PCI device IDs +lspci | grep Ethernet + +# Find system interface names +ip link show +``` + +2. 
**Update [`configs/test_config.yaml`](../tests/validation/configs/test_config.yaml)** with your environment paths: + +```yaml +build: /path/to/Media-Transport-Library/ # Update to your MTL root directory +mtl_path: /path/to/Media-Transport-Library/ # Update to your MTL root directory +media_path: /mnt/media # Update to your test media location +capture_cfg: + enable: false # Set to true if you want packet capture + test_name: test_name + pcap_dir: /mnt/ramdisk/pcap + capture_time: 5 + interface: null # Set to interface name if capture enabled +ramdisk: + media: + mountpoint: /mnt/ramdisk/media + size_gib: 32 + pcap: + mountpoint: /mnt/ramdisk/pcap + size_gib: 768 +``` + +**Important**: +- Set `build` and `mtl_path` to your actual MTL installation directory +- Set `media_path` to where your test media files are located +- Ensure the paths exist and are accessible + +#### Optional: Create VFs for Advanced Testing + +For NIC testing with Virtual Functions: + +```bash +# First, identify your network devices +lspci | grep Ethernet + +# Create VFs (replace with your actual PCI device IDs or interface names) +sudo ./script/nicctl.sh create_vf "0000:18:00.0" # Replace with your primary port +sudo ./script/nicctl.sh create_vf "0000:18:00.1" # Replace with your secondary port +``` + +**Examples of valid identifiers**: +- PCI device ID: `"0000:18:00.0"` +- Interface name: `"enp24s0f0"` +- Environment variables: `"${TEST_PF_PORT_P}"` (if you set them) + +## Running Tests + +> **⚠️ CRITICAL**: Tests must be run as **root user**, not regular user. 
MTL validation framework requires root privileges for network operations.

### Basic Test Execution

**Examples of running tests with specific parameters**:
```bash
# Run fps test with specific parameters
sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]"

# Run specific integrity test with resolution parameters
sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/dual/st20p/integrity/test_integrity.py::test_integrity[yuv422p10le-1920x1080]"

# Run specific packing test
sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/dual/st20p/packing/test_packing.py::test_packing[bpm-10]"

# Run audio format test with specific format
sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/dual/st30p/st30p_format/test_st30p_format.py::test_st30p_format[pcm24]"
```

> **🚀 Quick Test Execution**: See [Quick Start Guide](validation_quickstart.md#3-run-tests) for basic test commands.
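The parametrized test IDs above contain spaces, pipes, and brackets, so they must be quoted when passed to a shell. A small illustrative sketch (this helper is not part of the framework) of building a safely quoted invocation:

```python
import shlex

def pytest_command(node_id: str,
                   topology: str = "configs/topology_config.yaml",
                   test_config: str = "configs/test_config.yaml") -> str:
    """Build a shell-safe pytest invocation for one parametrized test."""
    args = [
        "sudo", "./venv/bin/python3", "-m", "pytest",
        f"--topology_config={topology}",
        f"--test_config={test_config}",
        node_id,  # e.g. a test ID such as test_fps[|fps = p60|-ParkJoy_1080p]
    ]
    # shlex.quote wraps any argument containing shell metacharacters in quotes
    return " ".join(shlex.quote(arg) for arg in args)
```

`shlex.quote` leaves plain arguments untouched and single-quotes the test ID, so the command can be pasted into a shell unchanged.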
For comprehensive test execution:

### Running Specific Tests with Parameters

Run all tests with a given marker (for example, `smoke`):

```bash
sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke
```

Run specific test modules:

```bash
sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml tests/single/st20p/test_st20p_rx.py
```

Run specific test cases with parameters:

```bash
sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]"
```

### Test Categories

The tests are categorized with markers that can be used to run specific test groups:

- `@pytest.mark.smoke`: Basic functionality tests for quick validation
- `@pytest.mark.nightly`: Comprehensive tests for nightly runs
- `@pytest.mark.performance`: Performance benchmarking tests
- `@pytest.mark.dma`: Tests specific to DMA functionality
- `@pytest.mark.fwd`: Tests for packet forwarding
- `@pytest.mark.kernel_socket`: Tests for kernel socket backend
- `@pytest.mark.xdp`: Tests for XDP backend
- `@pytest.mark.gpu`: Tests involving GPU processing

### Generating HTML Reports

You can generate comprehensive HTML reports for test results that include test status, execution time, and detailed logs:

```bash
sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke --template=html/index.html --report=report.html
```

The generated report (`report.html`) provides:
- Test execution summary and statistics
- Detailed pass/fail status for each test
- Execution time and performance metrics
+- Error logs and tracebacks for failed tests +- System information for better debugging context + +### Test Output and Reports + +- Logs are written to `pytest.log` +- Test results are displayed in the console +- HTML reports can be generated as described above +- CSV reports can be generated for performance benchmarks + +## Extending the Framework + +### Adding New Tests + +1. Create a new test file in the appropriate directory under `tests/` +2. Import the required fixtures from `conftest.py` +3. Implement test functions using pytest conventions +4. Add appropriate markers for test categorization + +Example: + +```python +import pytest +from mtl_engine.RxTxApp import RxTxApp + +@pytest.mark.smoke +@pytest.mark.st20p +def test_st20p_basic_flow(setup_interfaces, media_files): + """Test basic ST2110-20 flow from TX to RX""" + app = RxTxApp(setup_interfaces) + + # Test implementation + result = app.run_st20p_test(media_files["1080p"]) + + # Assertions + assert result.success, "ST2110-20 flow test failed" + assert result.packet_loss == 0, "Packet loss detected" +``` + +### Adding New Functionality + +To add new functionality to the framework: + +1. Add utility functions in the appropriate module under `common/` +2. Update the relevant application interface in `mtl_engine/` +3. Document the new functionality in code comments +4. Add tests that exercise the new functionality + +## Troubleshooting + +### Common Issues + +#### RxTxApp Command Not Found +**Error**: `sudo: ./tests/tools/RxTxApp/build/RxTxApp: command not found` +**Solution**: The MTL library hasn't been built yet. Follow the build instructions in the Prerequisites section above or see [doc/build.md](build.md). 
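A quick preflight check can catch a missing binary before a full test run. A sketch under the assumption that the checkout layout matches the build instructions above (the function and its messages are illustrative):

```python
import os

# Location of the test tool relative to the MTL checkout root
RXTXAPP_REL = "tests/tools/RxTxApp/build/RxTxApp"

def rxtxapp_status(mtl_root: str) -> str:
    """Classify the state of the RxTxApp binary under an MTL checkout."""
    path = os.path.join(mtl_root, RXTXAPP_REL)
    if not os.path.exists(path):
        return "missing: build it with 'meson setup build && meson compile -C build'"
    if not os.access(path, os.X_OK):
        return "not executable: check file permissions"
    return "ok"
```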
+ +#### Virtual Environment Issues +**Problem**: Package installation conflicts or wrong Python interpreter +**Solution**: +```bash +# Remove existing venv and recreate +rm -rf venv +python3 -m venv venv +source venv/bin/activate +pip install -r requirements.txt +``` + +#### Configuration File Issues +**Problem**: Tests fail with connection or path errors +**Solution**: +- Verify `configs/test_config.yaml` has correct paths (especially `build` and `mtl_path`) +- Update `configs/topology_config.yaml` with actual network interface details +- Use `lspci | grep Ethernet` to find your PCI device IDs + +#### Network Interface Problems +**Problem**: Interface configuration errors +**Solution**: Ensure interfaces are properly configured and have the correct IP addresses + +#### Permission Issues +**Problem**: Network operation failures +**Solution**: Many tests require root privileges for network operations. Run with appropriate sudo permissions. + +#### Build and Setup Issues + +**Problem**: `RxTxApp: command not found` +**Solution**: Build the RxTxApp test tool separately: +```bash +cd tests/tools/RxTxApp +meson setup build +meson compile -C build +cd ../../.. 
+``` + +**Problem**: RxTxApp build fails with "ST20P_TX_FLAG_EXACT_USER_PACING undeclared" or other header errors +**Solution**: Install MTL system-wide before building RxTxApp: +```bash +cd /path/to/Media-Transport-Library +sudo ninja install -C build +sudo ldconfig +# Then build RxTxApp +cd tests/tools/RxTxApp +rm -rf build # Clean previous failed build +meson setup build +meson compile -C build +``` + +**Problem**: `No module named pytest` when using sudo +**Solution**: Use the virtual environment python with sudo: +```bash +# Wrong: sudo python3 -m pytest +# Correct: +sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml +``` + +**Problem**: DSA SSH key errors: `ValueError: q must be exactly 160, 224, or 256 bits long` +**Solution**: Generate new RSA SSH keys and configure SSH access: +```bash +# Generate RSA keys (as your regular user, not root) +ssh-keygen -t rsa -b 2048 -f ~/.ssh/id_rsa + +# Set up SSH access for root@localhost +ssh-copy-id -i ~/.ssh/id_rsa.pub root@localhost + +# Update topology_config.yaml to use your user's key path: +# key_path: /home/your-username/.ssh/id_rsa (not /root/.ssh/id_rsa) +``` + +**Problem**: FFmpeg `No such filter: 'drawtext'` when running gen_frames.sh +**Solution**: Install complete FFmpeg build or skip media generation: +```bash +sudo apt install ffmpeg # Full installation +# Or skip: some tests may work without generated media +``` + +#### Media File Access +**Problem**: Media files not found +**Solution**: Verify that test media files are available and accessible at the path specified in `media_path` + +#### Test Timeouts +**Problem**: Tests timing out on slower systems +**Solution**: Increase timeout values in test_config.yaml for slower systems + +### Quick Reference Tables + +#### Build Issues + +| Problem | Solution | +|---------|----------| +| `RxTxApp: command not found` | Build RxTxApp: `cd tests/tools/RxTxApp && meson setup build && meson compile 
-C build` | +| `MTL library not found` | Install MTL system-wide: `sudo ninja install -C build && sudo ldconfig` | +| `DSA key error: q must be exactly 160, 224, or 256 bits` | Generate RSA keys: `ssh-keygen -t rsa -b 2048 -f ~/.ssh/id_rsa` | + +#### Runtime Issues + +| Problem | Solution | +|---------|----------| +| `Permission denied` | Use root user: `sudo ./venv/bin/python3 -m pytest` | +| `No module named pytest` | Don't use `sudo python3`, use `sudo ./venv/bin/python3` | +| `Config path errors` | Update placeholder paths in config files | +| `SSH connection failed` | Ensure SSH keys are set up for root@localhost access | +| `No such filter: 'drawtext'` | Install FFmpeg with text filters or skip media generation | + +### Debugging Tests + +Use pytest's debug features: + +```bash +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -v --pdb tests/single/st20p/test_st20p_rx.py +``` + +Increase log verbosity: + +```bash +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml --log-cli-level=DEBUG tests/single/st20p/test_st20p_rx.py +``` diff --git a/doc/validation_quickstart.md b/doc/validation_quickstart.md new file mode 100644 index 000000000..2a847096a --- /dev/null +++ b/doc/validation_quickstart.md @@ -0,0 +1,109 @@ +# MTL Validation Framework - Quick Start Guide + +This quick start guide helps you get the MTL validation framework running with minimal setup. For detailed information, see the [complete validation framework documentation](validation_framework.md). + +## Prerequisites + +1. **🏗️ MTL Build Complete**: MTL must be built and test tools available + 👉 **[Follow complete build instructions](validation_framework.md#setup-and-installation)** + +2. 
**📋 Basic Requirements**: + - Python 3.9+ + - Root user access (MTL validation requires root privileges for network operations) + - Network interfaces configured per MTL's [run.md](run.md) (VFs created automatically) + - Test media files (see [media generation](validation_framework.md#gen_framessh) or use NFS-hosted files) + - FFmpeg and GStreamer plugins (required for integration tests) + - Compatible SSH keys (RSA recommended, not DSA) + +## Quick Setup (3 steps) + +### 1. Install Dependencies +**Run in tests/validation directory**: +```bash +cd tests/validation +python3 -m venv venv +source venv/bin/activate +pip install -r requirements.txt # Main framework requirements +pip install -r common/integrity/requirements.txt # Integrity test components +``` + +### 2. Configure Environment +Update two key files: + +**[tests/validation/configs/topology_config.yaml](../tests/validation/configs/topology_config.yaml)**: +```yaml +# Key settings to update: +username: root # Must be root for MTL operations +key_path: /home/your-username/.ssh/id_rsa # YOUR user's SSH key path (not /root/) +ip_address: 127.0.0.1 # For localhost testing +port: 22 # Standard SSH port +``` + +> **⚠️ SSH Key Setup**: +> - Use your regular user's SSH keys (e.g., `/home/gta/.ssh/id_rsa`), not root's keys +> - If you get DSA key errors, generate new RSA keys: +> ```bash +> ssh-keygen -t rsa -b 2048 -f ~/.ssh/id_rsa +> ssh-copy-id -i ~/.ssh/id_rsa.pub root@localhost +> ``` + +**[tests/validation/configs/test_config.yaml](../tests/validation/configs/test_config.yaml)**: +```yaml +# Replace MTL_PATH_PLACEHOLDER with your actual paths: +build: /home/gta/Media-Transport-Library/ +mtl_path: /home/gta/Media-Transport-Library/ +``` + +### 3. 
Run Tests
**Basic smoke test** (must run as root):
```bash
cd tests/validation
# Use full path to venv python with sudo:
sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke -v
```

> **💡 Root Execution**: Don't use `sudo python3` (uses system python). Use `sudo ./venv/bin/python3` to use the virtual environment.

**Run specific test with parameters**:
```bash
sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]"
```

## Optional: Create VFs for Advanced Testing

If you need VFs for NIC testing:
```bash
# Find your network device first
lspci | grep Ethernet

# Create VFs (replace with your device identifier)
sudo ./script/nicctl.sh create_vf ${TEST_PF_PORT_P}
sudo ./script/nicctl.sh create_vf ${TEST_PF_PORT_R}
```

## Quick Troubleshooting

| Common Error | Quick Solution |
|--------------|----------------|
| `RxTxApp: command not found` | [Follow build instructions](validation_framework.md#rxtxapp-test-tool) |
| `Permission denied` | Use root: `sudo ./venv/bin/python3 -m pytest` |
| `No module named pytest` | Don't use `sudo python3`, use `sudo ./venv/bin/python3` |
| `Config path errors` | Update placeholder paths in config files |
| SSH/FFmpeg issues | See [detailed troubleshooting](validation_framework.md#troubleshooting) |

## Generate Test Media (Optional)

For video testing, you may need test media files:
👉 **[See media generation instructions](validation_framework.md#gen_framessh)**

---

## Documentation Navigation

📖 **Complete Documentation**: [Validation Framework](validation_framework.md) - Detailed information, configuration, and advanced features
🔧 **Build Issues**: [Build Guide](build.md) - MTL build instructions
⚙️ **Configuration Help**: [Configuration Guide](configuration_guide.md) - Network and environment setup
+ +## Summary + +This quick start guide gets you running tests in minutes. For production use, detailed configuration, or troubleshooting complex issues, refer to the complete documentation above. \ No newline at end of file diff --git a/tests/validation/README.md b/tests/validation/README.md new file mode 100644 index 000000000..3e8af5c9f --- /dev/null +++ b/tests/validation/README.md @@ -0,0 +1,63 @@ +# MTL Validation Framework + +The Media Transport Library (MTL) Validation Framework provides comprehensive testing capabilities for various aspects of the MTL, including protocol compliance, performance, and integration testing. + +## Documentation Navigation + +📖 **Complete Documentation**: [Main validation framework documentation](../../doc/validation_framework.md) - Detailed configuration, troubleshooting, and advanced features +🚀 **Quick Start**: [Validation Quick Start Guide](../../doc/validation_quickstart.md) - Get running in 3 steps +🔧 **Build Issues**: [Build Guide](../../doc/build.md) - MTL build instructions + +--- + +## Quick Setup + +### Prerequisites + +- Python 3.9 or higher +- **⚠️ CRITICAL**: Media Transport Library built and installed (see [build instructions](../../doc/build.md)) +- **Test Media Files**: Input data files are necessary for video, audio, and ancillary data tests + - Files are currently maintained on NFS in production environments + - For local testing, generate frames using `common/gen_frames.sh` (see [documentation](../../doc/validation_framework.md#gen_framessh)) + - Configure media location in `configs/test_config.yaml` +- **Network Interfaces**: Configure according to MTL's [run.md](../../doc/run.md) documentation + - Basic MTL network setup required (see run.md) + - VFs will be created automatically by the validation framework +- **Root Privileges Required**: MTL validation must run as root user + - Required for network management operations + - No alternative permission model available + - Use `sudo ./venv/bin/python3` to run 
tests +- **FFmpeg and GStreamer Plugins**: Required for integration tests + - Install with: `sudo apt-get install ffmpeg gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad` + +### Setup in 3 Simple Steps + +1. **🏗️ MTL Build**: Ensure MTL and test tools are built + 👉 **[Complete build instructions](../../doc/validation_framework.md#setup-and-installation)** + +2. **⚡ Quick Setup**: Follow 3-step setup process + 👉 **[Quick Start Guide](../../doc/validation_quickstart.md)** + +3. **🏃 Run Tests**: Execute validation tests + ```bash + # Quick smoke test + sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke + ``` + +## Available Tests + +The framework includes tests for: + +- **Media Flow Tests**: ST2110-20 (video), ST2110-22 (compressed video), ST2110-30 (audio), ST2110-40 (ancillary data) +- **Backend Tests**: DMA, Kernel Socket, XDP +- **Integration Tests**: FFmpeg, GStreamer +- **Performance Tests**: Throughput, latency, and other metrics + +Run tests by category using pytest markers: +```bash +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m [marker] +``` + +Available markers: `smoke`, `nightly`, `performance`, `dma`, `kernel_socket`, `xdp`, etc. + +For more detailed information about configuration options, troubleshooting, and extending the framework, please refer to the [complete documentation](/doc/validation_framework.md). diff --git a/tests/validation/common/README.md b/tests/validation/common/README.md new file mode 100644 index 000000000..75d3c1b01 --- /dev/null +++ b/tests/validation/common/README.md @@ -0,0 +1,103 @@ +# Common Test Utilities + +This directory contains shared utilities used across the Media Transport Library validation test suite. These utilities provide common functionality for network interface management, media integrity verification, and FFmpeg handling. 
+ +## Components + +### nicctl.py + +The `nicctl.py` module provides a `Nicctl` class for network interface control: + +- Interface configuration and management +- PCI device binding and unbinding +- Link status monitoring +- MTU and other interface parameter configuration + +Example usage: + +```python +from common.nicctl import Nicctl + +# Create a network interface controller +nic = Nicctl() + +# Configure interface +nic.configure_interface("enp1s0f0", "192.168.1.10", "255.255.255.0") + +# Check link status +status = nic.get_link_status("enp1s0f0") +``` + +### integrity/ + +This directory contains tools for verifying data integrity in media transport tests: + +- Pixel comparison utilities for video integrity checks +- Audio sample verification +- Ancillary data integrity checks +- Error statistics calculation + +Key modules: + +- `video_integrity.py`: Functions for comparing video frames before and after transport +- `audio_integrity.py`: Functions for comparing audio samples +- `ancillary_integrity.py`: Functions for comparing ancillary data + +### ffmpeg_handler/ + +This directory contains utilities for FFmpeg integration: + +- FFmpeg command generation +- Output parsing and analysis +- Media format detection and conversion +- Encoder and decoder integration + +Key modules: + +- `ffmpeg_cmd.py`: Functions for generating FFmpeg command lines +- `ffmpeg_output.py`: Functions for parsing and analyzing FFmpeg output +- `ffmpeg_formats.py`: Media format definitions and utilities + +### gen_frames.sh + +A shell script for generating test frames for video testing: + +- Creates test patterns in various formats +- Supports different resolutions and frame rates +- Configurable color patterns and test signals + +## Using Common Utilities in Tests + +These utilities are imported and used by test modules to set up test environments, execute tests, and validate results. 
+ +Example: + +```python +from common.nicctl import Nicctl +from common.integrity.video_integrity import compare_frames + +def test_st20_transport(): + # Configure network interfaces + nic = Nicctl() + nic.configure_interface("enp1s0f0", "192.168.1.10", "255.255.255.0") + + # Run transport test + # ... + + # Verify frame integrity + result = compare_frames("reference_frame.yuv", "received_frame.yuv") + assert result.match_percentage > 99.9, "Frame integrity check failed" +``` + +## Extending Common Utilities + +To add new common utilities: + +1. Create new Python modules in the appropriate subdirectory +2. Document the module's purpose and API +3. Import the new utilities in test modules as needed + +## License + +BSD-3-Clause License +Copyright (c) 2024-2025 Intel Corporation diff --git a/tests/validation/configs/README.md b/tests/validation/configs/README.md new file mode 100644 index 000000000..9e4eb0e7b --- /dev/null +++ b/tests/validation/configs/README.md @@ -0,0 +1,252 @@ +# Test Configuration + +This directory contains configuration files for the Media Transport Library validation test suite. These files define the test environment, network topology, and test parameters. + +## ⚠️ Critical Setup Required + +**BEFORE RUNNING TESTS**: You must update the placeholder values in these configuration files with your actual system details. Tests will fail with default placeholder values. 
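One way to catch leftover placeholders before a run is a short scan over the config file text. A minimal sketch (the `PLACEHOLDER` convention matches the templates below; the helper itself is illustrative):

```python
def find_placeholders(text: str) -> list:
    """Return (line_number, stripped_line) pairs still containing a placeholder."""
    return [
        (number, line.strip())
        for number, line in enumerate(text.splitlines(), start=1)
        if "PLACEHOLDER" in line
    ]
```

Running it over the contents of `test_config.yaml` and `topology_config.yaml` lists every line that still needs editing.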
+ +## Configuration Files + +### [`test_config.yaml`](test_config.yaml) + +**File Location**: `tests/validation/configs/test_config.yaml` + +This file contains general test environment settings: + +```yaml +build: MTL_PATH_PLACEHOLDER # ⚠️ UPDATE: Path to your MTL installation +mtl_path: MTL_PATH_PLACEHOLDER # ⚠️ UPDATE: Same as build path +media_path: /mnt/media # ⚠️ UPDATE: Path to your test media files +capture_cfg: + enable: false + test_name: test_name + pcap_dir: /mnt/ramdisk/pcap + capture_time: 5 + interface: null +ramdisk: + media: + mountpoint: /mnt/ramdisk/media + size_gib: 32 + pcap: + mountpoint: /mnt/ramdisk/pcap + size_gib: 768 +``` + +### Key Parameters + +- **build**: Path to the Media Transport Library build directory +- **mtl_path**: Path to the Media Transport Library installation +- **media_path**: Path to the directory containing test media files + +### ⚠️ Setup Instructions + +1. **Replace `MTL_PATH_PLACEHOLDER`** with your actual MTL installation path: + ```bash + # Example: if MTL is in /home/user/Media-Transport-Library/ + build: /home/user/Media-Transport-Library/ + mtl_path: /home/user/Media-Transport-Library/ + ``` + +2. **Update `media_path`** to point to your test media files location + +3. 
**Verify the paths exist**:

   ```bash
   ls /path/to/your/Media-Transport-Library/build
   ls /path/to/your/media/files/
   ```

### Other Parameters

- **capture_cfg**: Network packet capture configuration
  - **enable**: Enable/disable packet capture
  - **test_name**: Name prefix for capture files
  - **pcap_dir**: Directory to store capture files
  - **capture_time**: Duration of packet capture in seconds
  - **interface**: Network interface to capture from
- **ramdisk**: RAM disk configuration for high-performance testing
  - **media.mountpoint**: Mount point for media RAM disk
  - **media.size_gib**: Size of media RAM disk in GiB
  - **pcap.mountpoint**: Mount point for packet capture RAM disk
  - **pcap.size_gib**: Size of packet capture RAM disk in GiB

### [`topology_config.yaml`](topology_config.yaml)

**File Location**: `tests/validation/configs/topology_config.yaml`

This file defines the network topology for testing:

```yaml
---
metadata:
  version: '2.4'
hosts:
  - name: host
    instantiate: true
    role: sut
    network_interfaces:
      - pci_device: 8086:1592  # ⚠️ UPDATE: Your NIC's PCI device ID
        interface_index: 0  # all
    connections:
      - ip_address: IP_ADDRESS_PLACEHOLDER  # ⚠️ UPDATE: Your system IP
        connection_type: SSHConnection
        connection_options:
          port: SSH_PORT_PLACEHOLDER  # ⚠️ UPDATE: SSH port (usually 22)
          username: USERNAME_PLACEHOLDER  # ⚠️ UPDATE: Your SSH username (e.g. root)
          password: None
          key_path: KEY_PATH_PLACEHOLDER  # ⚠️ UPDATE: Path to your SSH key
```

### Topology Setup Instructions

1. **Find your PCI device ID**:

   ```bash
   lspci | grep Ethernet
   # Look for output like: 86:00.0 Ethernet controller: Intel Corporation...
   # Use format: 8086:XXXX (8086 = Intel vendor ID)
   ```

2. 
**Update placeholder values**: + ```yaml + # Replace placeholders with actual values: + ip_address: 127.0.0.1 # For localhost, or your actual IP + port: 22 # SSH port + username: your_actual_user # Your username + key_path: /home/your_user/.ssh/id_rsa # Path to your SSH key + ``` + +3. **Verify SSH key exists**: + ```bash + ls -la ~/.ssh/id_rsa + # If missing, generate one: ssh-keygen -t rsa -b 4096 + ``` + +### Topology Parameters + +- **metadata.version**: Configuration format version +- **hosts**: List of hosts in the test topology + - **name**: Host identifier + - **instantiate**: Whether to instantiate the host + - **role**: Host role (e.g., sut for System Under Test) + - **network_interfaces**: List of network interfaces + - **pci_device**: PCI device ID + - **interface_index**: Interface index + - **connections**: List of connections to the host + - **ip_address**: Host IP address + - **connection_type**: Type of connection + - **connection_options**: Connection parameters + - **port**: SSH port + - **username**: SSH username + - **password**: SSH password (or None for key-based authentication) + - **key_path**: Path to SSH private key + +## Customizing Configurations + +### Usage Examples + +Here are practical examples of how to use these configuration files: + +#### Example 1: Running Tests with Custom Configurations + +```bash +# Navigate to validation directory +cd tests/validation + +# Activate virtual environment +source venv/bin/activate + +# Run all smoke tests with your configurations +sudo ./venv/bin/python3 -m pytest \ + --topology_config=configs/topology_config.yaml \ + --test_config=configs/test_config.yaml \ + -m smoke -v + +# Run a specific test category +sudo ./venv/bin/python3 -m pytest \ + --topology_config=configs/topology_config.yaml \ + --test_config=configs/test_config.yaml \ + -m st20p -v + +# Run a specific test with parameters +sudo ./venv/bin/python3 -m pytest \ + --topology_config=configs/topology_config.yaml \ + 
--test_config=configs/test_config.yaml \ + "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]" -v +``` + +#### Example 2: Using Environment-Specific Configurations + +```bash +# Create a local configuration for your environment +cp configs/test_config.yaml configs/test_config.local.yaml +cp configs/topology_config.yaml configs/topology_config.local.yaml + +# Edit local files with your settings +vim configs/test_config.local.yaml +vim configs/topology_config.local.yaml + +# Run tests with local configurations +sudo ./venv/bin/python3 -m pytest \ + --topology_config=configs/topology_config.local.yaml \ + --test_config=configs/test_config.local.yaml \ + -m smoke -v +``` + +#### Example 3: Complete Setup Workflow + +```bash +# 1. Update test_config.yaml +sed -i 's|MTL_PATH_PLACEHOLDER|/home/user/Media-Transport-Library|g' configs/test_config.yaml + +# 2. Update topology_config.yaml +sed -i 's|IP_ADDRESS_PLACEHOLDER|127.0.0.1|g' configs/topology_config.yaml +sed -i 's|SSH_PORT_PLACEHOLDER|22|g' configs/topology_config.yaml +sed -i 's|USERNAME_PLACEHOLDER|root|g' configs/topology_config.yaml +sed -i 's|KEY_PATH_PLACEHOLDER|/home/user/.ssh/id_rsa|g' configs/topology_config.yaml + +# 3. Verify your configuration +cat configs/test_config.yaml +cat configs/topology_config.yaml + +# 4. Run tests +cd tests/validation +source venv/bin/activate +sudo ./venv/bin/python3 -m pytest \ + --topology_config=configs/topology_config.yaml \ + --test_config=configs/test_config.yaml \ + -m smoke -v +``` + +### Environment-Specific Configuration + +To customize the configuration for different environments, create copies of these files with environment-specific settings: + +1. Copy `test_config.yaml` to `test_config.local.yaml` +2. Modify the parameters as needed +3. The test framework will prioritize `.local.yaml` files over the default ones + +### Temporary Configuration Changes + +For temporary configuration changes during test development: + +1. 
Modify the parameters directly in the YAML files +2. Run your tests +3. Revert changes when done or use git to discard changes + +### Programmatic Configuration Overrides + +Test modules can programmatically override configuration values: + +```python +def test_with_custom_config(config): + # Override configuration for this test + config.capture_cfg.enable = True + config.capture_cfg.interface = "enp1s0f0" + + # Run test with modified configuration + # ... +``` + +## License + +BSD-3-Clause License +Copyright (c) 2024-2025 Intel Corporation diff --git a/tests/validation/mtl_engine/README.md b/tests/validation/mtl_engine/README.md new file mode 100644 index 000000000..278db5c3b --- /dev/null +++ b/tests/validation/mtl_engine/README.md @@ -0,0 +1,130 @@ +# MTL Test Engine + +This directory contains the core components of the Media Transport Library validation test framework. The test engine provides utilities and abstractions for test execution, application management, and result reporting. 
+ +## Components + +### execute.py + +The `execute.py` module provides functionality for executing commands and managing processes: + +- `RaisingThread`: A thread implementation that passes exceptions back to the caller +- `AsyncProcess`: Manages asynchronous process execution with output handling +- Functions for command execution with timeout and output handling + +### RxTxApp.py + +Provides a base class for RX/TX application interfaces used in testing: + +- Application lifecycle management (start, stop, monitoring) +- Common configuration parameters for media transport applications +- Interface for test result collection and reporting + +### GstreamerApp.py + +GStreamer-specific application interface for testing GStreamer integration: + +- Pipeline creation and management for GStreamer-based tests +- Configuration for GStreamer elements and properties +- Media processing validation utilities + +### ffmpeg_app.py + +FFmpeg-specific application interface for testing FFmpeg integration: + +- FFmpeg command generation and execution +- Output parsing and validation +- Support for various FFmpeg encoding/decoding options + +### csv_report.py + +Utilities for test result reporting in CSV format: + +- `csv_add_test`: Adds a test result to the report +- `csv_write_report`: Writes the report to a file +- `update_compliance_result`: Updates compliance-related results + +### integrity.py + +Data integrity verification tools: + +- Functions to verify media data integrity after transport +- Pixel comparison and error detection +- Statistical analysis of media quality + +### ramdisk.py + +RAM disk management for high-performance media testing: + +- `Ramdisk` class: Creates, mounts, and manages RAM disks +- Support for configurable size and mount points +- Cleanup and resource management + +### const.py + +Defines constants used throughout the test framework: + +- Log levels and directories +- Default parameter values +- Test categorization constants + +### stash.py + +Provides 
a mechanism for storing and retrieving test data:

- Functions for stashing test results, logs, and notes
- Media file tracking and cleanup
- Issue tracking and reporting

### media_creator.py and media_files.py

Utilities for test media management:

- Media file creation for different formats and codecs
- Reference media handling for comparison tests
- Media metadata management

## Usage

The test engine components are typically used by test modules and pytest fixtures to set up test environments, execute test cases, and validate results.

Example usage in a test module:

```python
from mtl_engine.execute import run_command
from mtl_engine.RxTxApp import RxTxApp
from mtl_engine.csv_report import csv_add_test

def test_st20_rx(config):
    # Set up the application (config is supplied by a pytest fixture)
    app = RxTxApp(config)

    # Start the application
    app.start()

    # Run commands and validate results
    result = run_command("some_validation_command")

    # Stop the application before evaluating results
    app.stop()

    # Add the test result to the report
    csv_add_test("st20_rx", result.success)

    # Assert test conditions
    assert result.success, "Test failed"
```

## Configuration

Most test engine components are configurable via the `test_config.yaml` and `topology_config.yaml` files in the `configs/` directory. See the main README.md for details on configuring these files.

## Extension Points

To extend the test engine with new functionality:

1. For new application types, create a subclass of `RxTxApp` with an application-specific implementation
2. For new validation methods, add functions to `integrity.py` or create new modules
3. 
For new reporting formats, extend `csv_report.py` with additional report generation functions + +## License + +BSD-3-Clause License +Copyright (c) 2024-2025 Intel Corporation diff --git a/tests/validation/tests/README.md b/tests/validation/tests/README.md new file mode 100644 index 000000000..a50f8dc11 --- /dev/null +++ b/tests/validation/tests/README.md @@ -0,0 +1,165 @@ +# Validation Test Modules + +This directory contains the test modules for the Media Transport Library validation test suite. The tests are organized into categories based on test scope and functionality. + +## Test Categories + +### Single Flow Tests (`single/`) + +Tests for single-flow scenarios, where a single source transmits to a single destination: + +- **dma/**: Tests for Direct Memory Access functionality + - Memory allocation and management + - DMA transfer performance and reliability + - Error handling and recovery + +- **ffmpeg/**: Tests for FFmpeg integration + - FFmpeg plugin functionality + - Encoding and decoding with FFmpeg + - Format conversion and compatibility + +- **gstreamer/**: Tests for GStreamer integration + - GStreamer plugin functionality + - Pipeline creation and management + - Element functionality and compatibility + +- **kernel_socket/**: Tests for kernel socket backend + - Socket creation and management + - Packet transmission and reception + - Performance and reliability + +- **performance/**: Performance benchmarking tests + - Throughput measurements + - Latency tests + - CPU and memory usage analysis + +- **ptp/**: Precision Time Protocol tests + - Clock synchronization + - Timestamp accuracy + - PTP profile compatibility + +- **rss_mode/**: Tests for Receive Side Scaling modes + - RSS configuration + - Multi-queue performance + - Load balancing effectiveness + +- **rx_timing/**: Tests for reception timing compliance + - Packet timing analysis + - Compliance with ST2110-21 timing specifications + - Jitter measurements + +- **st20p/**: Tests for ST2110-20 video 
transport
  - Uncompressed video transmission and reception
  - Format compatibility
  - Video quality verification

- **st22p/**: Tests for ST2110-22 compressed video transport
  - Compressed video transmission and reception
  - Encoder/decoder plugin functionality
  - Compression quality and performance

- **st30p/**: Tests for ST2110-30 audio transport
  - Audio transmission and reception
  - Format compatibility
  - Audio quality verification

- **st41/**: Tests for ST2110-41 fast metadata transport
  - Fast metadata transmission and reception
  - Format compatibility
  - Data integrity verification

- **udp/**: Tests for UDP functionality
  - UDP packet transmission and reception
  - MTU handling
  - UDP-specific features

- **virtio_user/**: Tests for virtio-user functionality
  - Virtual device creation and management
  - Performance in virtual environments
  - Compatibility with virtualization platforms

- **xdp/**: Tests for Express Data Path functionality
  - XDP program loading and execution
  - Packet filtering and processing
  - Performance comparison with other backends

### Dual Flow Tests (`dual/`)

Tests involving dual connections or flows, typically for redundancy or multi-stream scenarios:

- Redundant path tests (ST2022-7)
- Multi-stream synchronization
- Load balancing and failover

### Invalid Tests (`invalid/`)

Tests for error handling and negative test cases:

- Invalid configuration handling
- Error recovery
- Resource exhaustion scenarios

## Running Tests

### Running Specific Test Categories

To run all single flow tests:

```bash
pytest tests/single/
```

To run specific test types:

```bash
pytest tests/single/st20p/
```

### Test Markers

Tests are marked with categories that can be used for selective execution:

```bash
# Run smoke tests
pytest -m smoke

# Run nightly tests
pytest -m nightly
```

## Adding New Tests

To add a new test:

1. 
Create a new test file in the appropriate directory +2. Use the pytest fixture pattern for setup and teardown +3. Add appropriate markers for test categorization +4. Document the test purpose and expectations + +Example test structure: + +```python +import pytest +from common.nicctl import Nicctl +from mtl_engine.RxTxApp import RxTxApp + +# Mark test as part of the smoke test suite +@pytest.mark.smoke +def test_st20_basic_transport(): + """ + Test basic ST2110-20 video transport functionality. + + This test verifies that a simple video stream can be + transmitted and received with proper formatting. + """ + # Test implementation + # ... + + # Assertions to verify test results + assert result == expected_result, "Transport failed" +``` + +## License + +BSD-3-Clause License +Copyright (c) 2024-2025 Intel Corporation