From 53c522c5c97fd485f279fe96018e848e80ab5b87 Mon Sep 17 00:00:00 2001 From: KarolinaPomian Date: Fri, 5 Sep 2025 10:59:36 +0000 Subject: [PATCH 01/15] Docs: Update README.md for validation --- tests/validation/README.md | 226 ++++++++++++++++++++++++++ tests/validation/common/README.md | 103 ++++++++++++ tests/validation/configs/README.md | 126 ++++++++++++++ tests/validation/mtl_engine/README.md | 130 +++++++++++++++ tests/validation/tests/README.md | 165 +++++++++++++++++++ 5 files changed, 750 insertions(+) create mode 100644 tests/validation/README.md create mode 100644 tests/validation/common/README.md create mode 100644 tests/validation/configs/README.md create mode 100644 tests/validation/mtl_engine/README.md create mode 100644 tests/validation/tests/README.md diff --git a/tests/validation/README.md b/tests/validation/README.md new file mode 100644 index 000000000..e30168703 --- /dev/null +++ b/tests/validation/README.md @@ -0,0 +1,226 @@ +# Media Transport Library Validation Test Suite + +This directory contains the automated validation test suite for the Media Transport Library. The tests are designed to verify the functionality, performance, and compliance of the Media Transport Library with SMPTE ST2110 standards. + +## Overview + +The validation framework uses pytest to organize and execute tests across various components of the Media Transport Library. It supports testing of single and dual flow scenarios, various transport protocols, and integration with media processing tools like FFmpeg and GStreamer. + +## Test Framework Structure + +``` +tests/validation/ +├── common/ # Shared utilities for tests +│ ├── ffmpeg_handler/ # FFmpeg integration utilities +│ ├── integrity/ # Data integrity verification tools +│ └── nicctl.py # Network interface control +├── configs/ # Test configuration files +│ ├── test_config.yaml # Test environment settings +│ └── topology_config.yaml # Network topology configuration +├── create_pcap_file/ # Tools for packet capture file creation +├── mtl_engine/ # Core test framework components +│ ├── execute.py # Test execution management +│ ├── RxTxApp.py # RX/TX application interface +│ ├── GstreamerApp.py # GStreamer integration +│ ├── ffmpeg_app.py # FFmpeg integration +│ ├── csv_report.py # Test result reporting +│ └── ramdisk.py # RAM disk management +├── tests/ # Test modules +│ ├── single/ # Single-flow test scenarios +│ │ ├── dma/ # DMA tests +│ │ ├── ffmpeg/ # FFmpeg integration tests +│ │ ├── gstreamer/ # GStreamer integration tests +│ │ ├── kernel_socket/ # Kernel socket tests +│ │ ├── performance/ # Performance benchmarking +│ │ ├── ptp/ # Precision Time Protocol tests +│ │ ├── st20p/ # ST2110-20 video tests +│ │ ├── st22p/ # ST2110-22 compressed video tests +│ │ ├── st30p/ # ST2110-30 audio tests +│ │ └── st41/ # ST2110-40 ancillary data tests +│ ├── dual/ # Dual-flow test scenarios +│ └── invalid/ # Error handling and negative test cases +├── conftest.py # pytest configuration and fixtures +├── pytest.ini # pytest settings +└── requirements.txt # Python dependencies +``` + +## Setup and Installation + +### Prerequisites + +- Python 3.9 or higher +- Media Transport Library built and installed +- Network interfaces configured for testing +- Sufficient permissions for network management + +### Environment Setup + +1. Create and activate a Python virtual environment: + +```bash +python -m venv venv +source venv/bin/activate +``` + +2. Install required dependencies: + +```bash +pip install -r requirements.txt +``` + +3. 
Configure test parameters: + +Edit `configs/test_config.yaml` with the appropriate paths: +- Set `build` and `mtl_path` to the path of your Media Transport Library build +- Configure `media_path` to point to your test media files +- Adjust RAM disk settings if needed + +Edit `configs/topology_config.yaml` to match your network configuration: +- Set the correct `ip_address`, `SSH_PORT`, `USERNAME`, and either use `KEY_PATH` +- Configure the appropriate `pci_device` for your network interfaces + +4. Start the MtlManager service: + +```bash +sudo MtlManager & +``` + +5. (Optional) Create VFs for NIC testing: + +```bash +sudo ./script/nicctl.sh create_vf "${TEST_PF_PORT_P}" +sudo ./script/nicctl.sh create_vf "${TEST_PF_PORT_R}" +``` + +Replace `${TEST_PF_PORT_P}` and `${TEST_PF_PORT_R}` with your physical port identifiers. + +## Running Tests + +### Basic Test Execution + +Run all tests with configuration files: + +```bash +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml +``` + +Run specific test modules: + +```bash +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml tests/single/st20p/test_st20p_rx.py +``` + +### Test Categories + +The tests are categorized with markers that can be used to run specific test groups: + +- **Smoke Tests**: Quick verification tests + ```bash + python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke + ``` + +- **Nightly Tests**: Comprehensive tests suitable for nightly runs + ```bash + python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m nightly + ``` + +### Generating HTML Reports + +You can generate HTML reports for test results: + +```bash +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke --template=html/index.html --report=report.html +``` + +### Test Output and Reports + +- Logs are written to `pytest.log` +- CSV reports are generated for compliance results +- The framework stores test results in a structured format for later analysis + +## Test Configuration + +### RAM Disk Configuration + +Tests utilize RAM disks for high-performance media handling. Configure in `test_config.yaml`: + +```yaml +ramdisk: + media: + mountpoint: /mnt/ramdisk/media + size_gib: 32 + pcap: + mountpoint: /mnt/ramdisk/pcap + size_gib: 768 +``` + +### Network Capture + +Configure network packet capture settings in `test_config.yaml`: + +```yaml +capture_cfg: + enable: true + test_name: test_name + pcap_dir: /mnt/ramdisk/pcap + capture_time: 5 + interface: enp1s0f0 +``` + +## Test Types + +### Media Flow Tests + +- **ST20p**: Tests for ST2110-20 (uncompressed video) +- **ST22p**: Tests for ST2110-22 (compressed video) +- **ST30p**: Tests for ST2110-30 (audio) +- **ST41**: Tests for ST2110-40 (ancillary data) + +### Backend Tests + +- **DMA**: Direct Memory Access tests +- **Kernel Socket**: Tests for kernel socket backend +- **XDP**: Tests for Express Data Path backend + +### Integration Tests + +- **FFmpeg**: Tests for FFmpeg integration +- **GStreamer**: Tests for GStreamer integration + +### Performance Tests + +- Tests to measure throughput, latency, and other performance metrics + +## Extending the Test Suite + +### Adding New Tests + +1. Create a new test file in the appropriate directory under `tests/` +2. Follow the pytest format for test functions +3. 
Use existing fixtures from `conftest.py` or create new ones as needed +4. Add appropriate markers for test categorization + +### Adding New Test Categories + +1. Define the new marker in `pytest.ini` +2. Create a new directory under `tests/` if necessary +3. Add test files with the new marker + +## Troubleshooting + +### Common Issues + +- **Network Interface Not Found**: Verify the interface configuration in `topology_config.yaml` +- **Test Media Not Found**: Check the `media_path` setting in `test_config.yaml` +- **Permission Issues**: Ensure the user has sufficient permissions for network operations + +### Logs and Debugging + +- Check `pytest.log` for detailed test execution logs +- Use the `--verbose` flag for more detailed output +- For network issues, use the packet capture feature to analyze traffic + +## License + +BSD-3-Clause License +Copyright (c) 2024-2025 Intel Corporation diff --git a/tests/validation/common/README.md b/tests/validation/common/README.md new file mode 100644 index 000000000..75d3c1b01 --- /dev/null +++ b/tests/validation/common/README.md @@ -0,0 +1,103 @@ +# Common Test Utilities + +This directory contains shared utilities used across the Media Transport Library validation test suite. These utilities provide common functionality for network interface management, media integrity verification, and FFmpeg handling. + +## Components + +### nicctl.py + +The `nicctl.py` module provides a `Nicctl` class for network interface control: + +- Interface configuration and management +- PCI device binding and unbinding +- Link status monitoring +- MTU and other interface parameter configuration + +Example usage: + +```python +from common.nicctl import Nicctl + +# Create a network interface controller +nic = Nicctl() + +# Configure interface +nic.configure_interface("enp1s0f0", "192.168.1.10", "255.255.255.0") + +# Check link status +status = nic.get_link_status("enp1s0f0") +``` + +### integrity/ + +This directory contains tools for verifying data integrity in media transport tests: + +- Pixel comparison utilities for video integrity checks +- Audio sample verification +- Ancillary data integrity checks +- Error statistics calculation + +Key modules: + +- `video_integrity.py`: Functions for comparing video frames before and after transport +- `audio_integrity.py`: Functions for comparing audio samples +- `ancillary_integrity.py`: Functions for comparing ancillary data + +### ffmpeg_handler/ + +This directory contains utilities for FFmpeg integration: + +- FFmpeg command generation +- Output parsing and analysis +- Media format detection and conversion +- Encoder and decoder integration + +Key modules: + +- `ffmpeg_cmd.py`: Functions for generating FFmpeg command lines +- `ffmpeg_output.py`: Functions for parsing and analyzing FFmpeg output +- `ffmpeg_formats.py`: Media format definitions and utilities + +### gen_frames.sh + +A shell script for generating test frames for video testing: + +- Creates test patterns in various formats +- Supports different resolutions and frame rates +- Configurable color patterns and test signals + +## Using Common Utilities in Tests + +These utilities are imported and used by test modules to set up test environments, execute tests, and validate results. 
+ +Example: + +```python +from common.nicctl import Nicctl +from common.integrity.video_integrity import compare_frames + +def test_st20_transport(): + # Configure network interfaces + nic = Nicctl() + nic.configure_interface("enp1s0f0", "192.168.1.10", "255.255.255.0") + + # Run transport test + # ... + + # Verify frame integrity + result = compare_frames("reference_frame.yuv", "received_frame.yuv") + assert result.match_percentage > 99.9, "Frame integrity check failed" +``` + +## Extending Common Utilities + +To add new common utilities: + +1. Create new Python modules in the appropriate subdirectory +2. Document the module's purpose and API +3. Import the new utilities in test modules as needed + +## License + +BSD-3-Clause License +Copyright (c) 2024-2025 Intel Corporation diff --git a/tests/validation/configs/README.md b/tests/validation/configs/README.md new file mode 100644 index 000000000..36132e987 --- /dev/null +++ b/tests/validation/configs/README.md @@ -0,0 +1,126 @@ +# Test Configuration + +This directory contains configuration files for the Media Transport Library validation test suite. These files define the test environment, network topology, and test parameters. + +## Configuration Files + +### test_config.yaml + +This file contains general test environment settings: + +```yaml +build: /path/to/mtl/build +mtl_path: /path/to/mtl +media_path: /mnt/media +capture_cfg: + enable: false + test_name: test_name + pcap_dir: /mnt/ramdisk/pcap + capture_time: 5 + interface: null +ramdisk: + media: + mountpoint: /mnt/ramdisk/media + size_gib: 32 + pcap: + mountpoint: /mnt/ramdisk/pcap + size_gib: 768 +``` + +#### Key Parameters + +- **build**: Path to the Media Transport Library build directory +- **mtl_path**: Path to the Media Transport Library installation +- **media_path**: Path to the directory containing test media files +- **capture_cfg**: Network packet capture configuration + - **enable**: Enable/disable packet capture + - **test_name**: Name prefix for capture files + - **pcap_dir**: Directory to store capture files + - **capture_time**: Duration of packet capture in seconds + - **interface**: Network interface to capture from +- **ramdisk**: RAM disk configuration for high-performance testing + - **media.mountpoint**: Mount point for media RAM disk + - **media.size_gib**: Size of media RAM disk in GiB + - **pcap.mountpoint**: Mount point for packet capture RAM disk + - **pcap.size_gib**: Size of packet capture RAM disk in GiB + +### topology_config.yaml + +This file defines the network topology for testing: + +```yaml +--- +metadata: + version: '2.4' +hosts: + - name: host + instantiate: true + role: sut + network_interfaces: + - pci_device: 8086:1592 + interface_index: 0 # all + connections: + - ip_address: 192.168.1.100 + connection_type: SSHConnection + connection_options: + port: 22 + username: user + password: None + key_path: /path/to/ssh/key +``` + +#### Key Parameters + +- **metadata.version**: Configuration format version +- **hosts**: List of hosts in the test topology + - **name**: Host identifier + - **instantiate**: Whether to instantiate the host + - **role**: Host role (e.g., sut for System Under Test) + - **network_interfaces**: List of network interfaces + - **pci_device**: PCI device ID + - **interface_index**: Interface index + - **connections**: List of connections to the host + - **ip_address**: Host IP address + - **connection_type**: Type of connection + - **connection_options**: Connection parameters + - **port**: SSH port + - **username**: SSH 
username + - **password**: SSH password (or None for key-based authentication) + - **key_path**: Path to SSH private key + +## Customizing Configurations + +### Environment-Specific Configuration + +To customize the configuration for different environments, create copies of these files with environment-specific settings: + +1. Copy `test_config.yaml` to `test_config.local.yaml` +2. Modify the parameters as needed +3. The test framework will prioritize `.local.yaml` files over the default ones + +### Temporary Configuration Changes + +For temporary configuration changes during test development: + +1. Modify the parameters directly in the YAML files +2. Run your tests +3. Revert changes when done or use git to discard changes + +### Programmatic Configuration Overrides + +Test modules can programmatically override configuration values: + +```python +def test_with_custom_config(config): + # Override configuration for this test + config.capture_cfg.enable = True + config.capture_cfg.interface = "enp1s0f0" + + # Run test with modified configuration + # ... +``` + +## License + +BSD-3-Clause License +Copyright (c) 2024-2025 Intel Corporation diff --git a/tests/validation/mtl_engine/README.md b/tests/validation/mtl_engine/README.md new file mode 100644 index 000000000..278db5c3b --- /dev/null +++ b/tests/validation/mtl_engine/README.md @@ -0,0 +1,130 @@ +# MTL Test Engine + +This directory contains the core components of the Media Transport Library validation test framework. The test engine provides utilities and abstractions for test execution, application management, and result reporting. + +## Components + +### execute.py + +The `execute.py` module provides functionality for executing commands and managing processes: + +- `RaisingThread`: A thread implementation that passes exceptions back to the caller +- `AsyncProcess`: Manages asynchronous process execution with output handling +- Functions for command execution with timeout and output handling + +### RxTxApp.py + +Provides a base class for RX/TX application interfaces used in testing: + +- Application lifecycle management (start, stop, monitoring) +- Common configuration parameters for media transport applications +- Interface for test result collection and reporting + +### GstreamerApp.py + +GStreamer-specific application interface for testing GStreamer integration: + +- Pipeline creation and management for GStreamer-based tests +- Configuration for GStreamer elements and properties +- Media processing validation utilities + +### ffmpeg_app.py + +FFmpeg-specific application interface for testing FFmpeg integration: + +- FFmpeg command generation and execution +- Output parsing and validation +- Support for various FFmpeg encoding/decoding options + +### csv_report.py + +Utilities for test result reporting in CSV format: + +- `csv_add_test`: Adds a test result to the report +- `csv_write_report`: Writes the report to a file +- `update_compliance_result`: Updates compliance-related results + +### integrity.py + +Data integrity verification tools: + +- Functions to verify media data integrity after transport +- Pixel comparison and error detection +- Statistical analysis of media quality + +### ramdisk.py + +RAM disk management for high-performance media testing: + +- `Ramdisk` class: Creates, mounts, and manages RAM disks +- Support for configurable size and mount points +- Cleanup and resource management + +### const.py + +Defines constants used throughout the test framework: + +- Log levels and directories +- Default parameter values +- 
Test categorization constants + +### stash.py + +Provides a mechanism for storing and retrieving test data: + +- Functions for stashing test results, logs, and notes +- Media file tracking and cleanup +- Issue tracking and reporting + +### media_creator.py and media_files.py + +Utilities for test media management: + +- Media file creation for different formats and codecs +- Reference media handling for comparison tests +- Media metadata management + +## Usage + +The test engine components are typically used by test modules and pytest fixtures to set up test environments, execute test cases, and validate results. + +Example usage in a test module: + +```python +from mtl_engine.execute import run_command +from mtl_engine.RxTxApp import RxTxApp +from mtl_engine.csv_report import csv_add_test + +def test_st20_rx(): + # Setup application + app = RxTxApp(config) + + # Start the application + app.start() + + # Run commands and validate results + result = run_command("some_validation_command") + + # Add test result to report + csv_add_test("st20_rx", result.success) + + # Assert test conditions + assert result.success, "Test failed" +``` + +## Configuration + +Most test engine components are configurable via the `test_config.yaml` and `topology_config.yaml` files in the `configs/` directory. See the main README.md for details on configuring these files. + +## Extension Points + +To extend the test engine with new functionality: + +1. For new application types, create a subclass of `RxTxApp` with specific implementation +2. For new validation methods, add functions to `integrity.py` or create new modules +3. For new reporting formats, extend `csv_report.py` with additional report generation functions + +## License + +BSD-3-Clause License +Copyright (c) 2024-2025 Intel Corporation diff --git a/tests/validation/tests/README.md b/tests/validation/tests/README.md new file mode 100644 index 000000000..a50f8dc11 --- /dev/null +++ b/tests/validation/tests/README.md @@ -0,0 +1,165 @@ +# Validation Test Modules + +This directory contains the test modules for the Media Transport Library validation test suite. The tests are organized into categories based on test scope and functionality. 
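A quick way to see what falls into each category without running anything is pytest's collect-only mode (a read-only dry run; it takes the same configuration options as a normal run, and the st20p path below is just one example directory):

```bash
# List the test cases that would run for the ST2110-20 category, without executing them
python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml \
    --collect-only -q tests/single/st20p/
```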
+ +## Test Categories + +### Single Flow Tests (`single/`) + +Tests for single-flow scenarios, where a single source transmits to a single destination: + +- **dma/**: Tests for Direct Memory Access functionality + - Memory allocation and management + - DMA transfer performance and reliability + - Error handling and recovery + +- **ffmpeg/**: Tests for FFmpeg integration + - FFmpeg plugin functionality + - Encoding and decoding with FFmpeg + - Format conversion and compatibility + +- **gstreamer/**: Tests for GStreamer integration + - GStreamer plugin functionality + - Pipeline creation and management + - Element functionality and compatibility + +- **kernel_socket/**: Tests for kernel socket backend + - Socket creation and management + - Packet transmission and reception + - Performance and reliability + +- **performance/**: Performance benchmarking tests + - Throughput measurements + - Latency tests + - CPU and memory usage analysis + +- **ptp/**: Precision Time Protocol tests + - Clock synchronization + - Timestamp accuracy + - PTP profile compatibility + +- **rss_mode/**: Tests for Receive Side Scaling modes + - RSS configuration + - Multi-queue performance + - Load balancing effectiveness + +- **rx_timing/**: Tests for reception timing compliance + - Packet timing analysis + - Compliance with ST2110-21 timing specifications + - Jitter measurements + +- **st20p/**: Tests for ST2110-20 video transport + - Uncompressed video transmission and reception + - Format compatibility + - Video quality verification + +- **st22p/**: Tests for ST2110-22 compressed video transport + - Compressed video transmission and reception + - Encoder/decoder plugin functionality + - Compression quality and performance + +- **st30p/**: Tests for ST2110-30 audio transport + - Audio transmission and reception + - Format compatibility + - Audio quality verification + +- **st41/**: Tests for ST2110-40 ancillary data transport + - Ancillary data transmission and reception + - Format compatibility + - Data integrity verification + +- **udp/**: Tests for UDP functionality + - UDP packet transmission and reception + - MTU handling + - UDP-specific features + +- **virtio_user/**: Tests for virtio-user functionality + - Virtual device creation and management + - Performance in virtual environments + - Compatibility with virtualization platforms + +- **xdp/**: Tests for Express Data Path functionality + - XDP program loading and execution + - Packet filtering and processing + - Performance comparison with other backends + +### Dual Flow Tests (`dual/`) + +Tests involving dual connections or flows, typically for redundancy or multi-stream scenarios: + +- Redundant path tests (ST2022-7) +- Multi-stream synchronization +- Load balancing and failover + +### Invalid Tests (`invalid/`) + +Tests for error handling and negative test cases: + +- Invalid configuration handling +- Error recovery +- Resource exhaustion scenarios + +## Running Tests + +### Running Specific Test Categories + +To run all single flow tests: + +```bash +pytest tests/single/ +``` + +To run specific test types: + +```bash +pytest tests/single/st20p/ +``` + +### Test Markers + +Tests are marked with categories that can be used for selective execution: + +```bash +# Run smoke tests +pytest -m smoke + +# Run nightly tests +pytest -m nightly +``` + +## Adding New Tests + +To add a new test: + +1. Create a new test file in the appropriate directory +2. Use the pytest fixture pattern for setup and teardown +3. Add appropriate markers for test categorization +4. 
Document the test purpose and expectations + +Example test structure: + +```python +import pytest +from common.nicctl import Nicctl +from mtl_engine.RxTxApp import RxTxApp + +# Mark test as part of the smoke test suite +@pytest.mark.smoke +def test_st20_basic_transport(): + """ + Test basic ST2110-20 video transport functionality. + + This test verifies that a simple video stream can be + transmitted and received with proper formatting. + """ + # Test implementation + # ... + + # Assertions to verify test results + assert result == expected_result, "Transport failed" +``` + +## License + +BSD-3-Clause License +Copyright (c) 2024-2025 Intel Corporation From 15f616bee5bb6ebbf5cd977b9898953ee18c293c Mon Sep 17 00:00:00 2001 From: KarolinaPomian Date: Fri, 5 Sep 2025 11:05:35 +0000 Subject: [PATCH 02/15] Fix: update paths-ignore in smoke-tests.yml and clarify README parameters --- .github/workflows/smoke-tests.yml | 6 ++++++ tests/validation/README.md | 4 ++-- tests/validation/configs/README.md | 2 +- 3 files changed, 9 insertions(+), 3 deletions(-) diff --git a/.github/workflows/smoke-tests.yml b/.github/workflows/smoke-tests.yml index c8a13dc56..a252e206f 100644 --- a/.github/workflows/smoke-tests.yml +++ b/.github/workflows/smoke-tests.yml @@ -5,10 +5,16 @@ on: branches: - main - 'maint-**' + paths-ignore: + - '**.md' + - 'doc/**' pull_request: branches: - main - 'maint-**' + paths-ignore: + - '**.md' + - 'doc/**' env: BUILD_TYPE: 'Release' DPDK_VERSION: '25.03' diff --git a/tests/validation/README.md b/tests/validation/README.md index e30168703..bce55d5c2 100644 --- a/tests/validation/README.md +++ b/tests/validation/README.md @@ -8,7 +8,7 @@ The validation framework uses pytest to organize and execute tests across variou ## Test Framework Structure -``` +```plaintext tests/validation/ ├── common/ # Shared utilities for tests │ ├── ffmpeg_handler/ # FFmpeg integration utilities @@ -76,7 +76,7 @@ Edit `configs/test_config.yaml` with the appropriate paths: - Adjust RAM disk settings if needed Edit `configs/topology_config.yaml` to match your network configuration: -- Set the correct `ip_address`, `SSH_PORT`, `USERNAME`, and either use `KEY_PATH` +- Set the correct `ip_address`, `SSH_PORT`, `USERNAME`, and `KEY_PATH` - Configure the appropriate `pci_device` for your network interfaces 4. 
Start the MtlManager service: diff --git a/tests/validation/configs/README.md b/tests/validation/configs/README.md index 36132e987..e94f2019f 100644 --- a/tests/validation/configs/README.md +++ b/tests/validation/configs/README.md @@ -69,7 +69,7 @@ hosts: key_path: /path/to/ssh/key ``` -#### Key Parameters +#### Topology Parameters - **metadata.version**: Configuration format version - **hosts**: List of hosts in the test topology From f64812cc509e0351d1aec038c9e17366c193e262 Mon Sep 17 00:00:00 2001 From: KarolinaPomian Date: Fri, 5 Sep 2025 13:53:26 +0000 Subject: [PATCH 03/15] Docs: Revise README.md for clarity and structure in validation framework overview --- tests/validation/README.md | 65 ++++++++++++++++---------------------- 1 file changed, 27 insertions(+), 38 deletions(-) diff --git a/tests/validation/README.md b/tests/validation/README.md index bce55d5c2..56811cae9 100644 --- a/tests/validation/README.md +++ b/tests/validation/README.md @@ -8,41 +8,15 @@ The validation framework uses pytest to organize and execute tests across variou ## Test Framework Structure -```plaintext -tests/validation/ -├── common/ # Shared utilities for tests -│ ├── ffmpeg_handler/ # FFmpeg integration utilities -│ ├── integrity/ # Data integrity verification tools -│ └── nicctl.py # Network interface control -├── configs/ # Test configuration files -│ ├── test_config.yaml # Test environment settings -│ └── topology_config.yaml # Network topology configuration -├── create_pcap_file/ # Tools for packet capture file creation -├── mtl_engine/ # Core test framework components -│ ├── execute.py # Test execution management -│ ├── RxTxApp.py # RX/TX application interface -│ ├── GstreamerApp.py # GStreamer integration -│ ├── ffmpeg_app.py # FFmpeg integration -│ ├── csv_report.py # Test result reporting -│ └── ramdisk.py # RAM disk management -├── tests/ # Test modules -│ ├── single/ # Single-flow test scenarios -│ │ ├── dma/ # DMA tests -│ │ ├── ffmpeg/ # FFmpeg integration tests -│ │ ├── gstreamer/ # GStreamer integration tests -│ │ ├── kernel_socket/ # Kernel socket tests -│ │ ├── performance/ # Performance benchmarking -│ │ ├── ptp/ # Precision Time Protocol tests -│ │ ├── st20p/ # ST2110-20 video tests -│ │ ├── st22p/ # ST2110-22 compressed video tests -│ │ ├── st30p/ # ST2110-30 audio tests -│ │ └── st41/ # ST2110-40 ancillary data tests -│ ├── dual/ # Dual-flow test scenarios -│ └── invalid/ # Error handling and negative test cases -├── conftest.py # pytest configuration and fixtures -├── pytest.ini # pytest settings -└── requirements.txt # Python dependencies -``` +The validation framework is organized into the following main components: + +- **common/**: Shared utilities for test functionality, including FFmpeg handlers, integrity verification tools, and network interface control +- **configs/**: Configuration files for test environment and network topology +- **mtl_engine/**: Core test framework components that manage test execution, application interfaces, and result reporting +- **tests/**: Test modules organized by scenario type: + - **single/**: Single-flow test scenarios for various protocols (ST2110-20/22/30/40), backends, and integrations + - **dual/**: Tests for multiple simultaneous flows + - **invalid/**: Error handling and negative test cases ## Setup and Installation @@ -50,8 +24,10 @@ tests/validation/ - Python 3.9 or higher - Media Transport Library built and installed -- Network interfaces configured for testing -- Sufficient permissions for network management +- Test media files 
(currently maintained on NFS) +- Network interfaces as specified in MTL's run.md document (VFs will be created automatically) +- Root privileges or equivalent (sudo) for network operations done by script/nicctl.sh +- FFmpeg and GStreamer plugins installed (required for integration tests) ### Environment Setup @@ -110,6 +86,12 @@ Run specific test modules: python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml tests/single/st20p/test_st20p_rx.py ``` +Run specific test cases with parameters: + +```bash +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]" +``` + ### Test Categories The tests are categorized with markers that can be used to run specific test groups: @@ -126,12 +108,19 @@ The tests are categorized with markers that can be used to run specific test gro ### Generating HTML Reports -You can generate HTML reports for test results: +You can generate comprehensive HTML reports for test results that include test status, execution time, and detailed logs: ```bash python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke --template=html/index.html --report=report.html ``` +The generated report (report.html) provides: +- Test execution summary and statistics +- Detailed pass/fail status for each test +- Execution time and performance metrics +- Error logs and tracebacks for failed tests +- System information for better debugging context + ### Test Output and Reports - Logs are written to `pytest.log` From 849a37da826b57ac242fc8ee3ee3b68ac5483f31 Mon Sep 17 00:00:00 2001 From: KarolinaPomian Date: Wed, 24 Sep 2025 08:24:43 +0000 Subject: [PATCH 04/15] Docs: Move validation framework documentation to doc/ directory Address GitHub review comment by: 1. Moving detailed documentation to doc/validation_framework.md 2. Simplifying tests/validation/README.md to be a quickstart guide 3. Adding a clear reference to the main documentation This follows the project's approach of keeping detailed documentation in the doc/ directory while providing a concise quickstart guide in the component's README.md. --- doc/validation_framework.md | 245 ++++++++++++++++++++++++++++++++++++ tests/validation/README.md | 235 ++++++---------------------------- 2 files changed, 285 insertions(+), 195 deletions(-) create mode 100644 doc/validation_framework.md diff --git a/doc/validation_framework.md b/doc/validation_framework.md new file mode 100644 index 000000000..9f45beb12 --- /dev/null +++ b/doc/validation_framework.md @@ -0,0 +1,245 @@ +# MTL Validation Framework + +The Media Transport Library (MTL) Validation Framework provides comprehensive testing capabilities for various aspects of the MTL, including protocol compliance, performance, and integration testing. + +## Overview + +The validation framework uses pytest to organize and execute tests across various scenarios, protocols, and backend implementations. It supports both automated testing in CI/CD environments and manual testing for development and troubleshooting. 
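For orientation, both use cases funnel through the same pytest entry point; a typical manual smoke run looks like this (configuration of the two YAML files is covered in the Setup and Configuration sections below):

```bash
# Quick smoke run using the topology and test configuration files described below
cd tests/validation
python3 -m pytest --topology_config=configs/topology_config.yaml \
    --test_config=configs/test_config.yaml -m smoke
```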
+ +## Test Framework Structure + +The validation framework is organized into the following main components: + +## Test Framework Structure + +The validation framework is organized into the following main components: + +- **common/**: Shared utilities for test functionality, including FFmpeg handlers, integrity verification tools, and network interface control +- **configs/**: Configuration files for test environment and network topology +- **mtl_engine/**: Core test framework components that manage test execution, application interfaces, and result reporting +- **tests/**: Test modules organized by scenario type: + - **single/**: Single-flow test scenarios for various protocols (ST2110-20/22/30/40), backends, and integrations + - **dual/**: Tests for multiple simultaneous flows + - **invalid/**: Error handling and negative test cases + +## Components Description + +### Common Utilities + +The `common/` directory contains shared utilities that provide fundamental functionality for test execution: + +- **FFmpeg Handler**: Manages FFmpeg operations for media processing and verification +- **Integrity Tools**: Provides functions for data integrity verification between source and received media +- **Network Interface Control**: Manages network interfaces required for testing + +### Configuration Files + +The `configs/` directory contains YAML files that specify: + +- **Test Environment Settings**: Hardware specifications, media paths, and test parameters +- **Network Topology**: Interface configuration, IP addressing, and routing information + +### MTL Engine + +The `mtl_engine/` directory contains the core components of the framework: + +- **Execute Module**: Manages the execution flow of tests, including setup and teardown +- **Application Interfaces**: Provides interfaces to RX/TX, GStreamer, and FFmpeg applications +- **Reporting Tools**: Generates test reports and collects performance metrics + +### Test Modules + +The `tests/` directory contains test implementations organized by scenario type: + +- **Single Flow Tests**: Tests focusing on individual protocol implementations + - **ST2110-20**: Uncompressed video tests + - **ST2110-22**: Compressed video tests + - **ST2110-30**: Audio tests + - **ST2110-40**: Ancillary data tests + - Backend-specific tests (DMA, kernel socket, etc.) + - Integration tests (FFmpeg, GStreamer) + +- **Dual Flow Tests**: Tests involving multiple simultaneous flows +- **Invalid Tests**: Tests focusing on error handling and edge cases + +## Setup and Installation + +### Prerequisites + +- Python 3.9 or higher +- Media Transport Library built and installed +- Test media files (currently maintained on NFS) +- Network interfaces as specified in MTL's run.md document (VFs will be created automatically) +- Root privileges or equivalent (sudo) for network operations done by script/nicctl.sh +- FFmpeg and GStreamer plugins installed (required for integration tests) + +### Environment Setup + +1. Create a virtual environment: + +```bash +cd tests/validation +python3 -m venv venv +source venv/bin/activate +``` + +2. Install dependencies: + +```bash +pip install -r requirements.txt +``` + +### Configuration + +1. Update `configs/topology_config.yaml` with your network interface details: + +```yaml +system: + hostname: testserver + interfaces: + - name: ens801f0 + pci: "86:00.0" + ip: "192.168.108.15" + - name: ens801f1 + pci: "86:00.1" + ip: "192.168.208.15" +``` + +2. 
Update `configs/test_config.yaml` with your test environment settings: + +```yaml +environment: + media_dir: "/path/to/test/media" + log_level: "INFO" + temp_dir: "/tmp/mtl_test" + +test_params: + timeout: 30 + retry_count: 3 + integrity_check: true +``` + +## Running Tests + +### Basic Usage + +Run all tests: + +```bash +cd tests/validation +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml +``` + +Run smoke tests: + +```bash +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke +``` + +Run specific test modules: + +```bash +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml tests/single/st20p/test_st20p_rx.py +``` + +Run specific test cases with parameters: + +```bash +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]" +``` + +### Test Categories + +The tests are categorized with markers that can be used to run specific test groups: + +- `@pytest.mark.smoke`: Basic functionality tests for quick validation +- `@pytest.mark.nightly`: Comprehensive tests for nightly runs +- `@pytest.mark.performance`: Performance benchmarking tests +- `@pytest.mark.dma`: Tests specific to DMA functionality +- `@pytest.mark.fwd`: Tests for packet forwarding +- `@pytest.mark.kernel_socket`: Tests for kernel socket backend +- `@pytest.mark.xdp`: Tests for XDP backend +- `@pytest.mark.gpu`: Tests involving GPU processing + +### Generating HTML Reports + +You can generate comprehensive HTML reports for test results that include test status, execution time, and detailed logs: + +```bash +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke --template=html/index.html --report=report.html +``` + +The generated report (report.html) provides: +- Test execution summary and statistics +- Detailed pass/fail status for each test +- Execution time and performance metrics +- Error logs and tracebacks for failed tests +- System information for better debugging context + +### Test Output and Reports + +- Logs are written to `pytest.log` +- Test results are displayed in the console +- HTML reports can be generated as described above +- CSV reports can be generated for performance benchmarks + +## Extending the Framework + +### Adding New Tests + +1. Create a new test file in the appropriate directory under `tests/` +2. Import the required fixtures from `conftest.py` +3. Implement test functions using pytest conventions +4. Add appropriate markers for test categorization + +Example: + +```python +import pytest +from mtl_engine.RxTxApp import RxTxApp + +@pytest.mark.smoke +@pytest.mark.st20p +def test_st20p_basic_flow(setup_interfaces, media_files): + """Test basic ST2110-20 flow from TX to RX""" + app = RxTxApp(setup_interfaces) + + # Test implementation + result = app.run_st20p_test(media_files["1080p"]) + + # Assertions + assert result.success, "ST2110-20 flow test failed" + assert result.packet_loss == 0, "Packet loss detected" +``` + +### Adding New Functionality + +To add new functionality to the framework: + +1. Add utility functions in the appropriate module under `common/` +2. Update the relevant application interface in `mtl_engine/` +3. Document the new functionality in code comments +4. 
Add tests that exercise the new functionality + +## Troubleshooting + +### Common Issues + +- **Network Interface Problems**: Ensure interfaces are properly configured and have the correct IP addresses +- **Permission Issues**: Many tests require root privileges for network operations +- **Media File Access**: Verify that test media files are available and accessible +- **Test Timeouts**: Increase timeout values in test_config.yaml for slower systems + +### Debugging Tests + +Use pytest's debug features: + +```bash +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -v --pdb tests/single/st20p/test_st20p_rx.py +``` + +Increase log verbosity: + +```bash +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml --log-cli-level=DEBUG tests/single/st20p/test_st20p_rx.py +``` diff --git a/tests/validation/README.md b/tests/validation/README.md index 56811cae9..1d8c59870 100644 --- a/tests/validation/README.md +++ b/tests/validation/README.md @@ -1,215 +1,60 @@ -# Media Transport Library Validation Test Suite +# MTL Validation Framework -This directory contains the automated validation test suite for the Media Transport Library. The tests are designed to verify the functionality, performance, and compliance of the Media Transport Library with SMPTE ST2110 standards. +The Media Transport Library (MTL) Validation Framework provides comprehensive testing capabilities for various aspects of the MTL, including protocol compliance, performance, and integration testing. -## Overview +> **For detailed documentation, please refer to [the main validation framework documentation](/doc/validation_framework.md)** -The validation framework uses pytest to organize and execute tests across various components of the Media Transport Library. It supports testing of single and dual flow scenarios, various transport protocols, and integration with media processing tools like FFmpeg and GStreamer. - -## Test Framework Structure - -The validation framework is organized into the following main components: - -- **common/**: Shared utilities for test functionality, including FFmpeg handlers, integrity verification tools, and network interface control -- **configs/**: Configuration files for test environment and network topology -- **mtl_engine/**: Core test framework components that manage test execution, application interfaces, and result reporting -- **tests/**: Test modules organized by scenario type: - - **single/**: Single-flow test scenarios for various protocols (ST2110-20/22/30/40), backends, and integrations - - **dual/**: Tests for multiple simultaneous flows - - **invalid/**: Error handling and negative test cases - -## Setup and Installation +## Quick Start Guide ### Prerequisites - Python 3.9 or higher - Media Transport Library built and installed -- Test media files (currently maintained on NFS) -- Network interfaces as specified in MTL's run.md document (VFs will be created automatically) -- Root privileges or equivalent (sudo) for network operations done by script/nicctl.sh -- FFmpeg and GStreamer plugins installed (required for integration tests) - -### Environment Setup - -1. Create and activate a Python virtual environment: - -```bash -python -m venv venv -source venv/bin/activate -``` +- Test media files (typically on NFS) +- Network interfaces configured for testing +- Root privileges for network operations +- FFmpeg and GStreamer plugins (for integration tests) -2. 
Install required dependencies: +### Setup in 3 Simple Steps -```bash -pip install -r requirements.txt -``` - -3. Configure test parameters: +1. **Create a virtual environment and install dependencies**: + ```bash + cd tests/validation + python3 -m venv venv + source venv/bin/activate + pip install -r requirements.txt + ``` -Edit `configs/test_config.yaml` with the appropriate paths: -- Set `build` and `mtl_path` to the path of your Media Transport Library build -- Configure `media_path` to point to your test media files -- Adjust RAM disk settings if needed +2. **Configure your environment**: + - Update network interfaces in `configs/topology_config.yaml` + - Set media paths and test parameters in `configs/test_config.yaml` -Edit `configs/topology_config.yaml` to match your network configuration: -- Set the correct `ip_address`, `SSH_PORT`, `USERNAME`, and `KEY_PATH` -- Configure the appropriate `pci_device` for your network interfaces +3. **Run tests**: + ```bash + # Run smoke tests (quick validation) + python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke + + # Run specific test module + python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml tests/single/st20p/test_st20p_rx.py + + # Generate HTML report + python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke --template=html/index.html --report=report.html + ``` -4. Start the MtlManager service: +## Available Tests -```bash -sudo MtlManager & -``` +The framework includes tests for: -5. (Optional) Create VFs for NIC testing: +- **Media Flow Tests**: ST2110-20 (video), ST2110-22 (compressed video), ST2110-30 (audio), ST2110-40 (ancillary data) +- **Backend Tests**: DMA, Kernel Socket, XDP +- **Integration Tests**: FFmpeg, GStreamer +- **Performance Tests**: Throughput, latency, and other metrics +Run tests by category using pytest markers: ```bash -sudo ./script/nicctl.sh create_vf "${TEST_PF_PORT_P}" -sudo ./script/nicctl.sh create_vf "${TEST_PF_PORT_R}" +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m [marker] ``` -Replace `${TEST_PF_PORT_P}` and `${TEST_PF_PORT_R}` with your physical port identifiers. 
- -## Running Tests - -### Basic Test Execution - -Run all tests with configuration files: - -```bash -python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -``` - -Run specific test modules: - -```bash -python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml tests/single/st20p/test_st20p_rx.py -``` - -Run specific test cases with parameters: - -```bash -python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]" -``` - -### Test Categories - -The tests are categorized with markers that can be used to run specific test groups: - -- **Smoke Tests**: Quick verification tests - ```bash - python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke - ``` - -- **Nightly Tests**: Comprehensive tests suitable for nightly runs - ```bash - python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m nightly - ``` - -### Generating HTML Reports - -You can generate comprehensive HTML reports for test results that include test status, execution time, and detailed logs: - -```bash -python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke --template=html/index.html --report=report.html -``` - -The generated report (report.html) provides: -- Test execution summary and statistics -- Detailed pass/fail status for each test -- Execution time and performance metrics -- Error logs and tracebacks for failed tests -- System information for better debugging context - -### Test Output and Reports - -- Logs are written to `pytest.log` -- CSV reports are generated for compliance results -- The framework stores test results in a structured format for later analysis - -## Test Configuration - -### RAM Disk Configuration - -Tests utilize RAM disks for high-performance media handling. Configure in `test_config.yaml`: - -```yaml -ramdisk: - media: - mountpoint: /mnt/ramdisk/media - size_gib: 32 - pcap: - mountpoint: /mnt/ramdisk/pcap - size_gib: 768 -``` - -### Network Capture - -Configure network packet capture settings in `test_config.yaml`: - -```yaml -capture_cfg: - enable: true - test_name: test_name - pcap_dir: /mnt/ramdisk/pcap - capture_time: 5 - interface: enp1s0f0 -``` - -## Test Types - -### Media Flow Tests - -- **ST20p**: Tests for ST2110-20 (uncompressed video) -- **ST22p**: Tests for ST2110-22 (compressed video) -- **ST30p**: Tests for ST2110-30 (audio) -- **ST41**: Tests for ST2110-40 (ancillary data) - -### Backend Tests - -- **DMA**: Direct Memory Access tests -- **Kernel Socket**: Tests for kernel socket backend -- **XDP**: Tests for Express Data Path backend - -### Integration Tests - -- **FFmpeg**: Tests for FFmpeg integration -- **GStreamer**: Tests for GStreamer integration - -### Performance Tests - -- Tests to measure throughput, latency, and other performance metrics - -## Extending the Test Suite - -### Adding New Tests - -1. Create a new test file in the appropriate directory under `tests/` -2. Follow the pytest format for test functions -3. Use existing fixtures from `conftest.py` or create new ones as needed -4. Add appropriate markers for test categorization - -### Adding New Test Categories - -1. Define the new marker in `pytest.ini` -2. Create a new directory under `tests/` if necessary -3. 
Add test files with the new marker - -## Troubleshooting - -### Common Issues - -- **Network Interface Not Found**: Verify the interface configuration in `topology_config.yaml` -- **Test Media Not Found**: Check the `media_path` setting in `test_config.yaml` -- **Permission Issues**: Ensure the user has sufficient permissions for network operations - -### Logs and Debugging - -- Check `pytest.log` for detailed test execution logs -- Use the `--verbose` flag for more detailed output -- For network issues, use the packet capture feature to analyze traffic - -## License +Available markers: `smoke`, `nightly`, `performance`, `dma`, `kernel_socket`, `xdp`, etc. -BSD-3-Clause License -Copyright (c) 2024-2025 Intel Corporation +For more detailed information about configuration options, troubleshooting, and extending the framework, please refer to the [complete documentation](/doc/validation_framework.md). From e781a79034c284ead9a734c34c338a86f1f91967 Mon Sep 17 00:00:00 2001 From: KarolinaPomian Date: Wed, 24 Sep 2025 08:43:45 +0000 Subject: [PATCH 05/15] Docs: Fix critical validation framework setup issues - Add missing MTL build prerequisite - tests fail without RxTxApp binary - Clarify virtual environment setup path discrepancy - Improve configuration file examples with realistic values - Add comprehensive troubleshooting section for common new developer issues - Update both main validation_framework.md and tests/validation/README.md Fixes issues identified during new developer testing of validation framework documentation. --- doc/validation_framework.md | 148 +++++++++++++++++++++++++++++------- tests/validation/README.md | 18 +++-- 2 files changed, 135 insertions(+), 31 deletions(-) diff --git a/doc/validation_framework.md b/doc/validation_framework.md index 9f45beb12..b75016bd7 100644 --- a/doc/validation_framework.md +++ b/doc/validation_framework.md @@ -66,8 +66,39 @@ The `tests/` directory contains test implementations organized by scenario type: ### Prerequisites +#### 1. Build Media Transport Library First (CRITICAL) + +**⚠️ IMPORTANT**: The MTL library must be built before running validation tests! + +The tests require the RxTxApp binary and other MTL components. Follow these steps: + +```bash +# 1. Install build dependencies (see doc/build.md for your OS) +sudo apt-get update +sudo apt-get install git gcc meson python3 python3-pip pkg-config libnuma-dev libjson-c-dev libpcap-dev libgtest-dev libssl-dev +sudo pip install pyelftools ninja + +# 2. Build DPDK (required dependency) +git clone https://github.com/DPDK/dpdk.git +cd dpdk +git checkout v25.03 +git switch -c v25.03 +git am /path/to/Media-Transport-Library/patches/dpdk/25.03/*.patch +meson setup build +ninja -C build +sudo ninja install -C build +cd .. + +# 3. Build MTL +cd Media-Transport-Library +./build.sh +``` + +For complete build instructions, see [doc/build.md](build.md). + +#### 2. Other Prerequisites + - Python 3.9 or higher -- Media Transport Library built and installed - Test media files (currently maintained on NFS) - Network interfaces as specified in MTL's run.md document (VFs will be created automatically) - Root privileges or equivalent (sudo) for network operations done by script/nicctl.sh @@ -75,7 +106,7 @@ The `tests/` directory contains test implementations organized by scenario type: ### Environment Setup -1. Create a virtual environment: +1. 
Create and activate a virtual environment: ```bash cd tests/validation @@ -83,42 +114,74 @@ python3 -m venv venv source venv/bin/activate ``` +**Note**: If you're using VS Code or other development tools that auto-configure Python environments, ensure you're using the correct Python interpreter. The tests require the packages from `tests/validation/requirements.txt`. + 2. Install dependencies: ```bash pip install -r requirements.txt ``` +Verify installation: +```bash +python -m pytest --version +``` + ### Configuration -1. Update `configs/topology_config.yaml` with your network interface details: +#### Critical Configuration Steps + +1. **Update `configs/topology_config.yaml`** with your actual network interface details: ```yaml -system: - hostname: testserver - interfaces: - - name: ens801f0 - pci: "86:00.0" - ip: "192.168.108.15" - - name: ens801f1 - pci: "86:00.1" - ip: "192.168.208.15" +--- +metadata: + version: '2.4' +hosts: + - name: host + instantiate: true + role: sut + network_interfaces: + - pci_device: 8086:1592 # Update with your NIC's PCI device ID + interface_index: 0 + connections: + - ip_address: 127.0.0.1 # Use actual IP for remote hosts + connection_type: SSHConnection + connection_options: + port: 22 + username: root # Update with your username + password: None # Use key-based auth when possible + key_path: /home/user/.ssh/id_rsa # Update path to your SSH key ``` -2. Update `configs/test_config.yaml` with your test environment settings: +**To find your PCI device ID**: `lspci | grep Ethernet` + +2. **Update `configs/test_config.yaml`** with your environment paths: ```yaml -environment: - media_dir: "/path/to/test/media" - log_level: "INFO" - temp_dir: "/tmp/mtl_test" - -test_params: - timeout: 30 - retry_count: 3 - integrity_check: true +build: /path/to/Media-Transport-Library/ # Update to your MTL root directory +mtl_path: /path/to/Media-Transport-Library/ # Update to your MTL root directory +media_path: /mnt/media # Update to your test media location +capture_cfg: + enable: false # Set to true if you want packet capture + test_name: test_name + pcap_dir: /mnt/ramdisk/pcap + capture_time: 5 + interface: null # Set to interface name if capture enabled +ramdisk: + media: + mountpoint: /mnt/ramdisk/media + size_gib: 32 + pcap: + mountpoint: /mnt/ramdisk/pcap + size_gib: 768 ``` +**Important**: +- Set `build` and `mtl_path` to your actual MTL installation directory +- Set `media_path` to where your test media files are located +- Ensure the paths exist and are accessible + ## Running Tests ### Basic Usage @@ -225,10 +288,43 @@ To add new functionality to the framework: ### Common Issues -- **Network Interface Problems**: Ensure interfaces are properly configured and have the correct IP addresses -- **Permission Issues**: Many tests require root privileges for network operations -- **Media File Access**: Verify that test media files are available and accessible -- **Test Timeouts**: Increase timeout values in test_config.yaml for slower systems +#### RxTxApp Command Not Found +**Error**: `sudo: ./tests/tools/RxTxApp/build/RxTxApp: command not found` +**Solution**: The MTL library hasn't been built yet. Follow the build instructions in the Prerequisites section above or see [doc/build.md](build.md). 
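After building, a quick sanity check confirms the binary the tests invoke is actually present (the path is the one from the error message above, relative to the repository root; adjust it if your build layout differs):

```bash
# Verify the RxTxApp binary exists and is executable
ls -l ./tests/tools/RxTxApp/build/RxTxApp
```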
+ +#### Virtual Environment Issues +**Problem**: Package installation conflicts or wrong Python interpreter +**Solution**: +```bash +# Remove existing venv and recreate +rm -rf venv +python3 -m venv venv +source venv/bin/activate +pip install -r requirements.txt +``` + +#### Configuration File Issues +**Problem**: Tests fail with connection or path errors +**Solution**: +- Verify `configs/test_config.yaml` has correct paths (especially `build` and `mtl_path`) +- Update `configs/topology_config.yaml` with actual network interface details +- Use `lspci | grep Ethernet` to find your PCI device IDs + +#### Network Interface Problems +**Problem**: Interface configuration errors +**Solution**: Ensure interfaces are properly configured and have the correct IP addresses + +#### Permission Issues +**Problem**: Network operation failures +**Solution**: Many tests require root privileges for network operations. Run with appropriate sudo permissions. + +#### Media File Access +**Problem**: Media files not found +**Solution**: Verify that test media files are available and accessible at the path specified in `media_path` + +#### Test Timeouts +**Problem**: Tests timing out on slower systems +**Solution**: Increase timeout values in test_config.yaml for slower systems ### Debugging Tests diff --git a/tests/validation/README.md b/tests/validation/README.md index 1d8c59870..809e1724d 100644 --- a/tests/validation/README.md +++ b/tests/validation/README.md @@ -9,7 +9,7 @@ The Media Transport Library (MTL) Validation Framework provides comprehensive te ### Prerequisites - Python 3.9 or higher -- Media Transport Library built and installed +- **⚠️ CRITICAL**: Media Transport Library built and installed (see [build instructions](../../doc/build.md)) - Test media files (typically on NFS) - Network interfaces configured for testing - Root privileges for network operations @@ -17,7 +17,14 @@ The Media Transport Library (MTL) Validation Framework provides comprehensive te ### Setup in 3 Simple Steps -1. **Create a virtual environment and install dependencies**: +1. **Ensure MTL is built first** (if not done already): + ```bash + cd /path/to/Media-Transport-Library + ./build.sh + ``` + See [detailed build instructions](../../doc/build.md) if needed. + +2. **Create a virtual environment and install dependencies**: ```bash cd tests/validation python3 -m venv venv @@ -25,11 +32,12 @@ The Media Transport Library (MTL) Validation Framework provides comprehensive te pip install -r requirements.txt ``` -2. **Configure your environment**: +3. **Configure your environment**: - Update network interfaces in `configs/topology_config.yaml` - - Set media paths and test parameters in `configs/test_config.yaml` + - Set correct paths in `configs/test_config.yaml` (especially `build` and `mtl_path`) + - Ensure media files are accessible at `media_path` -3. **Run tests**: +4. 
**Run tests**: ```bash # Run smoke tests (quick validation) python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke From f9b22e301194f07cdbdee96e603ee71ad403794e Mon Sep 17 00:00:00 2001 From: KarolinaPomian Date: Wed, 24 Sep 2025 08:46:42 +0000 Subject: [PATCH 06/15] Docs: Improve validation config setup instructions - Add critical warnings about placeholder values that must be updated - Add step-by-step setup instructions for both config files - Include commands to find PCI device IDs and verify SSH keys - Highlight required updates with warning symbols Addresses configuration confusion identified during new developer testing. --- tests/validation/configs/README.md | 65 +++++++++++++++++++++++++----- 1 file changed, 56 insertions(+), 9 deletions(-) diff --git a/tests/validation/configs/README.md b/tests/validation/configs/README.md index e94f2019f..b0b56c1f0 100644 --- a/tests/validation/configs/README.md +++ b/tests/validation/configs/README.md @@ -2,6 +2,10 @@ This directory contains configuration files for the Media Transport Library validation test suite. These files define the test environment, network topology, and test parameters. +## ⚠️ Critical Setup Required + +**BEFORE RUNNING TESTS**: You must update the placeholder values in these configuration files with your actual system details. Tests will fail with default placeholder values. + ## Configuration Files ### test_config.yaml @@ -9,9 +13,9 @@ This directory contains configuration files for the Media Transport Library vali This file contains general test environment settings: ```yaml -build: /path/to/mtl/build -mtl_path: /path/to/mtl -media_path: /mnt/media +build: MTL_PATH_PLACEHOLDER # ⚠️ UPDATE: Path to your MTL installation +mtl_path: MTL_PATH_PLACEHOLDER # ⚠️ UPDATE: Same as build path +media_path: /mnt/media # ⚠️ UPDATE: Path to your test media files capture_cfg: enable: false test_name: test_name @@ -30,8 +34,27 @@ ramdisk: #### Key Parameters - **build**: Path to the Media Transport Library build directory -- **mtl_path**: Path to the Media Transport Library installation +- **mtl_path**: Path to the Media Transport Library installation - **media_path**: Path to the directory containing test media files + +#### ⚠️ Setup Instructions + +1. **Replace `MTL_PATH_PLACEHOLDER`** with your actual MTL installation path: + ```bash + # Example: if MTL is in /home/user/Media-Transport-Library/ + build: /home/user/Media-Transport-Library/ + mtl_path: /home/user/Media-Transport-Library/ + ``` + +2. **Update `media_path`** to point to your test media files location + +3. 
**Verify the paths exist**: + ```bash + ls /path/to/your/Media-Transport-Library/build + ls /path/to/your/media/files/ + ``` + +#### Other Parameters - **capture_cfg**: Network packet capture configuration - **enable**: Enable/disable packet capture - **test_name**: Name prefix for capture files @@ -57,18 +80,42 @@ hosts: instantiate: true role: sut network_interfaces: - - pci_device: 8086:1592 + - pci_device: 8086:1592 # ⚠️ UPDATE: Your NIC's PCI device ID interface_index: 0 # all connections: - - ip_address: 192.168.1.100 + - ip_address: IP_ADDRESS_PLACEHOLDER # ⚠️ UPDATE: Your system IP connection_type: SSHConnection connection_options: - port: 22 - username: user + port: SSH_PORT_PLACEHOLDER # ⚠️ UPDATE: SSH port (usually 22) + username: USERNAME_PLACEHOLDER # ⚠️ UPDATE: Your username password: None - key_path: /path/to/ssh/key + key_path: KEY_PATH_PLACEHOLDER # ⚠️ UPDATE: Path to your SSH key ``` +#### ⚠️ Setup Instructions + +1. **Find your PCI device ID**: + ```bash + lspci | grep Ethernet + # Look for output like: 86:00.0 Ethernet controller: Intel Corporation... + # Use format: 8086:XXXX (8086 = Intel vendor ID) + ``` + +2. **Update placeholder values**: + ```yaml + # Replace placeholders with actual values: + ip_address: 127.0.0.1 # For localhost, or your actual IP + port: 22 # SSH port + username: your_actual_user # Your username + key_path: /home/your_user/.ssh/id_rsa # Path to your SSH key + ``` + +3. **Verify SSH key exists**: + ```bash + ls -la ~/.ssh/id_rsa + # If missing, generate one: ssh-keygen -t rsa -b 4096 + ``` + #### Topology Parameters - **metadata.version**: Configuration format version From a267ab4b0f1536dab2e1bc9b07ea95ff5c4ce805 Mon Sep 17 00:00:00 2001 From: KarolinaPomian Date: Wed, 24 Sep 2025 09:30:20 +0000 Subject: [PATCH 07/15] docs: Comprehensive MTL validation framework documentation improvements - Add interactive paths and clear directory requirements for all commands - Create validation_quickstart.md for rapid setup with step-by-step guide - Enhance validation_framework.md with root user requirements and device options - Add gen_frames.sh usage documentation with supported formats - Include specific test parameter examples and VF creation instructions - Update tests/validation/README.md with clickable config file links - Add comprehensive troubleshooting and multiple device specification methods - Improve documentation hierarchy following repository standards Addresses developer feedback for clearer setup instructions and eliminates common configuration issues that block new developers. --- .gitignore | 2 + doc/validation_framework.md | 94 ++++++++++++++++++++++++++++++---- doc/validation_quickstart.md | 99 ++++++++++++++++++++++++++++++++++++ tests/validation/README.md | 25 +++++---- 4 files changed, 202 insertions(+), 18 deletions(-) create mode 100644 doc/validation_quickstart.md diff --git a/.gitignore b/.gitignore index f24e106f5..1423133e1 100644 --- a/.gitignore +++ b/.gitignore @@ -108,3 +108,5 @@ tools/readpcap/readpcap_* swig openh264 level-zero-* + +venv* \ No newline at end of file diff --git a/doc/validation_framework.md b/doc/validation_framework.md index b75016bd7..0ee15e1f9 100644 --- a/doc/validation_framework.md +++ b/doc/validation_framework.md @@ -2,6 +2,8 @@ The Media Transport Library (MTL) Validation Framework provides comprehensive testing capabilities for various aspects of the MTL, including protocol compliance, performance, and integration testing. 
+> **🚀 Quick Start**: For rapid setup, see [Validation Quick Start Guide](validation_quickstart.md) + ## Overview The validation framework uses pytest to organize and execute tests across various scenarios, protocols, and backend implementations. It supports both automated testing in CI/CD environments and manual testing for development and troubleshooting. @@ -32,6 +34,26 @@ The `common/` directory contains shared utilities that provide fundamental funct - **Integrity Tools**: Provides functions for data integrity verification between source and received media - **Network Interface Control**: Manages network interfaces required for testing +#### gen_frames.sh + +A shell script for generating test frames for video testing: + +- Creates test patterns in various formats +- Supports different resolutions and frame rates +- Configurable color patterns and test signals + +**Usage**: +```bash +cd tests/validation/common +./gen_frames.sh +``` + +**Supported Formats**: +- Resolutions: 3840x2160, 1920x1080, 1280x720, 640x360 +- Pixel formats: yuv422p, yuv422p10le +- Custom color patterns and test signals with timestamps +- Configurable frame rates and durations + ### Configuration Files The `configs/` directory contains YAML files that specify: @@ -39,6 +61,10 @@ The `configs/` directory contains YAML files that specify: - **Test Environment Settings**: Hardware specifications, media paths, and test parameters - **Network Topology**: Interface configuration, IP addressing, and routing information +#### [test_config.yaml](../tests/validation/configs/test_config.yaml) + +Defines the test execution environment: + ### MTL Engine The `mtl_engine/` directory contains the core components of the framework: @@ -106,20 +132,26 @@ For complete build instructions, see [doc/build.md](build.md). ### Environment Setup -1. Create and activate a virtual environment: +> **⚠️ IMPORTANT**: Run all commands in the `tests/validation/` directory + +1. Create and activate a Python virtual environment: ```bash -cd tests/validation +cd tests/validation # Must be in this directory! python3 -m venv venv source venv/bin/activate ``` **Note**: If you're using VS Code or other development tools that auto-configure Python environments, ensure you're using the correct Python interpreter. The tests require the packages from `tests/validation/requirements.txt`. -2. Install dependencies: +2. Install required dependencies: ```bash +# Main framework requirements (run in tests/validation/) pip install -r requirements.txt + +# Additional integrity test components (optional but recommended) +pip install -r common/integrity/requirements.txt ``` Verify installation: @@ -131,7 +163,7 @@ python -m pytest --version #### Critical Configuration Steps -1. **Update `configs/topology_config.yaml`** with your actual network interface details: +1. 
**Update [`configs/topology_config.yaml`](../tests/validation/configs/topology_config.yaml)** with your actual network interface details: ```yaml --- @@ -149,14 +181,25 @@ hosts: connection_type: SSHConnection connection_options: port: 22 - username: root # Update with your username + username: root # ⚠️ MUST be root for MTL validation password: None # Use key-based auth when possible - key_path: /home/user/.ssh/id_rsa # Update path to your SSH key + key_path: /root/.ssh/id_rsa # Update path to your SSH key ``` -**To find your PCI device ID**: `lspci | grep Ethernet` +**Device Specification Options**: +- **PCI device ID** (recommended): Find with `lspci | grep Ethernet` → use format like "0000:18:00.0" +- **System interface name**: Find with `ip link show` → use format like "enp24s0f0" + +**To find your options**: +```bash +# Find PCI device IDs +lspci | grep Ethernet + +# Find system interface names +ip link show +``` -2. **Update `configs/test_config.yaml`** with your environment paths: +2. **Update [`configs/test_config.yaml`](../tests/validation/configs/test_config.yaml)** with your environment paths: ```yaml build: /path/to/Media-Transport-Library/ # Update to your MTL root directory @@ -182,14 +225,35 @@ ramdisk: - Set `media_path` to where your test media files are located - Ensure the paths exist and are accessible +#### Optional: Create VFs for Advanced Testing + +For NIC testing with Virtual Functions: + +```bash +# First, identify your network devices +lspci | grep Ethernet + +# Create VFs (replace with your actual PCI device IDs or interface names) +sudo ./script/nicctl.sh create_vf "0000:18:00.0" # Replace with your primary port +sudo ./script/nicctl.sh create_vf "0000:18:00.1" # Replace with your secondary port +``` + +**Examples of valid identifiers**: +- PCI device ID: `"0000:18:00.0"` +- Interface name: `"enp24s0f0"` +- Environment variables: `"${TEST_PF_PORT_P}"` (if you set them) + ## Running Tests -### Basic Usage +> **⚠️ CRITICAL**: Tests must be run as **root user**, not regular user. MTL validation framework requires root privileges for network operations. + +### Basic Test Execution Run all tests: ```bash cd tests/validation +source venv/bin/activate # Activate virtual environment python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml ``` @@ -199,6 +263,18 @@ Run smoke tests: python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke ``` +### Running Specific Tests with Parameters + +Run a specific test case with custom parameters: + +```bash +pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]" +``` + +```bash +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke +``` + Run specific test modules: ```bash diff --git a/doc/validation_quickstart.md b/doc/validation_quickstart.md new file mode 100644 index 000000000..b6c2d4a96 --- /dev/null +++ b/doc/validation_quickstart.md @@ -0,0 +1,99 @@ +# MTL Validation Framework - Quick Start Guide + +This quick start guide helps you get the MTL validation framework running with minimal setup. For detailed information, see the [complete validation framework documentation](validation_framework.md). + +## Prerequisites (Must Complete First!) + +1. 
**🏗️ Build MTL** (CRITICAL - tests will fail without this): + ```bash + cd /path/to/Media-Transport-Library + ./build.sh + ``` + > If this fails, see [detailed build instructions](build.md) + +2. **📋 Basic Requirements**: + - Python 3.9+ + - Root user access (MTL validation requires root privileges) + - Network interfaces configured for testing + +## Quick Setup (3 steps) + +### 1. Install Dependencies +**Run in tests/validation directory**: +```bash +cd tests/validation +python3 -m venv venv +source venv/bin/activate +pip install -r requirements.txt # Main framework requirements +pip install -r common/integrity/requirements.txt # Integrity test components +``` + +### 2. Configure Environment +Update two key files: + +**[tests/validation/configs/topology_config.yaml](../tests/validation/configs/topology_config.yaml)**: +```yaml +# Key settings to update: +username: root # Must be root for MTL operations +key_path: /root/.ssh/id_rsa # Your SSH key path +``` + +**[tests/validation/configs/test_config.yaml](../tests/validation/configs/test_config.yaml)**: +```yaml +# Replace MTL_PATH_PLACEHOLDER with your actual paths: +build: /home/gta/Media-Transport-Library/ +mtl_path: /home/gta/Media-Transport-Library/ +``` + +### 3. Run Tests +**Basic smoke test**: +```bash +cd tests/validation +source venv/bin/activate +python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke -v +``` + +**Run specific test with parameters**: +```bash +pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]" +``` + +## Optional: Create VFs for Advanced Testing + +If you need VFs for NIC testing: +```bash +# Find your network device first +lspci | grep Ethernet + +# Create VFs (replace with your device identifier) +sudo ./script/nicctl.sh create_vf ${TEST_PF_PORT_P} +sudo ./script/nicctl.sh create_vf ${TEST_PF_PORT_R} +``` + +## Quick Troubleshooting + +| Error | Solution | +|-------|----------| +| `RxTxApp: command not found` | Build MTL first with `./build.sh` | +| `Permission denied` | Use root user (not regular user) | +| `Config path errors` | Update placeholder paths in config files | + +## Generate Test Media (Optional) + +For video testing, generate test frames: +```bash +cd tests/validation/common +./gen_frames.sh +``` + +The script supports: +- Multiple resolutions (3840x2160, 1920x1080, 1280x720, 640x360) +- Different pixel formats (yuv422p, yuv422p10le) +- Configurable color patterns and test signals +- Various frame rates + +--- + +**Need more details?** → [Complete Documentation](validation_framework.md) +**Build issues?** → [Build Guide](build.md) +**Configuration help?** → [Configuration Guide](configuration_guide.md) \ No newline at end of file diff --git a/tests/validation/README.md b/tests/validation/README.md index 809e1724d..7b4e62cd0 100644 --- a/tests/validation/README.md +++ b/tests/validation/README.md @@ -2,9 +2,11 @@ The Media Transport Library (MTL) Validation Framework provides comprehensive testing capabilities for various aspects of the MTL, including protocol compliance, performance, and integration testing. 
-> **For detailed documentation, please refer to [the main validation framework documentation](/doc/validation_framework.md)** +> **📖 For detailed documentation, please refer to [the main validation framework documentation](../../doc/validation_framework.md)** +> +> **🚀 Quick Start**: See [Validation Quick Start Guide](../../doc/validation_quickstart.md) -## Quick Start Guide +## Quick Setup ### Prerequisites @@ -12,7 +14,7 @@ The Media Transport Library (MTL) Validation Framework provides comprehensive te - **⚠️ CRITICAL**: Media Transport Library built and installed (see [build instructions](../../doc/build.md)) - Test media files (typically on NFS) - Network interfaces configured for testing -- Root privileges for network operations +- **Root privileges required** (MTL validation must run as root user) - FFmpeg and GStreamer plugins (for integration tests) ### Setup in 3 Simple Steps @@ -24,27 +26,32 @@ The Media Transport Library (MTL) Validation Framework provides comprehensive te ``` See [detailed build instructions](../../doc/build.md) if needed. -2. **Create a virtual environment and install dependencies**: +2. **Create virtual environment and install dependencies** (run in `tests/validation/`): ```bash - cd tests/validation + cd tests/validation # Must be in this directory! python3 -m venv venv source venv/bin/activate - pip install -r requirements.txt + pip install -r requirements.txt # Main framework requirements + pip install -r common/integrity/requirements.txt # Integrity test components ``` 3. **Configure your environment**: - - Update network interfaces in `configs/topology_config.yaml` - - Set correct paths in `configs/test_config.yaml` (especially `build` and `mtl_path`) + - Update network interfaces in [`configs/topology_config.yaml`](configs/topology_config.yaml) + - Set correct paths in [`configs/test_config.yaml`](configs/test_config.yaml) (especially `build` and `mtl_path`) - Ensure media files are accessible at `media_path` + - **Use root user** in topology_config.yaml (not regular user) 4. 
**Run tests**: ```bash - # Run smoke tests (quick validation) + # Run smoke tests (quick validation) - MUST be run as root python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke # Run specific test module python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml tests/single/st20p/test_st20p_rx.py + # Run specific test with parameters + pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]" + # Generate HTML report python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke --template=html/index.html --report=report.html ``` From facca624053d68931be17edab97564c8d2a6f458 Mon Sep 17 00:00:00 2001 From: KarolinaPomian Date: Wed, 24 Sep 2025 10:00:39 +0000 Subject: [PATCH 08/15] docs: address validation documentation clarity issues - Fix directory specification: clarify tests/validation/ must be current dir for pip install - Add gen_frames.sh usage details: specify directory and supported formats - Add interactive paths for test_config.yaml and topology_config.yaml - Add device specification options including system name method - Add comprehensive pytest parameter execution examples - Emphasize root user requirement more clearly throughout - Clean up duplicated device specification sections - Add clearer file location information for both config files --- doc/validation_framework.md | 53 ++++++++++++++++++++++++++++++++---- doc/validation_quickstart.md | 11 ++++---- 2 files changed, 53 insertions(+), 11 deletions(-) diff --git a/doc/validation_framework.md b/doc/validation_framework.md index 0ee15e1f9..88b1bba64 100644 --- a/doc/validation_framework.md +++ b/doc/validation_framework.md @@ -41,11 +41,13 @@ A shell script for generating test frames for video testing: - Creates test patterns in various formats - Supports different resolutions and frame rates - Configurable color patterns and test signals +- Generates files like `ParkJoy_1080p.yuv`, test patterns, and various resolution formats **Usage**: ```bash -cd tests/validation/common -./gen_frames.sh +cd tests/validation/common # Must be in this directory +./gen_frames.sh # Generates test media files for validation +# Generated files will be available for test configuration ``` **Supported Formats**: @@ -61,10 +63,27 @@ The `configs/` directory contains YAML files that specify: - **Test Environment Settings**: Hardware specifications, media paths, and test parameters - **Network Topology**: Interface configuration, IP addressing, and routing information -#### [test_config.yaml](../tests/validation/configs/test_config.yaml) +#### [`test_config.yaml`](../tests/validation/configs/test_config.yaml) + +Location: `tests/validation/configs/test_config.yaml` Defines the test execution environment: +**Key Parameters**: + - **build**: Path to MTL build directory + - **mtl_path**: Path to MTL installation directory + - **media_path**: Path to test media files directory + - **ramdisk.media.mountpoint**: Mount point for media RAM disk + - **ramdisk.media.size_gib**: Size of media RAM disk in GiB + - **ramdisk.pcap.mountpoint**: Mount point for packet capture RAM disk + - **ramdisk.pcap.size_gib**: Size of packet capture RAM disk in GiB + +#### [`topology_config.yaml`](../tests/validation/configs/topology_config.yaml) + +Location: `tests/validation/configs/topology_config.yaml` + +Defines 
the network topology and host configuration. + ### MTL Engine The `mtl_engine/` directory contains the core components of the framework: @@ -187,10 +206,13 @@ hosts: ``` **Device Specification Options**: -- **PCI device ID** (recommended): Find with `lspci | grep Ethernet` → use format like "0000:18:00.0" -- **System interface name**: Find with `ip link show` → use format like "enp24s0f0" +You can specify network devices in multiple ways: +- **PCI device ID** (recommended): `"0000:18:00.0"` (find with `lspci | grep Ethernet`) +- **Interface name**: `"enp175s0f0np0"` (find with `ip link show`) +- **System name**: Use your actual system hostname in the `name` field for the host +- **Environment variables**: `"${TEST_PF_PORT_P}"` (if you set them) -**To find your options**: +**To find your device options**: ```bash # Find PCI device IDs lspci | grep Ethernet @@ -249,6 +271,25 @@ sudo ./script/nicctl.sh create_vf "0000:18:00.1" # Replace with your secondary ### Basic Test Execution +**⚠️ CRITICAL**: All tests must be run as **root user**. Regular users will fail. + +#### Run specific test with parameters: + +**Examples of running tests with specific parameters**: +```bash +# Run fps test with specific parameters +pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]" + +# Run specific integrity test with resolution parameters +pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/dual/st20p/integrity/test_integrity.py::test_integrity[yuv422p10le-1920x1080]" + +# Run specific packing test +pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/dual/st20p/packing/test_packing.py::test_packing[bpm-10]" + +# Run audio format test with specific format +pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/dual/st30p/st30p_format/test_st30p_format.py::test_st30p_format[pcm24]" +``` + Run all tests: ```bash diff --git a/doc/validation_quickstart.md b/doc/validation_quickstart.md index b6c2d4a96..665529046 100644 --- a/doc/validation_quickstart.md +++ b/doc/validation_quickstart.md @@ -80,17 +80,18 @@ sudo ./script/nicctl.sh create_vf ${TEST_PF_PORT_R} ## Generate Test Media (Optional) -For video testing, generate test frames: +For video testing, generate test frames (must run from specific directory): ```bash -cd tests/validation/common -./gen_frames.sh +cd tests/validation/common # Must be in this directory +./gen_frames.sh # Generates test media files ``` The script supports: - Multiple resolutions (3840x2160, 1920x1080, 1280x720, 640x360) - Different pixel formats (yuv422p, yuv422p10le) -- Configurable color patterns and test signals -- Various frame rates +- Configurable color patterns and test signals with timestamps +- Various frame rates and durations +- Generates files like `ParkJoy_1080p.yuv` used in test examples --- From 59aa3b07228f2559f1f7fc11e3a9146a5268c2be Mon Sep 17 00:00:00 2001 From: KarolinaPomian Date: Wed, 24 Sep 2025 10:04:05 +0000 Subject: [PATCH 09/15] docs: add interactive paths to config files in configs README - Add clickable markdown links to test_config.yaml and topology_config.yaml - Include clear file location paths for easy navigation - Improve accessibility for developers navigating configuration files --- tests/validation/configs/README.md | 8 ++++++-- 1 file changed, 6 insertions(+), 2 deletions(-) diff --git 
a/tests/validation/configs/README.md b/tests/validation/configs/README.md index b0b56c1f0..9072f4c92 100644 --- a/tests/validation/configs/README.md +++ b/tests/validation/configs/README.md @@ -8,7 +8,9 @@ This directory contains configuration files for the Media Transport Library vali ## Configuration Files -### test_config.yaml +### [`test_config.yaml`](test_config.yaml) + +**File Location**: `tests/validation/configs/test_config.yaml` This file contains general test environment settings: @@ -67,7 +69,9 @@ ramdisk: - **pcap.mountpoint**: Mount point for packet capture RAM disk - **pcap.size_gib**: Size of packet capture RAM disk in GiB -### topology_config.yaml +### [`topology_config.yaml`](topology_config.yaml) + +**File Location**: `tests/validation/configs/topology_config.yaml` This file defines the network topology for testing: From a1f759f591e40f27ae3a46fc10f4d36548e51f96 Mon Sep 17 00:00:00 2001 From: KarolinaPomian Date: Wed, 24 Sep 2025 10:10:54 +0000 Subject: [PATCH 10/15] docs: fix markdown linting errors - Remove duplicate 'Test Framework Structure' heading - Fix list indentation (use 0 spaces instead of 2) - Remove trailing spaces throughout documentation - Fix trailing punctuation in heading (remove colon) - Rename duplicate 'Setup Instructions' heading to 'Topology Setup Instructions' - Ensure consistent markdown formatting across all documentation files Resolves all MD024, MD007, MD009, and MD026 linting issues. --- doc/validation_framework.md | 30 ++++++++++++------------------ tests/validation/README.md | 2 +- tests/validation/configs/README.md | 16 ++++++++-------- 3 files changed, 21 insertions(+), 27 deletions(-) diff --git a/doc/validation_framework.md b/doc/validation_framework.md index 88b1bba64..5f022a345 100644 --- a/doc/validation_framework.md +++ b/doc/validation_framework.md @@ -12,10 +12,6 @@ The validation framework uses pytest to organize and execute tests across variou The validation framework is organized into the following main components: -## Test Framework Structure - -The validation framework is organized into the following main components: - - **common/**: Shared utilities for test functionality, including FFmpeg handlers, integrity verification tools, and network interface control - **configs/**: Configuration files for test environment and network topology - **mtl_engine/**: Core test framework components that manage test execution, application interfaces, and result reporting @@ -70,13 +66,13 @@ Location: `tests/validation/configs/test_config.yaml` Defines the test execution environment: **Key Parameters**: - - **build**: Path to MTL build directory - - **mtl_path**: Path to MTL installation directory - - **media_path**: Path to test media files directory - - **ramdisk.media.mountpoint**: Mount point for media RAM disk - - **ramdisk.media.size_gib**: Size of media RAM disk in GiB - - **ramdisk.pcap.mountpoint**: Mount point for packet capture RAM disk - - **ramdisk.pcap.size_gib**: Size of packet capture RAM disk in GiB +- **build**: Path to MTL build directory +- **mtl_path**: Path to MTL installation directory +- **media_path**: Path to test media files directory +- **ramdisk.media.mountpoint**: Mount point for media RAM disk +- **ramdisk.media.size_gib**: Size of media RAM disk in GiB +- **ramdisk.pcap.mountpoint**: Mount point for packet capture RAM disk +- **ramdisk.pcap.size_gib**: Size of packet capture RAM disk in GiB #### [`topology_config.yaml`](../tests/validation/configs/topology_config.yaml) @@ -242,7 +238,7 @@ ramdisk: size_gib: 768 
``` -**Important**: +**Important**: - Set `build` and `mtl_path` to your actual MTL installation directory - Set `media_path` to where your test media files are located - Ensure the paths exist and are accessible @@ -267,13 +263,11 @@ sudo ./script/nicctl.sh create_vf "0000:18:00.1" # Replace with your secondary ## Running Tests -> **⚠️ CRITICAL**: Tests must be run as **root user**, not regular user. MTL validation framework requires root privileges for network operations. - -### Basic Test Execution +> **⚠️ CRITICAL**: Tests must be run as **root user**, not regular user. MTL validation framework requires root privileges for network operations.### Basic Test Execution **⚠️ CRITICAL**: All tests must be run as **root user**. Regular users will fail. -#### Run specific test with parameters: +### Run specific test with parameters **Examples of running tests with specific parameters**: ```bash @@ -411,7 +405,7 @@ To add new functionality to the framework: #### Virtual Environment Issues **Problem**: Package installation conflicts or wrong Python interpreter -**Solution**: +**Solution**: ```bash # Remove existing venv and recreate rm -rf venv @@ -422,7 +416,7 @@ pip install -r requirements.txt #### Configuration File Issues **Problem**: Tests fail with connection or path errors -**Solution**: +**Solution**: - Verify `configs/test_config.yaml` has correct paths (especially `build` and `mtl_path`) - Update `configs/topology_config.yaml` with actual network interface details - Use `lspci | grep Ethernet` to find your PCI device IDs diff --git a/tests/validation/README.md b/tests/validation/README.md index 7b4e62cd0..9d7245760 100644 --- a/tests/validation/README.md +++ b/tests/validation/README.md @@ -3,7 +3,7 @@ The Media Transport Library (MTL) Validation Framework provides comprehensive testing capabilities for various aspects of the MTL, including protocol compliance, performance, and integration testing. > **📖 For detailed documentation, please refer to [the main validation framework documentation](../../doc/validation_framework.md)** -> +> > **🚀 Quick Start**: See [Validation Quick Start Guide](../../doc/validation_quickstart.md) ## Quick Setup diff --git a/tests/validation/configs/README.md b/tests/validation/configs/README.md index 9072f4c92..fb92d915b 100644 --- a/tests/validation/configs/README.md +++ b/tests/validation/configs/README.md @@ -25,7 +25,7 @@ capture_cfg: capture_time: 5 interface: null ramdisk: - media: + media: mountpoint: /mnt/ramdisk/media size_gib: 32 pcap: @@ -33,13 +33,13 @@ ramdisk: size_gib: 768 ``` -#### Key Parameters +### Key Parameters - **build**: Path to the Media Transport Library build directory -- **mtl_path**: Path to the Media Transport Library installation +- **mtl_path**: Path to the Media Transport Library installation - **media_path**: Path to the directory containing test media files -#### ⚠️ Setup Instructions +### ⚠️ Setup Instructions 1. **Replace `MTL_PATH_PLACEHOLDER`** with your actual MTL installation path: ```bash @@ -56,7 +56,7 @@ ramdisk: ls /path/to/your/media/files/ ``` -#### Other Parameters +### Other Parameters - **capture_cfg**: Network packet capture configuration - **enable**: Enable/disable packet capture - **test_name**: Name prefix for capture files @@ -96,7 +96,7 @@ hosts: key_path: KEY_PATH_PLACEHOLDER # ⚠️ UPDATE: Path to your SSH key ``` -#### ⚠️ Setup Instructions +### Topology Setup Instructions 1. 
**Find your PCI device ID**: ```bash @@ -120,7 +120,7 @@ hosts: # If missing, generate one: ssh-keygen -t rsa -b 4096 ``` -#### Topology Parameters +### Topology Parameters - **metadata.version**: Configuration format version - **hosts**: List of hosts in the test topology @@ -166,7 +166,7 @@ def test_with_custom_config(config): # Override configuration for this test config.capture_cfg.enable = True config.capture_cfg.interface = "enp1s0f0" - + # Run test with modified configuration # ... ``` From 397411263d346e719288b80ec4d0745cb6723f1e Mon Sep 17 00:00:00 2001 From: KarolinaPomian <108665762+KarolinaPomian@users.noreply.github.com> Date: Wed, 24 Sep 2025 11:53:14 +0200 Subject: [PATCH 11/15] Update tests/validation/configs/README.md --- tests/validation/configs/README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/tests/validation/configs/README.md b/tests/validation/configs/README.md index fb92d915b..24db84745 100644 --- a/tests/validation/configs/README.md +++ b/tests/validation/configs/README.md @@ -91,7 +91,7 @@ hosts: connection_type: SSHConnection connection_options: port: SSH_PORT_PLACEHOLDER # ⚠️ UPDATE: SSH port (usually 22) - username: USERNAME_PLACEHOLDER # ⚠️ UPDATE: Your username + username: USERNAME_PLACEHOLDER # ⚠️ UPDATE: root password: None key_path: KEY_PATH_PLACEHOLDER # ⚠️ UPDATE: Path to your SSH key ``` From 298c449076436e301cdb722f0bd5c94ada464218 Mon Sep 17 00:00:00 2001 From: KarolinaPomian Date: Wed, 24 Sep 2025 11:35:03 +0000 Subject: [PATCH 12/15] docs: update validation framework and quick start guide with RxTxApp build instructions and troubleshooting tips --- doc/validation_framework.md | 156 +++++++++++++++++++++++++++-------- doc/validation_quickstart.md | 68 ++++++++------- tests/validation/README.md | 49 ++++------- 3 files changed, 173 insertions(+), 100 deletions(-) diff --git a/doc/validation_framework.md b/doc/validation_framework.md index 5f022a345..06b78fcba 100644 --- a/doc/validation_framework.md +++ b/doc/validation_framework.md @@ -2,7 +2,13 @@ The Media Transport Library (MTL) Validation Framework provides comprehensive testing capabilities for various aspects of the MTL, including protocol compliance, performance, and integration testing. -> **🚀 Quick Start**: For rapid setup, see [Validation Quick Start Guide](validation_quickstart.md) +## Documentation Navigation + +🚀 **Quick Setup**: [Validation Quick Start Guide](validation_quickstart.md) - Get running in 3 steps +📁 **Local README**: [tests/validation/README.md](../tests/validation/README.md) - Quick reference and test categories +🔧 **Build Guide**: [build.md](build.md) - MTL build instructions + +--- ## Overview @@ -39,6 +45,8 @@ A shell script for generating test frames for video testing: - Configurable color patterns and test signals - Generates files like `ParkJoy_1080p.yuv`, test patterns, and various resolution formats +**Prerequisites**: Requires FFmpeg with text filters enabled. + **Usage**: ```bash cd tests/validation/common # Must be in this directory @@ -46,6 +54,22 @@ cd tests/validation/common # Must be in this directory # Generated files will be available for test configuration ``` +**Troubleshooting**: If you get "No such filter: 'drawtext'" errors, install a complete FFmpeg build or skip media generation. + +#### RxTxApp Test Tool + +**CRITICAL**: Tests require the RxTxApp tool which is not built by the main MTL build process. 
+ +**Build Instructions** (required before running tests): +```bash +cd tests/tools/RxTxApp +meson setup build +meson compile -C build +cd ../../.. +``` + +**Location**: After building, RxTxApp is available at `tests/tools/RxTxApp/build/RxTxApp` + **Supported Formats**: - Resolutions: 3840x2160, 1920x1080, 1280x720, 640x360 - Pixel formats: yuv422p, yuv422p10le @@ -133,8 +157,23 @@ cd .. # 3. Build MTL cd Media-Transport-Library ./build.sh + +# 4. Install MTL system-wide (REQUIRED for RxTxApp) +sudo ninja install -C build +sudo ldconfig + +# 5. Build required test tools (CRITICAL for validation) +cd tests/tools/RxTxApp +meson setup build +meson compile -C build +cd ../../.. ``` +> **⚠️ CRITICAL**: +> - The RxTxApp tool is required for validation tests but not built by the main build process +> - RxTxApp requires MTL to be installed system-wide to build successfully +> - You must build it separately after installing MTL + For complete build instructions, see [doc/build.md](build.md). #### 2. Other Prerequisites @@ -147,33 +186,20 @@ For complete build instructions, see [doc/build.md](build.md). ### Environment Setup -> **⚠️ IMPORTANT**: Run all commands in the `tests/validation/` directory +> **🚀 Quick Setup**: See [Validation Quick Start Guide](validation_quickstart.md) for streamlined setup steps. + +For detailed setup: -1. Create and activate a Python virtual environment: +1. Create Python virtual environment in `tests/validation/`: ```bash -cd tests/validation # Must be in this directory! +cd tests/validation python3 -m venv venv source venv/bin/activate -``` - -**Note**: If you're using VS Code or other development tools that auto-configure Python environments, ensure you're using the correct Python interpreter. The tests require the packages from `tests/validation/requirements.txt`. - -2. Install required dependencies: - -```bash -# Main framework requirements (run in tests/validation/) pip install -r requirements.txt - -# Additional integrity test components (optional but recommended) pip install -r common/integrity/requirements.txt ``` -Verify installation: -```bash -python -m pytest --version -``` - ### Configuration #### Critical Configuration Steps @@ -265,8 +291,6 @@ sudo ./script/nicctl.sh create_vf "0000:18:00.1" # Replace with your secondary > **⚠️ CRITICAL**: Tests must be run as **root user**, not regular user. MTL validation framework requires root privileges for network operations.### Basic Test Execution -**⚠️ CRITICAL**: All tests must be run as **root user**. Regular users will fail. - ### Run specific test with parameters **Examples of running tests with specific parameters**: @@ -284,19 +308,9 @@ pytest --topology_config=configs/topology_config.yaml --test_config=configs/test pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/dual/st30p/st30p_format/test_st30p_format.py::test_st30p_format[pcm24]" ``` -Run all tests: +> **🚀 Quick Test Execution**: See [Quick Start Guide](validation_quickstart.md#3-run-tests) for basic test commands. 
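+
+Before launching a full run, it can help to confirm that pytest discovers the expected tests at all. A minimal sketch using pytest's standard `--collect-only` option (it assumes the virtual environment and the two config files prepared above):
+
+```bash
+# List the smoke tests that would run, without executing them
+sudo ./venv/bin/python3 -m pytest --collect-only -q -m smoke \
+  --topology_config=configs/topology_config.yaml \
+  --test_config=configs/test_config.yaml
+```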
-```bash -cd tests/validation -source venv/bin/activate # Activate virtual environment -python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -``` - -Run smoke tests: - -```bash -python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke -``` +For comprehensive test execution: ### Running Specific Tests with Parameters @@ -313,13 +327,13 @@ python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=c Run specific test modules: ```bash -python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml tests/single/st20p/test_st20p_rx.py +sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml tests/single/st20p/test_st20p_rx.py ``` Run specific test cases with parameters: ```bash -python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]" +sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]" ``` ### Test Categories @@ -429,6 +443,58 @@ pip install -r requirements.txt **Problem**: Network operation failures **Solution**: Many tests require root privileges for network operations. Run with appropriate sudo permissions. +#### Build and Setup Issues + +**Problem**: `RxTxApp: command not found` +**Solution**: Build the RxTxApp test tool separately: +```bash +cd tests/tools/RxTxApp +meson setup build +meson compile -C build +cd ../../.. +``` + +**Problem**: RxTxApp build fails with "ST20P_TX_FLAG_EXACT_USER_PACING undeclared" or other header errors +**Solution**: Install MTL system-wide before building RxTxApp: +```bash +cd /path/to/Media-Transport-Library +sudo ninja install -C build +sudo ldconfig +# Then build RxTxApp +cd tests/tools/RxTxApp +rm -rf build # Clean previous failed build +meson setup build +meson compile -C build +``` + +**Problem**: `No module named pytest` when using sudo +**Solution**: Use the virtual environment python with sudo: +```bash +# Wrong: sudo python3 -m pytest +# Correct: +sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml +``` + +**Problem**: DSA SSH key errors: `ValueError: q must be exactly 160, 224, or 256 bits long` +**Solution**: Generate new RSA SSH keys and configure SSH access: +```bash +# Generate RSA keys (as your regular user, not root) +ssh-keygen -t rsa -b 2048 -f ~/.ssh/id_rsa + +# Set up SSH access for root@localhost +ssh-copy-id -i ~/.ssh/id_rsa.pub root@localhost + +# Update topology_config.yaml to use your user's key path: +# key_path: /home/your-username/.ssh/id_rsa (not /root/.ssh/id_rsa) +``` + +**Problem**: FFmpeg `No such filter: 'drawtext'` when running gen_frames.sh +**Solution**: Install complete FFmpeg build or skip media generation: +```bash +sudo apt install ffmpeg # Full installation +# Or skip: some tests may work without generated media +``` + #### Media File Access **Problem**: Media files not found **Solution**: Verify that test media files are available and accessible at the path specified in `media_path` @@ -437,6 +503,26 @@ pip install -r requirements.txt **Problem**: Tests timing out on slower systems **Solution**: Increase timeout values in test_config.yaml for slower systems +### 
Quick Reference Tables + +#### Build Issues + +| Problem | Solution | +|---------|----------| +| `RxTxApp: command not found` | Build RxTxApp: `cd tests/tools/RxTxApp && meson setup build && meson compile -C build` | +| `MTL library not found` | Install MTL system-wide: `sudo ninja install -C build && sudo ldconfig` | +| `DSA key error: q must be exactly 160, 224, or 256 bits` | Generate RSA keys: `ssh-keygen -t rsa -b 2048 -f ~/.ssh/id_rsa` | + +#### Runtime Issues + +| Problem | Solution | +|---------|----------| +| `Permission denied` | Use root user: `sudo ./venv/bin/python3 -m pytest` | +| `No module named pytest` | Don't use `sudo python3`, use `sudo ./venv/bin/python3` | +| `Config path errors` | Update placeholder paths in config files | +| `SSH connection failed` | Ensure SSH keys are set up for root@localhost access | +| `No such filter: 'drawtext'` | Install FFmpeg with text filters or skip media generation | + ### Debugging Tests Use pytest's debug features: diff --git a/doc/validation_quickstart.md b/doc/validation_quickstart.md index 665529046..b03a35ee2 100644 --- a/doc/validation_quickstart.md +++ b/doc/validation_quickstart.md @@ -2,19 +2,17 @@ This quick start guide helps you get the MTL validation framework running with minimal setup. For detailed information, see the [complete validation framework documentation](validation_framework.md). -## Prerequisites (Must Complete First!) +## Prerequisites -1. **🏗️ Build MTL** (CRITICAL - tests will fail without this): - ```bash - cd /path/to/Media-Transport-Library - ./build.sh - ``` - > If this fails, see [detailed build instructions](build.md) +1. **🏗️ MTL Build Complete**: MTL must be built and test tools available + 👉 **[Follow complete build instructions](validation_framework.md#setup-and-installation)** 2. **📋 Basic Requirements**: - Python 3.9+ - Root user access (MTL validation requires root privileges) - Network interfaces configured for testing + - FFmpeg with text filters (for media generation) + - Compatible SSH keys (RSA recommended, not DSA) ## Quick Setup (3 steps) @@ -35,9 +33,19 @@ Update two key files: ```yaml # Key settings to update: username: root # Must be root for MTL operations -key_path: /root/.ssh/id_rsa # Your SSH key path +key_path: /home/your-username/.ssh/id_rsa # YOUR user's SSH key path (not /root/) +ip_address: 127.0.0.1 # For localhost testing +port: 22 # Standard SSH port ``` +> **⚠️ SSH Key Setup**: +> - Use your regular user's SSH keys (e.g., `/home/gta/.ssh/id_rsa`), not root's keys +> - If you get DSA key errors, generate new RSA keys: +> ```bash +> ssh-keygen -t rsa -b 2048 -f ~/.ssh/id_rsa +> ssh-copy-id -i ~/.ssh/id_rsa.pub root@localhost +> ``` + **[tests/validation/configs/test_config.yaml](../tests/validation/configs/test_config.yaml)**: ```yaml # Replace MTL_PATH_PLACEHOLDER with your actual paths: @@ -46,13 +54,15 @@ mtl_path: /home/gta/Media-Transport-Library/ ``` ### 3. Run Tests -**Basic smoke test**: +**Basic smoke test** (must run as root): ```bash cd tests/validation -source venv/bin/activate -python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke -v +# Use full path to venv python with sudo: +sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke -v ``` +> **💡 Root Execution**: Don't use `sudo python3` (uses system python). Use `sudo ./venv/bin/python3` to use the virtual environment. 
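+
+If you are unsure which interpreter a `sudo` invocation actually picks up, a one-line check (just a sketch; it prints the interpreter path and the pytest version from the venv):
+
+```bash
+# Should print .../tests/validation/venv/bin/python3 and the installed pytest version
+sudo ./venv/bin/python3 -c "import sys, pytest; print(sys.executable, pytest.__version__)"
+```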
+ **Run specific test with parameters**: ```bash pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]" @@ -72,29 +82,27 @@ sudo ./script/nicctl.sh create_vf ${TEST_PF_PORT_R} ## Quick Troubleshooting -| Error | Solution | -|-------|----------| -| `RxTxApp: command not found` | Build MTL first with `./build.sh` | -| `Permission denied` | Use root user (not regular user) | +| Common Error | Quick Solution | +|--------------|----------------| +| `RxTxApp: command not found` | [Follow build instructions](validation_framework.md#rxtxapp-test-tool) | +| `Permission denied` | Use root: `sudo ./venv/bin/python3 -m pytest` | +| `No module named pytest` | Don't use `sudo python3`, use `sudo ./venv/bin/python3` | | `Config path errors` | Update placeholder paths in config files | +| SSH/FFmpeg issues | See [detailed troubleshooting](validation_framework.md#troubleshooting) | ## Generate Test Media (Optional) -For video testing, generate test frames (must run from specific directory): -```bash -cd tests/validation/common # Must be in this directory -./gen_frames.sh # Generates test media files -``` - -The script supports: -- Multiple resolutions (3840x2160, 1920x1080, 1280x720, 640x360) -- Different pixel formats (yuv422p, yuv422p10le) -- Configurable color patterns and test signals with timestamps -- Various frame rates and durations -- Generates files like `ParkJoy_1080p.yuv` used in test examples +For video testing, you may need test media files: +👉 **[See media generation instructions](validation_framework.md#gen_framessh)** --- -**Need more details?** → [Complete Documentation](validation_framework.md) -**Build issues?** → [Build Guide](build.md) -**Configuration help?** → [Configuration Guide](configuration_guide.md) \ No newline at end of file +## Documentation Navigation + +📖 **Complete Documentation**: [Validation Framework](validation_framework.md) - Detailed information, configuration, and advanced features +🔧 **Build Issues**: [Build Guide](build.md) - MTL build instructions +⚙️ **Configuration Help**: [Configuration Guide](configuration_guide.md) - Network and environment setup + +## Summary + +This quick start guide gets you running tests in minutes. For production use, detailed configuration, or troubleshooting complex issues, refer to the complete documentation above. \ No newline at end of file diff --git a/tests/validation/README.md b/tests/validation/README.md index 9d7245760..a608c0f04 100644 --- a/tests/validation/README.md +++ b/tests/validation/README.md @@ -2,9 +2,13 @@ The Media Transport Library (MTL) Validation Framework provides comprehensive testing capabilities for various aspects of the MTL, including protocol compliance, performance, and integration testing. 
-> **📖 For detailed documentation, please refer to [the main validation framework documentation](../../doc/validation_framework.md)** -> -> **🚀 Quick Start**: See [Validation Quick Start Guide](../../doc/validation_quickstart.md) +## Documentation Navigation + +📖 **Complete Documentation**: [Main validation framework documentation](../../doc/validation_framework.md) - Detailed configuration, troubleshooting, and advanced features +🚀 **Quick Start**: [Validation Quick Start Guide](../../doc/validation_quickstart.md) - Get running in 3 steps +🔧 **Build Issues**: [Build Guide](../../doc/build.md) - MTL build instructions + +--- ## Quick Setup @@ -19,41 +23,16 @@ The Media Transport Library (MTL) Validation Framework provides comprehensive te ### Setup in 3 Simple Steps -1. **Ensure MTL is built first** (if not done already): - ```bash - cd /path/to/Media-Transport-Library - ./build.sh - ``` - See [detailed build instructions](../../doc/build.md) if needed. - -2. **Create virtual environment and install dependencies** (run in `tests/validation/`): - ```bash - cd tests/validation # Must be in this directory! - python3 -m venv venv - source venv/bin/activate - pip install -r requirements.txt # Main framework requirements - pip install -r common/integrity/requirements.txt # Integrity test components - ``` +1. **🏗️ MTL Build**: Ensure MTL and test tools are built + 👉 **[Complete build instructions](../../doc/validation_framework.md#setup-and-installation)** -3. **Configure your environment**: - - Update network interfaces in [`configs/topology_config.yaml`](configs/topology_config.yaml) - - Set correct paths in [`configs/test_config.yaml`](configs/test_config.yaml) (especially `build` and `mtl_path`) - - Ensure media files are accessible at `media_path` - - **Use root user** in topology_config.yaml (not regular user) +2. **⚡ Quick Setup**: Follow 3-step setup process + 👉 **[Quick Start Guide](../../doc/validation_quickstart.md)** -4. **Run tests**: +3. **🏃 Run Tests**: Execute validation tests ```bash - # Run smoke tests (quick validation) - MUST be run as root - python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke - - # Run specific test module - python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml tests/single/st20p/test_st20p_rx.py - - # Run specific test with parameters - pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]" - - # Generate HTML report - python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke --template=html/index.html --report=report.html + # Quick smoke test + sudo ./venv/bin/python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke ``` ## Available Tests From 00cd831a1bcce22eba7d434dfb2cae14cb5f405a Mon Sep 17 00:00:00 2001 From: KarolinaPomian Date: Thu, 6 Nov 2025 11:31:30 +0000 Subject: [PATCH 13/15] docs: clarify validation prerequisites and requirements Address PR review comments from @Sakoram: 1. Clarify test media files requirement: - Explain that input data files are necessary for testing - Note files are currently on NFS in production - Provide gen_frames.sh alternative for local testing - Show how to configure media_path 2. 
Clarify 'Network interfaces configured for testing': - Specify MTL's run.md setup is required - Explain VFs are created automatically by framework - No manual VF creation needed 3. Clarify 'Sufficient permissions for network management': - Explicitly state root user is required - Explain why (network operations via script/nicctl.sh) - Note no alternative permission model exists - Show correct sudo usage with venv 4. Add FFmpeg and GStreamer plugins requirement: - List as explicit prerequisite - Provide installation commands - Note some tests fail without them Fixes: OpenVisualCloud/Media-Transport-Library#1258 --- doc/validation_framework.md | 24 +++++++++++++++++++----- doc/validation_quickstart.md | 7 ++++--- tests/validation/README.md | 17 +++++++++++++---- 3 files changed, 36 insertions(+), 12 deletions(-) diff --git a/doc/validation_framework.md b/doc/validation_framework.md index 06b78fcba..30c74b97e 100644 --- a/doc/validation_framework.md +++ b/doc/validation_framework.md @@ -178,11 +178,25 @@ For complete build instructions, see [doc/build.md](build.md). #### 2. Other Prerequisites -- Python 3.9 or higher -- Test media files (currently maintained on NFS) -- Network interfaces as specified in MTL's run.md document (VFs will be created automatically) -- Root privileges or equivalent (sudo) for network operations done by script/nicctl.sh -- FFmpeg and GStreamer plugins installed (required for integration tests) +- **Python 3.9 or higher** +- **Test Media Files**: Input data files required for testing + - Test media files are necessary for running video, audio, and ancillary data tests + - These files are currently maintained on NFS in production environments + - For local testing, you can generate test frames using `tests/validation/common/gen_frames.sh` (see [gen_frames.sh section](#gen_framessh)) + - Configure the media file location in `configs/test_config.yaml` using the `media_path` parameter +- **Network Interfaces**: Configure interfaces according to MTL's [run.md](run.md) documentation + - Basic MTL network setup must be completed as described in run.md + - Virtual Functions (VFs) will be created automatically by the validation framework + - No manual VF creation is required +- **Root User Privileges**: MTL validation framework must run as root user + - Required for network management operations performed by `script/nicctl.sh` + - Direct network interface manipulation requires root access + - No alternative permission model is currently supported + - Use `sudo` with the full path to your virtual environment Python (e.g., `sudo ./venv/bin/python3`) +- **FFmpeg and GStreamer Plugins**: Required for integration tests + - Install FFmpeg: `sudo apt-get install ffmpeg` + - Install GStreamer and plugins: `sudo apt-get install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad` + - Some tests will fail if these are not installed ### Environment Setup diff --git a/doc/validation_quickstart.md b/doc/validation_quickstart.md index b03a35ee2..2a847096a 100644 --- a/doc/validation_quickstart.md +++ b/doc/validation_quickstart.md @@ -9,9 +9,10 @@ This quick start guide helps you get the MTL validation framework running with m 2. 
**📋 Basic Requirements**: - Python 3.9+ - - Root user access (MTL validation requires root privileges) - - Network interfaces configured for testing - - FFmpeg with text filters (for media generation) + - Root user access (MTL validation requires root privileges for network operations) + - Network interfaces configured per MTL's [run.md](run.md) (VFs created automatically) + - Test media files (see [media generation](validation_framework.md#gen_framessh) or use NFS-hosted files) + - FFmpeg and GStreamer plugins (required for integration tests) - Compatible SSH keys (RSA recommended, not DSA) ## Quick Setup (3 steps) diff --git a/tests/validation/README.md b/tests/validation/README.md index a608c0f04..3e8af5c9f 100644 --- a/tests/validation/README.md +++ b/tests/validation/README.md @@ -16,10 +16,19 @@ The Media Transport Library (MTL) Validation Framework provides comprehensive te - Python 3.9 or higher - **⚠️ CRITICAL**: Media Transport Library built and installed (see [build instructions](../../doc/build.md)) -- Test media files (typically on NFS) -- Network interfaces configured for testing -- **Root privileges required** (MTL validation must run as root user) -- FFmpeg and GStreamer plugins (for integration tests) +- **Test Media Files**: Input data files are necessary for video, audio, and ancillary data tests + - Files are currently maintained on NFS in production environments + - For local testing, generate frames using `common/gen_frames.sh` (see [documentation](../../doc/validation_framework.md#gen_framessh)) + - Configure media location in `configs/test_config.yaml` +- **Network Interfaces**: Configure according to MTL's [run.md](../../doc/run.md) documentation + - Basic MTL network setup required (see run.md) + - VFs will be created automatically by the validation framework +- **Root Privileges Required**: MTL validation must run as root user + - Required for network management operations + - No alternative permission model available + - Use `sudo ./venv/bin/python3` to run tests +- **FFmpeg and GStreamer Plugins**: Required for integration tests + - Install with: `sudo apt-get install ffmpeg gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad` ### Setup in 3 Simple Steps From a61a858f664f5e62015b8471ad27cb02d8d7f546 Mon Sep 17 00:00:00 2001 From: KarolinaPomian Date: Thu, 6 Nov 2025 11:32:02 +0000 Subject: [PATCH 14/15] docs: add usage examples to configs README Address PR review comment from @DawidWesierski4: Add comprehensive usage examples section showing: - How to run tests with custom configurations - Using environment-specific configurations - Complete setup workflow with sed commands - Examples for different test categories and specific tests This provides practical guidance for users setting up and running validation tests. 
Fixes: OpenVisualCloud/Media-Transport-Library#1258 --- tests/validation/configs/README.md | 75 ++++++++++++++++++++++++++++++ 1 file changed, 75 insertions(+) diff --git a/tests/validation/configs/README.md b/tests/validation/configs/README.md index 24db84745..9e4eb0e7b 100644 --- a/tests/validation/configs/README.md +++ b/tests/validation/configs/README.md @@ -141,6 +141,81 @@ hosts: ## Customizing Configurations +### Usage Examples + +Here are practical examples of how to use these configuration files: + +#### Example 1: Running Tests with Custom Configurations + +```bash +# Navigate to validation directory +cd tests/validation + +# Activate virtual environment +source venv/bin/activate + +# Run all smoke tests with your configurations +sudo ./venv/bin/python3 -m pytest \ + --topology_config=configs/topology_config.yaml \ + --test_config=configs/test_config.yaml \ + -m smoke -v + +# Run a specific test category +sudo ./venv/bin/python3 -m pytest \ + --topology_config=configs/topology_config.yaml \ + --test_config=configs/test_config.yaml \ + -m st20p -v + +# Run a specific test with parameters +sudo ./venv/bin/python3 -m pytest \ + --topology_config=configs/topology_config.yaml \ + --test_config=configs/test_config.yaml \ + "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]" -v +``` + +#### Example 2: Using Environment-Specific Configurations + +```bash +# Create a local configuration for your environment +cp configs/test_config.yaml configs/test_config.local.yaml +cp configs/topology_config.yaml configs/topology_config.local.yaml + +# Edit local files with your settings +vim configs/test_config.local.yaml +vim configs/topology_config.local.yaml + +# Run tests with local configurations +sudo ./venv/bin/python3 -m pytest \ + --topology_config=configs/topology_config.local.yaml \ + --test_config=configs/test_config.local.yaml \ + -m smoke -v +``` + +#### Example 3: Complete Setup Workflow + +```bash +# 1. Update test_config.yaml +sed -i 's|MTL_PATH_PLACEHOLDER|/home/user/Media-Transport-Library|g' configs/test_config.yaml + +# 2. Update topology_config.yaml +sed -i 's|IP_ADDRESS_PLACEHOLDER|127.0.0.1|g' configs/topology_config.yaml +sed -i 's|SSH_PORT_PLACEHOLDER|22|g' configs/topology_config.yaml +sed -i 's|USERNAME_PLACEHOLDER|root|g' configs/topology_config.yaml +sed -i 's|KEY_PATH_PLACEHOLDER|/home/user/.ssh/id_rsa|g' configs/topology_config.yaml + +# 3. Verify your configuration +cat configs/test_config.yaml +cat configs/topology_config.yaml + +# 4. Run tests +cd tests/validation +source venv/bin/activate +sudo ./venv/bin/python3 -m pytest \ + --topology_config=configs/topology_config.yaml \ + --test_config=configs/test_config.yaml \ + -m smoke -v +``` + ### Environment-Specific Configuration To customize the configuration for different environments, create copies of these files with environment-specific settings: From a1d5ce84b85e1a08cb4c0b50359e738f5204293e Mon Sep 17 00:00:00 2001 From: KarolinaPomian Date: Fri, 7 Nov 2025 12:57:35 +0000 Subject: [PATCH 15/15] ci: remove ignore markdown and documentation files in smoke tests workflow --- .github/workflows/smoke-tests.yml | 3 --- 1 file changed, 3 deletions(-) diff --git a/.github/workflows/smoke-tests.yml b/.github/workflows/smoke-tests.yml index 8276249de..2f105ea57 100644 --- a/.github/workflows/smoke-tests.yml +++ b/.github/workflows/smoke-tests.yml @@ -5,9 +5,6 @@ on: branches: - main - 'maint-**' - paths-ignore: - - '**.md' - - 'doc/**' pull_request: branches: - main