
bwell-log2video

A Python tool that converts sensor data from National Research Council (NRC) Canada's bWell JSON logs into animated MP4 videos with real-time graphs. It visualizes time-series data from multiple sensors (such as VR/AR tracking devices) as smooth, animated line charts that show how variables change over time.

Features

  • Multi-sensor support: Visualize data from multiple sensors with different sender tags (e.g., Head, LeftHand, RightHand)
  • Multiple input files: Combine data from multiple JSON files for extended time series visualization
  • Flexible variable selection: Plot any nested JSON variables using dot notation (e.g., absolutePosition.x)
  • Sliding window visualization: Configurable time window for smooth real-time animation
  • Video output: Generates MP4 videos with customizable frame rates
  • Test data generation: Built-in utility to generate sample sensor data for testing

Prerequisites

  • Python 3.7+
  • FFmpeg (added to system PATH or specified separately)
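To confirm that Python can see FFmpeg, a quick one-off check like this (a minimal sketch, not part of the tool) can help:

import shutil

# shutil.which returns the full path to ffmpeg if it is on PATH, else None
ffmpeg = shutil.which("ffmpeg")
print(ffmpeg or "FFmpeg not found on PATH; use --ffmpeg-path instead")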

Installation

  1. Clone or download this repository
  2. Install Python dependencies:
    pip install -r requirements.txt

Usage

Basic Usage

Convert sensor data to video by specifying input file(s), output file, and variables to plot:

# Single file
python render_graph_video.py -i data.json -o output.mp4 -v "Head:absolutePosition.x" "LeftHand:absolutePosition.y"

# Multiple files
python render_graph_video.py -i data1.json data2.json data3.json -o output.mp4 -v "Head:absolutePosition.x" "LeftHand:absolutePosition.y"

Command Line Options

python render_graph_video.py [OPTIONS]

Required Arguments:
  -i, --input PATH [PATH ...]  Input JSON file path(s) - one or more files
  -o, --output PATH            Output MP4 video file path
  -v, --variables VARS         Variables to plot in 'SenderTag:variable' format

Optional Arguments:
  -w, --window SECONDS         Sliding window size in seconds (default: 30.0)
  --fps FPS                    Video frame rate (default: 24)
  --ffmpeg-path PATH           Custom path to FFmpeg executable

Variable Format

Variables must be specified in the format SenderTag:variable.path:

  • Head:absolutePosition.x - X position of Head sensor
  • RightHand:absolutePosition.z - Z position of RightHand sensor
  • LeftHand:absoluteRotation.w - W component of LeftHand rotation quaternion
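The part after the colon is a dot-separated path into each record's nested JSON. A minimal sketch of how such a path can be resolved (the resolve_path helper is illustrative, not the tool's actual API):

from functools import reduce

def resolve_path(record, path):
    """Follow a dot-separated path (e.g. 'absolutePosition.x') into nested dicts."""
    return reduce(lambda obj, key: obj[key], path.split("."), record)

record = {"senderTag": "Head", "absolutePosition": {"x": -0.19, "y": 1.2, "z": -0.06}}
print(resolve_path(record, "absolutePosition.x"))  # -0.19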

Example Commands

Basic 3D position tracking:

python render_graph_video.py -i sensor_data.json -o movement.mp4 -v "Head:absolutePosition.x" "Head:absolutePosition.y" "Head:absolutePosition.z"

Multi-sensor comparison:

python render_graph_video.py -i data.json -o hands_comparison.mp4 -v "LeftHand:absolutePosition.y" "RightHand:absolutePosition.y" --window 45 --fps 30

Test Data Generation

Generate sample sensor data for testing:

python generate_test_data.py -d 60 -s 20 -o test_data.json

Options:

  • -d, --duration: Duration in seconds (default: 60)
  • -s, --sample-rate: Sample rate in Hz (default: 20)
  • -o, --output: Output JSON file name (default: test_data.json)
  • --seed: Random seed for reproducible data
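For reproducible test data across runs, fix the seed:

python generate_test_data.py -d 120 -s 50 --seed 42 -o test_data.json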

Input Data Format

The input JSON file should contain sensor data with this structure:

{
  "data": [
    {
      "myType": "AbsoluteActivityRecord",
      "ID": 0,
      "timestamp": 5.70708703994751,
      "msSinceEpoch": 1745458113437,
      "senderTag": "LeftHand",
      "absolutePosition": {
        "x": -0.1876288205385208,
        "y": 1.1969895362854005,
        "z": -0.05969514697790146
      },
      "absoluteRotation": {
        "x": 0.037710338830947879,
        "y": -0.3092581033706665,
        "z": -0.32002973556518557,
        "w": -0.8947169184684753
      }
    }
  ]
}

Required fields:

  • myType: Must be AbsoluteActivityRecord
  • timestamp or absoluteTime: Time in seconds
  • senderTag: Identifier for the sensor/device

Other data fields can be nested objects, accessible via dot notation.
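As a quick sanity check, records meeting these requirements can be filtered with a few lines of Python (an illustration based on the structure above, not the tool's internals):

import json

with open("data.json") as f:
    records = json.load(f)["data"]

# Keep only absolute activity records; fall back to absoluteTime if timestamp is absent
activity = [r for r in records if r.get("myType") == "AbsoluteActivityRecord"]
for r in activity:
    t = r.get("timestamp", r.get("absoluteTime"))
    print(f'{t:.3f}s  {r["senderTag"]}')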

Output

The tool generates an MP4 video showing:

  • Animated line graphs for each specified variable
  • Real-time sliding window view
  • Different colors for each data series
  • Legend identifying each sensor and variable
  • Smooth interpolation between data points
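The sliding window means each frame only draws samples from the preceding window of time. A simplified sketch of the selection logic (assuming timestamps in seconds; not the tool's actual implementation):

def window_slice(times, values, frame_time, window=30.0):
    """Return the (t, v) pairs visible at frame_time for a trailing window."""
    start = frame_time - window
    return [(t, v) for t, v in zip(times, values) if start <= t <= frame_time]

# At frame_time=45.0 with the default 30 s window, only samples from 15 s to 45 s are drawn.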

Troubleshooting

FFmpeg not found: If you get FFmpeg errors, specify the path manually:

python render_graph_video.py --ffmpeg-path /path/to/ffmpeg.exe -i data.json -o video.mp4 -v "Head:absolutePosition.x"

No data found: Ensure your variable specifications match the exact senderTag and field names in your JSON data. Check available senderTags in the console output.
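If the console output is not available, the senderTags present in a file can also be listed directly (a one-off snippet assuming the input layout shown above):

import json
from collections import Counter

with open("data.json") as f:
    counts = Counter(r["senderTag"] for r in json.load(f)["data"] if "senderTag" in r)
print(counts)  # e.g. Counter({'Head': 1200, 'LeftHand': 1200, 'RightHand': 1200})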

Performance: For large datasets, consider reducing the time window (-w) or frame rate (--fps) to speed up rendering.

Disclaimer

This project is not affiliated with or endorsed by the National Research Council (NRC).
