A Python tool that converts sensor data from the National Research Council (NRC) bWell JSON logs into animated MP4 videos with real-time graphs. It visualizes time-series data from multiple sensors (such as VR/AR tracking devices) by creating smooth, animated line charts that show how variables change over time.
- Multi-sensor support: Visualize data from multiple sensors with different sender tags (e.g., Head, LeftHand, RightHand)
- Multiple input files: Combine data from multiple JSON files for extended time series visualization
- Flexible variable selection: Plot any nested JSON variable using dot notation (e.g., absolutePosition.x); see the sketch after this list
- Sliding window visualization: Configurable time window for smooth real-time animation
- Video output: Generates MP4 videos with customizable frame rates
- Test data generation: Built-in utility to generate sample sensor data for testing
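To give a feel for the dot-notation lookup, here is a minimal sketch of how a nested value such as absolutePosition.x can be resolved from a record. The helper name get_nested is illustrative and not necessarily what this tool uses internally.

```python
# Minimal illustration of dot-notation lookup on a nested record.
# get_nested is a hypothetical helper, not the tool's actual implementation.
def get_nested(record: dict, path: str):
    """Return the value at a dotted path (e.g. 'absolutePosition.x'), or None if missing."""
    value = record
    for key in path.split("."):
        if not isinstance(value, dict) or key not in value:
            return None
        value = value[key]
    return value

record = {"senderTag": "Head", "absolutePosition": {"x": 0.12, "y": 1.5, "z": -0.3}}
print(get_nested(record, "absolutePosition.x"))  # 0.12
```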
Requirements:

- Python 3.7+
- FFmpeg (available on the system PATH or specified via --ffmpeg-path)
Installation:

- Clone or download this repository
- Install Python dependencies:
pip install -r requirements.txt
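Before rendering, you can confirm that FFmpeg is discoverable from Python. This is an optional check using only the standard library, assuming a typical FFmpeg install:

```python
# Quick check that FFmpeg is discoverable on the system PATH.
import shutil
import subprocess

ffmpeg = shutil.which("ffmpeg")
if ffmpeg is None:
    print("FFmpeg not found on PATH; pass --ffmpeg-path when rendering.")
else:
    # Print the version banner to confirm the executable runs.
    subprocess.run([ffmpeg, "-version"], check=True)
```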
Convert sensor data to video by specifying input file(s), output file, and variables to plot:
# Single file
python render_graph_video.py -i data.json -o output.mp4 -v "Head:absolutePosition.x" "LeftHand:absolutePosition.y"
# Multiple files
python render_graph_video.py -i data1.json data2.json data3.json -o output.mp4 -v "Head:absolutePosition.x" "LeftHand:absolutePosition.y"
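When several input files are given, they are treated as one extended time series. The sketch below shows one plausible way such files could be combined, based on the documented file structure; the tool's actual merging logic may differ.

```python
# Hedged sketch: combine the "data" arrays from several bWell JSON logs
# into one list ordered by timestamp. This mirrors the documented file
# structure; the tool's internal merging may differ.
import json

def load_records(paths):
    records = []
    for path in paths:
        with open(path, "r", encoding="utf-8") as f:
            records.extend(json.load(f).get("data", []))
    # Order by timestamp so multiple files form one continuous series.
    records.sort(key=lambda r: r.get("timestamp", r.get("absoluteTime", 0.0)))
    return records

records = load_records(["data1.json", "data2.json", "data3.json"])
print(f"Loaded {len(records)} records")
```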
Full command-line syntax:

python render_graph_video.py [OPTIONS]

Required Arguments:
-i, --input PATH [PATH ...] Input JSON file path(s) - one or more files
-o, --output PATH Output MP4 video file path
-v, --variables VARS Variables to plot in 'SenderTag:variable' format
Optional Arguments:
-w, --window SECONDS Sliding window size in seconds (default: 30.0)
--fps FPS Video frame rate (default: 24)
--ffmpeg-path PATH Custom path to FFmpeg executable

Variables must be specified in the format SenderTag:variable.path:
- Head:absolutePosition.x - X position of the Head sensor
- RightHand:absolutePosition.z - Z position of the RightHand sensor
- LeftHand:absoluteRotation.w - W component of the LeftHand rotation quaternion
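A specification therefore splits into a sender tag and a dotted field path. The following minimal sketch shows that split; parse_spec is a hypothetical helper, not part of the tool's API:

```python
# Minimal sketch of splitting a "SenderTag:variable.path" specification.
def parse_spec(spec: str):
    """Split e.g. 'Head:absolutePosition.x' into ('Head', ['absolutePosition', 'x'])."""
    sender_tag, _, dotted_path = spec.partition(":")
    if not dotted_path:
        raise ValueError(f"Expected 'SenderTag:variable.path', got: {spec!r}")
    return sender_tag, dotted_path.split(".")

print(parse_spec("LeftHand:absoluteRotation.w"))  # ('LeftHand', ['absoluteRotation', 'w'])
```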
Basic 3D position tracking:
python render_graph_video.py -i sensor_data.json -o movement.mp4 -v "Head:absolutePosition.x" "Head:absolutePosition.y" "Head:absolutePosition.z"

Multi-sensor comparison:
python render_graph_video.py -i data.json -o hands_comparison.mp4 -v "LeftHand:absolutePosition.y" "RightHand:absolutePosition.y" --window 45 --fps 30

Generate sample sensor data for testing:
python generate_test_data.py -d 60 -s 20 -o test_data.json

Options:
- -d, --duration: Duration in seconds (default: 60)
- -s, --sample-rate: Sample rate in Hz (default: 20)
- -o, --output: Output JSON file name (default: test_data.json)
- --seed: Random seed for reproducible data
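If you prefer to script test data yourself rather than use generate_test_data.py, the snippet below writes records in the expected input format (see the next section). The sinusoidal motion model and the file name are purely illustrative:

```python
# Illustrative generator: writes sinusoidal head motion in the documented
# input format. This is not generate_test_data.py, just a minimal stand-in.
import json
import math

duration, sample_rate = 10, 20  # seconds, Hz
records = []
for i in range(duration * sample_rate):
    t = i / sample_rate
    records.append({
        "myType": "AbsoluteActivityRecord",
        "ID": i,
        "timestamp": t,
        "senderTag": "Head",
        "absolutePosition": {"x": 0.2 * math.sin(t), "y": 1.6, "z": 0.1 * math.cos(t)},
    })

with open("my_test_data.json", "w", encoding="utf-8") as f:
    json.dump({"data": records}, f)
```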
The input JSON file should contain sensor data with this structure:
{
"data": [
{
"myType": "AbsoluteActivityRecord",
"ID": 0,
"timestamp": 5.70708703994751,
"msSinceEpoch": 1745458113437,
"senderTag": "LeftHand",
"absolutePosition": {
"x": -0.1876288205385208,
"y": 1.1969895362854005,
"z": -0.05969514697790146
},
"absoluteRotation": {
"x": 0.037710338830947879,
"y": -0.3092581033706665,
"z": -0.32002973556518557,
"w": -0.8947169184684753
}
}
]
}

Required fields:
- myType: Must be AbsoluteActivityRecord
- timestamp or absoluteTime: Time in seconds
- senderTag: Identifier for the sensor/device
- Data fields can be nested objects accessible via dot notation
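As a reading aid, the snippet below filters records for one sender tag and extracts a dotted field into a (time, value) series, following the format above. It is not the tool's implementation:

```python
# Reading aid: extract a (time, value) series for one sender tag and one
# dotted field from a log in the documented format.
import json

def extract_series(path, sender_tag, dotted_field):
    with open(path, "r", encoding="utf-8") as f:
        data = json.load(f)["data"]
    series = []
    for rec in data:
        if rec.get("myType") != "AbsoluteActivityRecord" or rec.get("senderTag") != sender_tag:
            continue
        t = rec.get("timestamp", rec.get("absoluteTime"))
        value = rec
        for key in dotted_field.split("."):
            value = value.get(key) if isinstance(value, dict) else None
        if t is not None and value is not None:
            series.append((t, value))
    return series

print(extract_series("test_data.json", "LeftHand", "absolutePosition.y")[:5])
```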
The tool generates an MP4 video showing:
- Animated line graphs for each specified variable
- Real-time sliding window view
- Different colors for each data series
- Legend identifying each sensor and variable
- Smooth interpolation between data points
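To make the sliding-window idea concrete, here is a simplified animation sketch using matplotlib's FuncAnimation and FFMpegWriter. It renders synthetic data and only illustrates the windowing; the tool's actual rendering pipeline may differ:

```python
# Simplified sliding-window animation: each frame shows only the samples
# that fall inside the last `window` seconds. Illustrative only.
import math
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation, FFMpegWriter

window, fps, duration = 5.0, 24, 20.0
times = [i / 50 for i in range(int(duration * 50))]
values = [math.sin(t) for t in times]

fig, ax = plt.subplots()
(line,) = ax.plot([], [], label="Head:absolutePosition.x (example)")
ax.set_ylim(-1.2, 1.2)
ax.set_xlabel("time (s)")
ax.legend()

def update(frame):
    now = frame / fps
    lo = now - window
    pts = [(t, v) for t, v in zip(times, values) if lo <= t <= now]
    if pts:
        xs, ys = zip(*pts)
        line.set_data(xs, ys)
        ax.set_xlim(max(0.0, lo), max(window, now))
    return (line,)

anim = FuncAnimation(fig, update, frames=int(duration * fps), blit=False)
anim.save("sliding_window_demo.mp4", writer=FFMpegWriter(fps=fps))
```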
Troubleshooting:

FFmpeg not found: If you get FFmpeg errors, specify the path manually:
python render_graph_video.py --ffmpeg-path /path/to/ffmpeg.exe -i data.json -o video.mp4 -v "Head:absolutePosition.x"

No data found: Ensure your variable specifications match the exact senderTag and field names in your JSON data. Check the available senderTags in the console output.
Performance: For large datasets, consider reducing the time window (-w) or frame rate (--fps) to speed up rendering.
This project is not affiliated with or endorsed by the National Research Council (NRC).