Project Page | Paper | Dataset (GIP-DB)
Ying Xue, Jiaxi Jiang, Rayan Armani, Dominik Hollidt, Yi-Chi Liao, Christian Holz
Sensing, Interaction & Perception Lab, ETH Zürich
Group Inertial Poser (GIP) estimates 3D full-body poses and global translation for multiple humans using inertial measurements from a sparse set of wearable sensors, augmented by the distances between the sensors via ultra-wideband (UWB) ranging.
By leveraging inter-sensor distances across multiple people, GIP overcomes the inherent translation drift of purely inertial systems. Our method preserves meaningful interaction dynamics and stabilizes global trajectories through a novel structured state-space model (SSM) and a two-step optimization pipeline (see the sketch after the feature list below).
- Drift-Free Tracking: Uses UWB ranging to anchor inertial measurements in a global context.
- Multi-Person Coordination: Leverages cross-body sensor distances to refine relative positioning.
- State-Space Models: Utilizes SSMs to integrate temporal motion patterns for precise pose estimation.
- GIP-DB Dataset: The first IMU+UWB dataset for two-person tracking (200 minutes, 14 participants).
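To make the distance-based refinement concrete, here is a minimal PyTorch sketch of the idea, not the repo's actual two-step optimization: it reduces the UWB measurements to a single root-to-root range per frame and corrects per-person translation offsets by gradient descent. All function and variable names here are hypothetical.

```python
import torch

def refine_translations(trans_a, trans_b, uwb_dist, iters=100, lr=0.01):
    """Refine two root trajectories so their mutual distance matches UWB ranging.

    trans_a, trans_b: (T, 3) predicted root translations for persons A and B.
    uwb_dist:         (T,) measured inter-person distance (simplified here to a
                      single root-to-root range; the paper uses distances
                      between the individual body-worn sensors).
    """
    # Optimize additive offsets rather than the trajectories themselves, so the
    # inertial motion pattern is preserved and only the drift is corrected.
    offset_a = torch.zeros_like(trans_a, requires_grad=True)
    offset_b = torch.zeros_like(trans_b, requires_grad=True)
    opt = torch.optim.Adam([offset_a, offset_b], lr=lr)
    for _ in range(iters):
        opt.zero_grad()
        d = torch.norm((trans_a + offset_a) - (trans_b + offset_b), dim=-1)
        range_loss = ((d - uwb_dist) ** 2).mean()         # match UWB ranges
        smooth_loss = (offset_a.diff(dim=0) ** 2).mean() + \
                      (offset_b.diff(dim=0) ** 2).mean()  # keep offsets smooth
        (range_loss + 0.1 * smooth_loss).backward()
        opt.step()
    return (trans_a + offset_a).detach(), (trans_b + offset_b).detach()
```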
Set up the conda environment:

```bash
conda create --name GIP python=3.9 -y
conda activate GIP

# Install core dependencies
python -m pip install torch torchvision torchaudio chumpy vctoolkit open3d pybullet "qpsolvers[quadprog]" cvxopt prettytable tensorboard cython wandb cmake pytorch_lightning pykeops einops numpy==1.23.5
```
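As an optional sanity check (not part of the repo), you can confirm that the core packages import and that the NumPy pin took effect:

```python
# Import names differ from some pip package names (e.g. open3d, pybullet).
import torch, chumpy, open3d, pybullet, qpsolvers, pytorch_lightning, einops
import numpy as np

assert np.__version__.startswith("1.23"), "the repo pins numpy==1.23.5"
print("torch", torch.__version__, "| CUDA available:", torch.cuda.is_available())
```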
Then set up the following dependencies and assets:

- RBDL: Install the modified Rigid Body Dynamics Library `rbdl` from RBDL-PIP.
- SMPL Models: Download the SMPL model, version 1.0.0 for Python 2.7 (female/male, 10 shape PCs), unzip it to obtain `basicmodel_m_lbs_10_207_0_v1.0.0.pkl`, and place that file in the `./data` folder.
- S4 Model: Download the S4 repository here and move its `models/` folder into your project's root directory (`./`).
- Interhuman Dataset: Download the preprocessed Interhuman dataset from here and place it into the folder `data/processed_data/`. Please note that by downloading the preprocessed datasets you agree to the same license conditions as for the Interhuman dataset (https://tr3e.github.io/intergen-page/): you may only use the data for scientific purposes and must cite the corresponding papers.
- Pre-trained Weights: Download the GIP weights and place them in your checkpoint directory.
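Once everything is in place, a small (hypothetical) script can verify the expected layout; the paths are the ones referenced in the steps above:

```python
from pathlib import Path

# Expected locations after the setup steps above; adjust the checkpoint
# path to wherever you placed the pre-trained GIP weights.
required = [
    Path("data/basicmodel_m_lbs_10_207_0_v1.0.0.pkl"),  # SMPL model
    Path("models"),                                     # S4 model code
    Path("data/processed_data"),                        # preprocessed Interhuman data
]
for p in required:
    print(("ok      " if p.exists() else "MISSING ") + str(p))
```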
To run the evaluation on the Interhuman dataset:
```bash
python modules/evaluate/evaluator_interhuman.py --network SSM \
    --ckpt_path /path/to/model.pt \
    --data_dir data/processed_data/interhuman_test/test \
    --eval_trans \
    --normalize_uwb \
    --add_guassian_noise \
    --model_args_file config/model_args.json \
    --eval_save_dir Eval_Interhuman --exp_name ssm_eval --device cuda:0
```
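The `--normalize_uwb` and `--add_guassian_noise` flags condition the UWB ranges before they enter the network. As a rough sketch of what such preprocessing can look like (the function and constants are illustrative assumptions, not the repo's implementation):

```python
import torch

def preprocess_uwb(dist, max_range=10.0, noise_std=0.1, add_noise=True):
    """Illustrative UWB preprocessing: perturb and normalize raw ranges.

    dist: (T, K) tensor of raw inter-sensor distances in meters.
    max_range and noise_std are assumed values, not the repo's settings.
    """
    if add_noise:
        # Zero-mean Gaussian noise to simulate UWB ranging error,
        # mirroring the --add_guassian_noise flag.
        dist = dist + noise_std * torch.randn_like(dist)
    # Scale into [0, 1], mirroring the --normalize_uwb flag.
    return (dist / max_range).clamp(0.0, 1.0)
```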
We follow a phased training approach. Ensure AMASS is downloaded and preprocessed, and that paths are configured in `config/config.py`.

```bash
python Train_model.py --pretrain_model '' \
    --config_file "config/train_config.ini" \
    --log_dir "output/ssm_model" \
    --network SSM \
    --training_phase baseline_gnn_jp_mapper baseline_rnn_jp_mapper baseline_rnn3 baseline_rnn4 baseline_rnn5 \
    --eval_dataset "interhuman" \
    --device cuda:0
```
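The `--training_phase` argument lists the stages trained in sequence. Conceptually, phased training warm-starts each stage from the previous one, as in the sketch below (the trainer factory and checkpointing scheme are assumptions, not `Train_model.py`'s actual logic):

```python
# Phase names exactly as passed via --training_phase above.
PHASES = [
    "baseline_gnn_jp_mapper",
    "baseline_rnn_jp_mapper",
    "baseline_rnn3",
    "baseline_rnn4",
    "baseline_rnn5",
]

def train_phased(build_trainer, ckpt=None):
    """Train each phase in order, warm-starting from the previous checkpoint.

    build_trainer is a hypothetical factory returning an object with
    .fit() and .save(); it stands in for the repo's per-phase setup.
    """
    for phase in PHASES:
        trainer = build_trainer(phase, pretrain_ckpt=ckpt)
        trainer.fit()  # optimize only this phase's submodule
        ckpt = trainer.save(f"output/ssm_model/{phase}.pt")
    return ckpt
```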
Please stay tuned! The GIP-DB dataset will be released soon, featuring 200 minutes of synchronized motion recordings from 14 participants.

If you find our paper or code useful, please cite our work:
```bibtex
@inproceedings{xue2025groupinertialposer,
  author    = {Xue, Ying and Jiang, Jiaxi and Armani, Rayan and Hollidt, Dominik and Liao, Yi-Chi and Holz, Christian},
  title     = {{Group Inertial Poser}: Multi-Person Pose and Global Translation from Sparse Inertial Sensors and Ultra-Wideband Ranging},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  pages     = {24910--24921},
  year      = {2025},
  publisher = {IEEE},
  address   = {Honolulu, HI, USA},
  doi       = {10.48550/arXiv.2510.21654},
  url       = {https://arxiv.org/abs/2510.21654},
  keywords  = {Human pose estimation, IMU, UWB, multi-person tracking, global translation},
  month     = oct
}
```
This project is released under the MIT license. Our code is partially based on PIP and UIP.
