## Feature Description
Create a new `kornia-sensors` crate for physical sensor processing. The first module implements IMU preintegration: compressing high-rate IMU measurements between keyframes into a single relative motion constraint for use in visual-inertial optimization.
PRs for this issue should target the `feat/slam-utils` branch.
## Feature Category
Rust Core Library (new crate)
## Motivation
`kornia-3d` handles geometric primitives (camera models, PnP, triangulation, BA). Physical sensor behavior (IMU integration, bias estimation, noise models, time synchronization) doesn't belong there. A `kornia-sensors` crate gives these a clean home.
IMU preintegration is the first and most important module. It enables:
- Visual-inertial odometry (VIO): fusing camera and IMU for robust pose estimation
- Metric scale recovery in monocular SLAM
- Pose prediction during fast motion (better than a constant-velocity model)
- Bridging tracking gaps when visual features are temporarily lost
## Proposed Solution
### Crate structure
```
kornia-sensors/
└── src/
    ├── lib.rs
    └── imu/
        ├── mod.rs            // public API
        ├── types.rs          // ImuMeasurement, ImuBias, ImuCalib
        └── preintegration.rs // PreintegratedImu
```
### Core types
```rust
/// Single IMU sample.
pub struct ImuMeasurement {
    pub timestamp_sec: f64,
    pub gyro: Vec3F64,  // angular velocity (rad/s)
    pub accel: Vec3F64, // linear acceleration (m/s²)
}

/// Accelerometer and gyroscope biases.
pub struct ImuBias {
    pub gyro: Vec3F64,
    pub accel: Vec3F64,
}

/// IMU noise and calibration parameters.
pub struct ImuCalib {
    pub gyro_noise: f64,  // gyroscope noise density
    pub accel_noise: f64, // accelerometer noise density
    pub gyro_walk: f64,   // gyroscope random walk
    pub accel_walk: f64,  // accelerometer random walk
}
```
### Preintegration
```rust
/// Preintegrated IMU measurement between two keyframes.
pub struct PreintegratedImu {
    pub delta_rotation: Mat3F64,     // accumulated rotation (SO(3))
    pub delta_velocity: Vec3F64,     // accumulated velocity change
    pub delta_position: Vec3F64,     // accumulated position change
    pub dt: f64,                     // total integration time
    pub covariance: [[f64; 15]; 15], // 15x15 covariance (rot, vel, pos, bias_g, bias_a)
    pub bias: ImuBias,               // bias used during integration
    // Bias-correction Jacobians (enable first-order correction without re-integration)
    pub j_rotation_gyro: Mat3F64,  // ∂ΔR/∂bg
    pub j_velocity_gyro: Mat3F64,  // ∂ΔV/∂bg
    pub j_velocity_accel: Mat3F64, // ∂ΔV/∂ba
    pub j_position_gyro: Mat3F64,  // ∂ΔP/∂bg
    pub j_position_accel: Mat3F64, // ∂ΔP/∂ba
}
```
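For context, these Jacobians support the standard first-order bias update from Forster et al.; schematically (with `dbg`, `dba` the changes from the bias used during integration, and `J_*` the Jacobians above):

```
ΔR' = ΔR * Exp(J_ΔR_bg * dbg)
ΔV' = ΔV + J_ΔV_bg * dbg + J_ΔV_ba * dba
ΔP' = ΔP + J_ΔP_bg * dbg + J_ΔP_ba * dba
```

This is what `correct_bias` would compute, avoiding a full re-integration whenever the optimizer updates the bias estimate.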
```rust
impl PreintegratedImu {
    /// Create a new preintegration from an initial bias.
    pub fn new(bias: ImuBias, calib: &ImuCalib) -> Self;

    /// Integrate a single IMU measurement.
    pub fn integrate(&mut self, measurement: &ImuMeasurement, dt: f64);

    /// Integrate a batch of measurements between two timestamps.
    pub fn integrate_batch(&mut self, measurements: &[ImuMeasurement]);

    /// Apply a first-order bias correction without re-integrating.
    pub fn correct_bias(&self, new_bias: &ImuBias) -> CorrectedPreintegration;

    /// Predict the pose at the end of the interval given the initial state.
    pub fn predict(
        &self,
        rotation: &Mat3F64,
        position: &Vec3F64,
        velocity: &Vec3F64,
        gravity: &Vec3F64,
    ) -> (Mat3F64, Vec3F64, Vec3F64); // (R2, p2, v2)
}
```
### Key math
For each measurement `(accel, gyro, dt)`:
- Bias-correct: `a = accel - bias_a`, `w = gyro - bias_g`
- Update position: `dP += dV*dt + 0.5*dR*a*dt²`
- Update velocity: `dV += dR*a*dt`
- Update rotation: `dR = dR * Exp(w*dt)` (SO(3) exponential map from `kornia-algebra`)
- Propagate the 15×15 covariance
- Update the bias Jacobians (for first-order correction)

Note the update order: position uses the pre-update `dV` and `dR`, velocity uses the pre-update `dR`, and rotation is updated last, so each step integrates from the state at the start of the sub-interval.
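A minimal, dependency-free sketch of this update loop (plain arrays instead of the `kornia-3d` types; covariance and bias-Jacobian propagation are omitted, and `Preintegrated`/`so3_exp` are local names, not the proposed API):

```rust
type Vec3 = [f64; 3];
type Mat3 = [[f64; 3]; 3];

const IDENTITY: Mat3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]];

fn mat_mul(a: &Mat3, b: &Mat3) -> Mat3 {
    let mut out = [[0.0; 3]; 3];
    for i in 0..3 {
        for j in 0..3 {
            for k in 0..3 {
                out[i][j] += a[i][k] * b[k][j];
            }
        }
    }
    out
}

fn mat_vec(a: &Mat3, v: &Vec3) -> Vec3 {
    let mut out = [0.0; 3];
    for i in 0..3 {
        for k in 0..3 {
            out[i] += a[i][k] * v[k];
        }
    }
    out
}

/// SO(3) exponential map via Rodrigues' formula:
/// Exp(w) = I + (sin t / t) W + ((1 - cos t) / t^2) W^2, with t = |w| and W = hat(w).
fn so3_exp(w: &Vec3) -> Mat3 {
    let theta2 = w[0] * w[0] + w[1] * w[1] + w[2] * w[2];
    if theta2 < 1e-16 {
        return IDENTITY; // small-angle limit
    }
    let theta = theta2.sqrt();
    let wx: Mat3 = [
        [0.0, -w[2], w[1]],
        [w[2], 0.0, -w[0]],
        [-w[1], w[0], 0.0],
    ];
    let wx2 = mat_mul(&wx, &wx);
    let (a, b) = (theta.sin() / theta, (1.0 - theta.cos()) / theta2);
    let mut r = IDENTITY;
    for i in 0..3 {
        for j in 0..3 {
            r[i][j] += a * wx[i][j] + b * wx2[i][j];
        }
    }
    r
}

/// Sketch of the preintegrated deltas (no covariance or bias Jacobians).
pub struct Preintegrated {
    pub dr: Mat3,
    pub dv: Vec3,
    pub dp: Vec3,
    pub dt: f64,
}

impl Preintegrated {
    pub fn new() -> Self {
        Preintegrated { dr: IDENTITY, dv: [0.0; 3], dp: [0.0; 3], dt: 0.0 }
    }

    /// One Euler step of the update rules above (position first, rotation last).
    pub fn integrate(&mut self, gyro: &Vec3, accel: &Vec3, bias_g: &Vec3, bias_a: &Vec3, dt: f64) {
        let a = [accel[0] - bias_a[0], accel[1] - bias_a[1], accel[2] - bias_a[2]];
        let w = [gyro[0] - bias_g[0], gyro[1] - bias_g[1], gyro[2] - bias_g[2]];
        let ra = mat_vec(&self.dr, &a); // acceleration rotated into the start frame
        for i in 0..3 {
            self.dp[i] += self.dv[i] * dt + 0.5 * ra[i] * dt * dt;
        }
        for i in 0..3 {
            self.dv[i] += ra[i] * dt;
        }
        self.dr = mat_mul(&self.dr, &so3_exp(&[w[0] * dt, w[1] * dt, w[2] * dt]));
        self.dt += dt;
    }
}
```

As a sanity check, 100 steps of constant acceleration `[1, 0, 0]` at `dt = 0.01` with zero bias give `dv ≈ [1, 0, 0]` and `dp ≈ [0.5, 0, 0]`, matching `v = a·t` and `p = ½·a·t²` over one second.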
Pose prediction from the keyframe state at the start of the interval:
```
R2 = R1 * dR_corrected
p2 = p1 + v1*dt + 0.5*g*dt² + R1*dP_corrected
v2 = v1 + g*dt + R1*dV_corrected
```
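The prediction formulas as a self-contained sketch (again plain arrays rather than the `kornia-3d` types; `dr`, `dp`, `dv` stand for the bias-corrected deltas and `dt` for the total integration time):

```rust
type Vec3 = [f64; 3];
type Mat3 = [[f64; 3]; 3];

fn mat_mul(a: &Mat3, b: &Mat3) -> Mat3 {
    let mut out = [[0.0; 3]; 3];
    for i in 0..3 {
        for j in 0..3 {
            for k in 0..3 {
                out[i][j] += a[i][k] * b[k][j];
            }
        }
    }
    out
}

fn mat_vec(a: &Mat3, v: &Vec3) -> Vec3 {
    let mut out = [0.0; 3];
    for i in 0..3 {
        for k in 0..3 {
            out[i] += a[i][k] * v[k];
        }
    }
    out
}

/// Predict (R2, p2, v2) from the keyframe state (r1, p1, v1), gravity g,
/// and the corrected preintegrated deltas.
pub fn predict(
    r1: &Mat3, p1: &Vec3, v1: &Vec3, g: &Vec3,
    dr: &Mat3, dp: &Vec3, dv: &Vec3, dt: f64,
) -> (Mat3, Vec3, Vec3) {
    let r2 = mat_mul(r1, dr);            // R2 = R1 * dR
    let rdp = mat_vec(r1, dp);           // R1 * dP
    let rdv = mat_vec(r1, dv);           // R1 * dV
    let mut p2 = [0.0; 3];
    let mut v2 = [0.0; 3];
    for i in 0..3 {
        p2[i] = p1[i] + v1[i] * dt + 0.5 * g[i] * dt * dt + rdp[i];
        v2[i] = v1[i] + g[i] * dt + rdv[i];
    }
    (r2, p2, v2)
}
```

With identity rotation, zero deltas, `dt = 1 s`, `g = [0, 0, -9.81]`, and `v1 = [1, 0, 0]`, this reduces to the ballistic prediction `p2 = [1, 0, -4.905]`, `v2 = [1, 0, -9.81]`.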
## Library Reference
- Forster et al., "On-Manifold Preintegration for Real-Time Visual-Inertial Odometry" (2017): the standard reference for IMU preintegration
- ORB-SLAM3 (Campos et al., 2021): visual-inertial SLAM using preintegration
## Alternatives Considered
No response
## Use Cases
- Visual-inertial odometry and SLAM
- Pose prediction during visual tracking gaps
- Metric scale estimation for monocular systems
- Any multi-sensor fusion involving IMU
## Additional Context
This crate depends on `kornia-3d` (for `Pose3d`, `Mat3F64`, `Vec3F64`) and `kornia-algebra` (for the SO(3) exponential map, right Jacobian, and matrix operations). It does not duplicate camera models; those stay in `kornia-3d`.
## Contribution Intent
- I plan to submit a PR to implement this feature
- I'm requesting this feature but not planning to implement it