
Sensor Fusion


Sensor Fusion is the process of combining data from multiple sensors to produce more accurate, reliable, and comprehensive information than any single sensor could provide alone. It is a cornerstone technology for autonomous robotics, enabling robust state estimation, localization, and perception.


Why Sensor Fusion?

Each sensor has inherent limitations:

Sensor | Limitation
Camera | Sensitive to lighting, lacks scale (monocular)
LiDAR | Expensive, struggles with glass/reflective surfaces
IMU | Drifts over time, no absolute position
Wheel encoders | Slip on rough terrain, no global reference
GPS | Poor indoors and in urban canyons, intermittent

Fusion provides redundancy (fault tolerance), complementary information (IMU bridges visual tracking gaps), and higher accuracy through combining independent measurements.
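
To make the accuracy claim concrete, here is a minimal sketch (plain NumPy, not tied to any particular package) of inverse-variance weighting, the same principle a Kalman filter applies at each update: the fused estimate trusts the less noisy measurement more, and its variance is lower than either input's.

import numpy as np

def fuse_scalar(z1, var1, z2, var2):
    """Inverse-variance weighted fusion of two independent scalar measurements.
    The fused variance 1 / (1/var1 + 1/var2) is always <= min(var1, var2)."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Example: wheel odometry reports 1.02 m (var 0.04), visual odometry 0.95 m (var 0.01)
x, var = fuse_scalar(1.02, 0.04, 0.95, 0.01)
print(f"fused = {x:.3f} m, variance = {var:.4f}")  # variance 0.008 < 0.01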

Fusion Architectures

Early Fusion: Raw Data ──► Combined ──► Processing ──► Output
Mid-Level Fusion: Raw Data ──► Features ──► Combined ──► Output
Late Fusion: Raw Data ──► Processing ──► Estimates ──► Combined ──► Output

Early fusion combines raw sensor data before any per-sensor processing; a neural network can then learn the optimal fusion directly from low-level correlations.

Example: concatenating LiDAR point clouds with camera images (sketched below).

Trade-off: preserves the maximum amount of raw information, but requires tightly synchronized and spatially aligned data.
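
As a sketch of what early fusion can look like in code, the snippet below projects LiDAR points into the image plane and attaches each point's RGB value. The intrinsics K and the LiDAR-to-camera extrinsic T_cam_lidar are placeholder values, and the point cloud and image are assumed to be already time-synchronized.

import numpy as np

# Placeholder calibration (would come from extrinsic/intrinsic calibration)
K = np.array([[600.0, 0.0, 320.0],    # camera intrinsics (fx, fy, cx, cy)
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
T_cam_lidar = np.eye(4)               # 4x4 LiDAR -> camera extrinsic transform

def colorize_points(points_lidar, image):
    """Early fusion: attach RGB from a synchronized image to each LiDAR point.
    points_lidar: (N, 3) array in the LiDAR frame; image: (H, W, 3) uint8."""
    n = points_lidar.shape[0]
    pts_h = np.hstack([points_lidar, np.ones((n, 1))])     # homogeneous coords
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]              # into camera frame
    in_front = pts_cam[:, 2] > 0.1                          # keep points ahead of camera
    uv = (K @ pts_cam[in_front].T).T
    uv = (uv[:, :2] / uv[:, 2:3]).astype(int)               # pixel coordinates
    h, w = image.shape[:2]
    valid = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    rgb = image[uv[valid, 1], uv[valid, 0]]                 # sample colors
    return np.hstack([points_lidar[in_front][valid], rgb])  # (M, 6) x, y, z, r, g, b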

Fusion Algorithms

The EKF extends the Kalman filter to nonlinear systems by linearizing around the current estimate. It is the standard choice for robotics sensor fusion; a minimal sketch follows the list below.

  • Recursive predict → update cycle
  • Weights measurements by uncertainty (covariance)
  • Used by robot_localization package
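
The skeleton below is a minimal predict/update sketch in plain NumPy, not the robot_localization implementation; f, h and the Jacobians F, H are placeholders for your own motion and measurement models.

import numpy as np

class SimpleEKF:
    """Minimal EKF skeleton: recursive predict -> update with covariance weighting."""
    def __init__(self, x0, P0):
        self.x = x0          # state estimate
        self.P = P0          # state covariance

    def predict(self, f, F, Q):
        # f: nonlinear motion model, F: its Jacobian at self.x, Q: process noise
        self.x = f(self.x)
        self.P = F @ self.P @ F.T + Q

    def update(self, z, h, H, R):
        # h: measurement model, H: its Jacobian at self.x, R: measurement noise
        y = z - h(self.x)                        # innovation
        S = H @ self.P @ H.T + R                 # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)      # Kalman gain: small R => high trust
        self.x = self.x + K @ y
        self.P = (np.eye(len(self.x)) - K @ H) @ self.P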

Sensor Fusion Architecture

Sensor Fusion with EKF

  ┌──────────────┐
  │    Wheel     │────┐
  │   Encoders   │    │
  └──────────────┘    │
                      │
  ┌──────────────┐    │     ┌──────────────┐      ┌──────────────┐
  │     IMU      │────┼────►│     EKF      │─────►│    Fused     │
  │  (200+ Hz)   │    │     │              │      │   Odometry   │
  └──────────────┘    │     └──────────────┘      └──────────────┘
                      │
  ┌──────────────┐    │
  │    Visual    │────┤
  │   Odometry   │    │
  └──────────────┘    │
                      │
  ┌──────────────┐    │
  │     GPS      │────┘
  │  (if avail.) │
  └──────────────┘

Common Sensor Combinations

Combination | Use Case
Visual-Inertial (VIO) | Drones, AR/VR (camera + IMU)
LiDAR-Inertial (LIO) | Autonomous vehicles (LiDAR + IMU)
LiDAR-Camera-IMU | State-of-the-art autonomous systems
Wheel Odometry + IMU | Ground robots; basic but effective

ROS 2: robot_localization

The robot_localization package provides EKF and UKF nodes for fusing arbitrary sensor inputs.

Configuration

config/ekf.yaml
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true                 # Set true for ground robots
    publish_tf: true

    map_frame: map
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom

    # Wheel odometry - use velocity, not position
    odom0: /wheel/odometry
    odom0_config: [false, false, false,    # x, y, z
                   false, false, false,    # roll, pitch, yaw
                   true,  true,  false,    # vx, vy, vz
                   false, false, true,     # vroll, vpitch, vyaw
                   false, false, false]    # ax, ay, az

    # IMU - use orientation, angular velocity, and linear acceleration
    imu0: /imu/data
    imu0_config: [false, false, false,
                  true,  true,  true,
                  false, false, false,
                  true,  true,  true,
                  true,  true,  true]
    imu0_differential: false
    imu0_remove_gravitational_acceleration: true

Launch File

from launch import LaunchDescription
from launch_ros.actions import Node
import os
from ament_index_python.packages import get_package_share_directory


def generate_launch_description():
    pkg_share = get_package_share_directory('my_robot')

    return LaunchDescription([
        Node(
            package='robot_localization',
            executable='ekf_node',
            name='ekf_filter_node',
            output='screen',
            parameters=[
                os.path.join(pkg_share, 'config', 'ekf.yaml'),
                {'use_sim_time': False}
            ]
        ),
    ])

Output Topics

Topic | Type | Description
/odometry/filtered | nav_msgs/Odometry | Fused odometry estimate
/tf | Transform | odom → base_link
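
A quick way to confirm the filter is publishing is ros2 topic echo /odometry/filtered; the minimal rclpy subscriber below (hypothetical node name fused_odom_check) does the same check programmatically.

import rclpy
from rclpy.node import Node
from nav_msgs.msg import Odometry

class FusedOdomCheck(Node):
    """Prints the fused pose published by the EKF node."""
    def __init__(self):
        super().__init__('fused_odom_check')
        self.create_subscription(Odometry, '/odometry/filtered', self.cb, 10)

    def cb(self, msg):
        p = msg.pose.pose.position
        self.get_logger().info(f'fused pose: x={p.x:.2f} y={p.y:.2f}')

def main():
    rclpy.init()
    rclpy.spin(FusedOdomCheck())

if __name__ == '__main__':
    main()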

Isaac ROS Visual SLAM with IMU Fusion

Isaac ROS cuVSLAM supports visual-inertial fusion for improved tracking:

# Launch with IMU fusion enabled
ros2 launch isaac_ros_visual_slam isaac_ros_visual_slam_realsense.launch.py \
    enable_imu_fusion:=true
# Verify fused output
ros2 topic echo /visual_slam/tracking/odometry

Calibration Requirements

Accurate fusion requires proper calibration:

  • Extrinsic: Rigid transformation between sensor frames (use Kalibr or NVIDIA MSA Calibration); see the launch sketch after this list
  • Temporal: Timestamp synchronization — hardware sync preferred on Jetson
  • Intrinsic: Camera distortion, IMU bias and scale factors
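
For the extrinsic part, the fused frames only line up if each sensor's calibrated transform is available on TF. One common pattern is a static_transform_publisher entry alongside the EKF node; the values and the imu_link frame below are placeholders for your own calibration output, and the flag-style arguments assume ROS 2 Humble or newer.

from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    # Publish the calibrated base_link -> imu_link extrinsic on /tf_static
    # (placeholder offsets; substitute your calibration results)
    return LaunchDescription([
        Node(
            package='tf2_ros',
            executable='static_transform_publisher',
            arguments=['--x', '0.10', '--y', '0.0', '--z', '0.05',
                       '--roll', '0.0', '--pitch', '0.0', '--yaw', '0.0',
                       '--frame-id', 'base_link', '--child-frame-id', 'imu_link'],
        ),
    ])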
