LiDAR

LiDAR (Light Detection and Ranging) uses laser pulses to measure distances, typically with centimeter-level precision. By scanning in 2D or 3D, it creates detailed point clouds of the environment, essential for autonomous navigation, mapping, and obstacle detection.

How LiDAR Works

1. Emit laser pulse ──────────► Object
2. Pulse reflects ◄──────────── Object
3. Measure time-of-flight
4. Calculate distance: d = (c × t) / 2
c = speed of light
t = round-trip time

Modern LiDARs emit hundreds of thousands to millions of pulses per second, building 3D maps in real time.
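To make the time-of-flight formula concrete, here is a minimal Python sketch; the 66.7 ns round-trip time is a made-up measurement chosen to give roughly 10 m.

# Time-of-flight distance: d = (c * t) / 2
C = 299_792_458  # speed of light, m/s

def tof_distance(round_trip_s):
    """Distance to target given the pulse's round-trip time in seconds."""
    return C * round_trip_s / 2

print(f"{tof_distance(66.7e-9):.2f} m")  # ~10.00 m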

LiDAR Types

Mechanical Spinning

A rotating mirror or assembly scans the laser across the scene.

Pros               | Cons
360° field of view | Moving parts wear out
Mature technology  | Expensive ($4,000-$75,000)
High accuracy      | Bulky form factor

Examples: Velodyne VLP-16 (now sold through Ouster), Ouster OS1

Solid-State / Hybrid

No external moving parts. Uses phased arrays, MEMS mirrors, or internal rotating mirrors.

Pros               | Cons
Compact, rugged    | Limited field of view
Lower cost         | Newer technology
No mechanical wear | May need multiple units for 360°

Examples: Livox Mid-360 (hybrid), Hesai AT128

Flash LiDAR

Illuminates entire scene at once, like a camera flash.

Pros               | Cons
No scanning needed | Shorter range
Very fast          | Lower resolution
Simple design      | Higher power consumption

Examples: Intel RealSense L515 (discontinued Feb 2022; consider D455/D435i stereo depth cameras as alternatives)

Key Specifications

Spec               | Description                    | Typical Range
Range              | Maximum detection distance     | 10-300 m
Points/sec         | Measurement rate               | 300K-2M
Channels           | Vertical resolution (spinning) | 16-128
FoV                | Field of view                  | 90°-360°
Angular resolution | Angle between points           | 0.1°-0.4°
Accuracy           | Distance error                 | ±2-5 cm
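These numbers are linked: for a spinning unit, the point rate is approximately channels × horizontal steps per revolution × rotation rate. A quick Python sanity check, using illustrative VLP-16-class figures (16 channels, 0.2° resolution, 10 Hz):

# Approximate point rate of a spinning LiDAR (illustrative figures)
channels = 16        # vertical channels
angular_res = 0.2    # horizontal angular resolution, degrees
rotation_hz = 10     # revolutions per second

points_per_sec = channels * (360 / angular_res) * rotation_hz
print(f"{points_per_sec:,.0f} points/sec")  # 288,000 points/sec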

Robotics Applications

SLAM and Mapping

LiDAR provides accurate distance measurements for building maps:

  • 2D LiDAR → 2D occupancy grids (mobile bases)
  • 3D LiDAR → 3D point cloud maps (autonomous vehicles)
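As a rough sketch of the 2D case, the following marks scan endpoints in an occupancy grid; the grid size, resolution, and the fake circular scan are all assumptions for illustration.

import numpy as np

resolution = 0.05                           # meters per cell
grid = np.zeros((200, 200), dtype=np.int8)  # 10 m x 10 m grid, robot at center
origin = np.array([100, 100])               # robot's cell

angles = np.linspace(0, 2 * np.pi, 360, endpoint=False)
ranges = np.full(360, 3.0)                  # fake scan: wall 3 m away all around

hits = origin + np.stack([ranges * np.cos(angles),
                          ranges * np.sin(angles)], axis=1) / resolution
ix, iy = hits.astype(int).T
grid[iy, ix] = 100                          # 100 = occupied (ROS convention)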

Obstacle Detection

Reliable detection regardless of lighting:

  • Works in complete darkness
  • Not affected by shadows or glare
  • Precise distance to obstacles

Localization

Match current scan to known map:

  • Iterative Closest Point (ICP)
  • Normal Distributions Transform (NDT)
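A minimal scan-to-map ICP example, sketched here with Open3D (the library choice, toy data, and 0.5 m correspondence threshold are assumptions):

import numpy as np
import open3d as o3d

# Toy data: the 'scan' is the map cloud shifted 0.3 m along x.
map_pts = np.random.rand(500, 3)
scan_pts = map_pts + np.array([0.3, 0.0, 0.0])

source = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(scan_pts))
target = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(map_pts))

# Align the current scan to the map; identity as the initial pose guess.
result = o3d.pipelines.registration.registration_icp(
    source, target, 0.5, np.eye(4),
    o3d.pipelines.registration.TransformationEstimationPointToPoint())
print(result.transformation)  # 4x4 pose; should recover the -0.3 m x offset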

LiDAR vs Cameras vs Radar

Aspect            | LiDAR     | Camera                        | Radar
Range accuracy    | Excellent | Poor (needs depth estimation) | Good
Works in dark     | Yes       | No                            | Yes
Works in fog/rain | Degraded  | Degraded                      | Good
Color/texture     | No        | Yes                           | No
Cost              | High      | Low                           | Medium
Point density     | High      | Very high (pixels)            | Low

Most autonomous systems use sensor fusion — combining all three.

ROS 2 Integration

Standard message types: sensor_msgs/LaserScan for 2D scans, sensor_msgs/PointCloud2 for 3D clouds

# View LiDAR data
ros2 topic echo /scan # 2D LaserScan
ros2 topic echo /points # 3D PointCloud2
# Visualize in RViz2
ros2 run rviz2 rviz2
# Add PointCloud2 display, set topic to /points
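A minimal rclpy subscriber sketch; the /points topic name is an assumption (match it to your driver's output), and sensor_msgs_py ships with ROS 2.

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import PointCloud2
from sensor_msgs_py import point_cloud2

class LidarListener(Node):
    def __init__(self):
        super().__init__('lidar_listener')
        # '/points' is an assumed topic name; use your driver's actual topic.
        self.create_subscription(PointCloud2, '/points', self.callback, 10)

    def callback(self, msg):
        # Count valid (non-NaN) returns in the cloud.
        n = sum(1 for _ in point_cloud2.read_points(
            msg, field_names=('x', 'y', 'z'), skip_nans=True))
        self.get_logger().info(f'{n} valid points')

def main():
    rclpy.init()
    rclpy.spin(LidarListener())

if __name__ == '__main__':
    main()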

Common Packages

Package                 | Purpose
pointcloud_to_laserscan | Convert 3D to 2D
laser_filters           | Filter noisy points
pcl_ros                 | Point Cloud Library integration
nvblox                  | GPU-accelerated 3D reconstruction

NVIDIA Isaac Integration

Isaac ROS 4.0 (current release, requires ROS 2 Jazzy) provides GPU-accelerated LiDAR processing:

# nvblox: Real-time 3D reconstruction from depth cameras and/or 3D LiDAR
# Isaac ROS packages:
# - isaac_ros_nvblox
# - isaac_ros_pointcloud_utils

Isaac Sim can simulate LiDAR sensors with realistic noise models.

Mobile Robots (Indoor)

  • RPLidar A1/A2 — $100-200, 2D, 360°
  • Livox Mid-360 — $1,000, 3D solid-state

Mobile Robots (Outdoor)

  • Ouster OS0/OS1 — $4,000-8,000, 3D, 360°
  • Velodyne VLP-16 — $4,000, 3D, 16-channel (via Ouster)

Autonomous Vehicles

  • Hesai AT128 — $1,000+, automotive-grade
  • Hesai FT120 — Solid-state, blind-spot detection

Drones

  • Livox Mid-40 — Lightweight, compact
  • Velodyne VLP-16 — If payload allows

Related Topics

  • SLAM — Uses LiDAR for mapping
  • Cameras — Complementary sensor
  • Isaac ROS — GPU-accelerated processing
