How does multi-sensor calibration work in autonomous vehicles?

By Jeff Shepard | June 11, 2025

Multi-sensor calibration of advanced driver assistance systems (ADAS) in autonomous vehicles ensures accurate data fusion by aligning the various sensors (such as cameras, LiDAR, and radar) into a unified coordinate system, enabling precise perception and localization.

This article reviews the types and techniques of multi-sensor calibration of ADAS.

ADAS sensor calibration uses a combination of intrinsic considerations for individual sensors and extrinsic, system-level factors. Intrinsic calibration covers general factors, such as linearity, output slope, and offset, as well as sensor-specific specifications; a camera-focused sketch of intrinsic calibration follows the list. Examples of sensor-specific factors include:

  • Camera — focal length, lens distortion, resolution, high dynamic range (HDR), speed of lens focusing, high sensitivity and low-light performance, LED flicker mitigation (to minimize the effects of pulsed LED sources such as traffic lights), and low latency.
  • LiDAR — laser beam angle, field of view (FOV), scan rate, ranging accuracy, angular resolution, and internal coordinate system.
  • Radar — antenna gain, frequency, pulse characteristics, range, FOV, resolution, speed measurement accuracy, and the ability to detect various types of objects/materials.
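
As a rough illustration of intrinsic calibration, the sketch below estimates a camera's focal lengths, principal point, and lens-distortion coefficients from checkerboard images using OpenCV's standard workflow. The board geometry and file paths are illustrative assumptions, not values from any particular ADAS toolchain.

import glob

import cv2
import numpy as np

BOARD = (9, 6)          # inner corners per checkerboard row/column (assumed)
SQUARE_SIZE = 0.025     # square edge length in meters (assumed)

# 3D corner positions of the board in its own frame (all on the z = 0 plane)
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_points, img_points = [], []
for fname in glob.glob("calib_images/*.png"):   # hypothetical image set
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# K holds the focal lengths and principal point; dist holds distortion terms
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print(f"RMS reprojection error: {rms:.3f} px")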

Extrinsic calibration examines the spatial relationship between sensors, encompassing both translation and rotation. It involves calibrating the camera, LiDAR, and radar to ensure that their coordinate systems are aligned. For example, it can validate object-level tracking and how the data is fused using a track-level fusion scheme.
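
What an extrinsic calibration encodes can be shown in a few lines: a rotation and a translation map LiDAR-frame points into the camera frame, and the camera's intrinsic matrix then projects them to pixels. All numeric values below are illustrative assumptions.

import numpy as np

R = np.eye(3)                        # LiDAR-to-camera rotation (assumed aligned here)
t = np.array([0.1, -0.3, 0.05])      # LiDAR-to-camera translation in meters (assumed)

K = np.array([[1000.0, 0.0, 640.0],  # assumed camera intrinsics
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])

lidar_points = np.array([[0.5, 0.2, 5.0],    # sample 3D returns in the LiDAR frame
                         [-1.0, 0.1, 12.0]])

cam_points = lidar_points @ R.T + t          # rigid transform into the camera frame
uv = cam_points @ K.T                        # homogeneous pixel coordinates
pixels = uv[:, :2] / uv[:, 2:3]              # perspective divide
print(pixels)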

Extrinsic calibration of ADAS can be implemented using a combination of target-based and targetless methodologies, or using targetless techniques alone.

Target-based calibration

Target-based calibration, also called controlled environment or static calibration, uses targets with specific shapes and sizes at specified distances to calibrate ADAS sensor performance in a static setting.

Static ADAS calibration requires specific lighting conditions and the absence of reflective surfaces to avoid sensor confusion. Calibration targets are used to calibrate and align the sensors (Figure 1).

Figure 1. Typical target-based ADAS calibration system. (Image: John Bean)

The highly controlled conditions of target-based ADAS calibration support high-accuracy calibrations. However, the controlled environment is also a limitation, since ADAS operates in uncontrolled environments on roadways. As a result, target-based ADAS calibration is generally used in combination with targetless calibration.
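
One generic target-based recipe is sketched below on synthetic data: recover the LiDAR-to-camera rotation and translation from matched target positions (for example, board centers detected by both sensors) using the classic Kabsch/SVD point-set alignment. This is a textbook technique, not any specific vendor's procedure.

import numpy as np

def kabsch(src, dst):
    """Return R, t minimizing the error of dst ≈ R @ src + t over rigid motions."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    return R, dst.mean(0) - R @ src.mean(0)

# Synthetic check: target centers in the LiDAR frame, and the same centers in
# the camera frame generated with a known ground-truth transform.
rng = np.random.default_rng(0)
lidar_pts = rng.uniform(-5, 5, size=(6, 3))
a = np.deg2rad(10)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.2, -0.1, 0.4])
cam_pts = lidar_pts @ R_true.T + t_true

R_est, t_est = kabsch(lidar_pts, cam_pts)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # True True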

Targetless calibration

Targetless calibration must contend with the diverse types of data provided by the three sensor modalities: cameras produce 2D images, LiDAR produces dense 3D point clouds, and radar provides sparse 4D point clouds in which the fourth dimension represents the object's speed.
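
For concreteness, a synchronized multi-sensor frame might be organized as below; the container name, fields, and sizes are illustrative assumptions that simply mirror the shapes just described.

from dataclasses import dataclass

import numpy as np

@dataclass
class SensorFrame:
    image: np.ndarray   # camera: 2D image, shape (H, W, 3)
    lidar: np.ndarray   # LiDAR: dense 3D point cloud, shape (N, 3) for x, y, z
    radar: np.ndarray   # radar: sparse 4D points, shape (M, 4) for x, y, z, speed

frame = SensorFrame(
    image=np.zeros((720, 1280, 3), dtype=np.uint8),
    lidar=np.zeros((120_000, 3), dtype=np.float32),
    radar=np.zeros((64, 4), dtype=np.float32))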

Targetless calibration can be implemented by attaching a scan tool to the vehicle's computer and then driving at specified speeds, following other vehicles, and navigating clearly marked roads. The scan tool detects objects and road markings and uses an algorithm to calibrate the sensors based on the real-world environment.
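
The sketch below captures the targetless idea on synthetic data: parameterize a small rotation correction to the current extrinsics and optimize it so that projected LiDAR features line up with matching image features. The additive rotation-vector parameterization and all numeric values are simplifying assumptions.

import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

K = np.array([[1000.0, 0.0, 640.0],   # assumed camera intrinsics
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])

def project(points, rot_vec):
    """Rotate camera-frame points by a rotation vector, then project to pixels."""
    pts = Rotation.from_rotvec(rot_vec).apply(points) @ K.T
    return pts[:, :2] / pts[:, 2:3]

rng = np.random.default_rng(1)
features_3d = rng.uniform([-4, -1, 8], [4, 1, 40], size=(50, 3))  # road features
true_error = np.array([0.01, -0.02, 0.005])      # unknown mounting error (radians)
image_feats = project(features_3d, np.zeros(3))  # where the features truly appear

def cost(correction):
    # Project through the erroneous extrinsics plus a candidate correction;
    # the cost vanishes when the correction cancels the mounting error.
    return np.sum((project(features_3d, true_error + correction) - image_feats) ** 2)

result = minimize(cost, x0=np.zeros(3), method="Nelder-Mead")
print("recovered correction (rad):", result.x)   # approximately -true_error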

A new targetless calibration approach based on self-supervised learning (SSL) and deep neural networks has been proposed. In this approach, one part of the signal is used to predict another part of the signal. It has been used for super-resolving radar arrays and for up-sampling camera frames or LiDAR measurements to improve calibration results.
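
A toy version of the predict-one-part-from-another idea appears below: half the beams of a synthetic LiDAR range image are withheld, and a small network is trained to reconstruct them (self-supervised up-sampling). Real pipelines use far larger models on real scans; every detail here, from the network to the data, is illustrative.

import torch
import torch.nn as nn

torch.manual_seed(0)
# Fake "range images": cumulative sums give smooth, row-correlated scans
range_img = torch.cumsum(torch.rand(64, 1, 64, 128) * 0.1, dim=2)

kept = range_img[:, :, 0::2, :]      # even-numbered beams: the model input
dropped = range_img[:, :, 1::2, :]   # odd-numbered beams: the SSL target

net = nn.Sequential(                 # tiny convolutional predictor (assumed)
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 3, padding=1))

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(200):
    loss = nn.functional.l1_loss(net(kept), dropped)  # predict missing beams
    opt.zero_grad()
    loss.backward()
    opt.step()
print(f"final reconstruction L1: {loss.item():.4f}")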

Figure 2a (top) shows camera-LiDAR calibration based on SSL. The projected LiDAR point cloud (colored by range) and the camera image are clearly misaligned in the left-hand image; however, they can be aligned using SSL, as shown in the right-hand image.

Figure 2. Examples of using SSL for camera-LiDAR calibration (top) and camera-radar calibration (bottom). (Image: Scientific Reports)

Figure 2b (bottom) illustrates the camera-radar calibration. Before calibration, the radar detections, represented by the cyan ‘+’ markers, are misaligned with the moving vehicles in the left-hand image. SSL can be used to calibrate the camera-radar pair and align their outputs, as shown in the image on the right.

Summary

ADAS are complex systems with multiple sensor modalities that require different types of intrinsic and extrinsic calibration. Overall ADAS operation also requires multimodal calibration, using a combination of target-based and targetless methods. Recently, SSL techniques have been applied to targetless ADAS sensor calibration to deliver improved results.

References

A Multi-sensor Calibration Toolbox for Autonomous Driving, arXiv
An Auto-Calibrating System for Sensors in Autonomous Vehicles, KPIT Technologies
Enhancing lane detection with a lightweight collaborative late fusion model, ScienceDirect
How to Calibrate Sensors with MSA Calibration Anywhere for NVIDIA Isaac Perceptor, NVIDIA Developer
Joint Calibration of a Multimodal Sensor System for Autonomous Vehicles, MDPI Sensors
Multilevel Data and Decision Fusion Using Heterogeneous Sensory Data for Autonomous Vehicles, MDPI Remote Sensing
Physics and semantic informed multi-sensor calibration via optimization theory and self-supervised learning, Scientific Reports
Probability-Based LIDAR–Camera Calibration Considering Target Positions and Parameter Evaluation Using a Data Fusion Map, MDPI Sensors
The Complete Guide to ADAS Calibration, John Bean
