Design World


How to implement multi-sensor fusion algorithms for autonomous vehicles

By Randy Frank | October 8, 2025

Autonomous vehicle (AV) operation relies on several sensing techniques, including radar, LiDAR, and cameras, as well as infrared (IR) and/or ultrasonic sensors, among others. No single sensing technique is adequate by itself, and each sensor has its own strengths and weaknesses. As shown in Table 1, one sensing technique can overcome the weaknesses of another, and their combined capabilities also provide redundancy for safe autonomous vehicle operation. Beyond the sensors themselves, combining or fusing the data from multiple sensors into a more accurate and reliable estimate requires coding (software algorithms), advanced filtering, and performance simulation.

Table 1. Strengths and weaknesses of LiDAR, radar, and camera sensors. (Image: Mindkosh)

Code examples/Tech stacks for integrating sensor data

Different coding/software approaches have been used to fuse or integrate the data from the key sensors. Similar to other widely pursued applications, tech stacks that combine programming languages, frameworks, databases, and tools are used to expedite the development of sensor fusion software for AV applications. In addition to C/C++, two popular approaches for developing sensor fusion software include Autoware and the Robot Operating System (ROS).

Since releasing its first modular open-source software stack for autonomous driving in 2015, the Autoware Foundation has periodically offered improved versions; the most recent architecture is called Autoware Core/Universe.

ROS is not an operating system but a software development kit (SDK) – a set of software libraries and tools that help engineers build robot applications. This open-source robotics framework provides developers with the middleware required for communicating between hardware components and software algorithms.
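A core job of such middleware is pairing messages from different sensors that arrive at slightly different times. The sketch below mimics ROS-style approximate time synchronization in plain Python, with no ROS dependency; the class, function, and threshold names are illustrative assumptions, not part of any ROS API.

```python
from dataclasses import dataclass

@dataclass
class StampedMsg:
    stamp: float   # timestamp in seconds
    data: object   # sensor payload

def approx_time_sync(queue_a, queue_b, slop=0.05):
    """Pair messages from two sensor queues whose timestamps
    differ by at most `slop` seconds (illustrative helper)."""
    pairs = []
    for a in queue_a:
        # find the closest-in-time message from the other sensor
        best = min(queue_b, key=lambda b: abs(b.stamp - a.stamp), default=None)
        if best is not None and abs(best.stamp - a.stamp) <= slop:
            pairs.append((a, best))
    return pairs

# Hypothetical radar and LiDAR message streams
radar = [StampedMsg(0.00, "r0"), StampedMsg(0.10, "r1")]
lidar = [StampedMsg(0.02, "l0"), StampedMsg(0.21, "l1")]
print(approx_time_sync(radar, lidar))  # only (r0, l0) fall within the 50 ms slop
```

In ROS 2 itself, the `message_filters` package provides this behavior through its ApproximateTimeSynchronizer.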

In addition to these design approaches, Python, a general-purpose language supporting object-oriented, procedural (imperative), functional, structured, and reflective programming styles, is frequently used for sensor fusion of inertial measurement unit (IMU) outputs (accelerometer, gyroscope, and magnetic sensor signals).
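As a minimal illustration of Python-based IMU fusion, the sketch below applies a complementary filter that blends the gyroscope's integrated rate (smooth but drift-prone) with an accelerometer-derived tilt angle (noisy but drift-free). The sample stream and the 0.98 blend coefficient are invented for illustration.

```python
import math

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyro rate with the accelerometer angle."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

def accel_to_angle(ax, az):
    """Tilt angle (rad) from accelerometer x/z components."""
    return math.atan2(ax, az)

# Made-up sample stream: gyro in rad/s, accel in g, 10 ms time steps
samples = [(0.10, 0.05, 0.99), (0.12, 0.06, 0.99), (0.11, 0.07, 0.99)]
angle = 0.0
for gyro, ax, az in samples:
    angle = complementary_filter(angle, gyro, accel_to_angle(ax, az), dt=0.01)
print(f"fused tilt estimate: {angle:.4f} rad")
```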

Filtering techniques for combining sensor data

Effectively using the data from different sensors, especially when the output from one sensor conflicts with the output from another, requires specialized filtering of the data. Kalman filters, which assume Gaussian noise, provide a common starting point for linear systems. For non-linear systems, the extended Kalman filter (EKF) linearizes the nonlinear process and measurement functions, making it applicable for fusing data from LiDAR, radar, and camera sensors. A further variation, the unscented Kalman filter (UKF), improves on the EKF by using a deterministic sampling approach that avoids the estimation errors the EKF's linearization can introduce. Figure 1 shows how it works for LiDAR and radar data measurements.
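The predict-update cycle shown in Figure 1 can be illustrated with a one-dimensional linear Kalman filter that fuses range readings from two sensors. The noise variances and measurements below are made-up assumptions, and a real AV tracker would use multi-dimensional state vectors and covariance matrices.

```python
def kalman_update(x, p, z, r):
    """One measurement update: state x, variance p,
    measurement z with noise variance r."""
    k = p / (p + r)          # Kalman gain
    x = x + k * (z - x)      # blend prediction with measurement
    p = (1 - k) * p          # reduced uncertainty
    return x, p

def predict(x, p, q=0.1):
    """Process step: position unchanged, uncertainty grows by q."""
    return x, p + q

x, p = 0.0, 100.0            # vague initial estimate
readings = [("lidar", 10.2, 0.5), ("radar", 9.8, 2.0),
            ("lidar", 10.1, 0.5)]
for sensor, z, r in readings:
    x, p = predict(x, p)
    x, p = kalman_update(x, p, z, r)
print(f"fused range: {x:.2f} m, variance {p:.3f}")
```

Note how the less-noisy LiDAR readings (r = 0.5) pull the estimate harder than the noisier radar reading, which is exactly the weighting behavior the Kalman gain provides.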

Figure 1. Flow chart of the predict-update cycle for a UKF implementation. (Image: Udacity SFND Course)

In contrast to Kalman filtering, particle filters represent the estimated state with a set of weighted random samples, called "particles." This Bayesian filtering technique propagates the particles forward using a system dynamics model, then updates the weight of each particle based on how well it aligns with the readings from all sensors.
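A bare-bones particle filter for a one-dimensional state can be sketched in a few lines: particles are propagated through a motion model, reweighted by their agreement with a sensor reading, and resampled. All noise parameters and measurements below are invented for illustration.

```python
import math
import random

random.seed(0)

def particle_filter_step(particles, control, measurement,
                         motion_noise=0.2, meas_noise=0.5):
    """One predict-weight-resample cycle for a 1-D state."""
    # Predict: move each particle by the control input plus noise
    particles = [p + control + random.gauss(0, motion_noise)
                 for p in particles]
    # Weight: higher weight when a particle agrees with the measurement
    weights = [math.exp(-((p - measurement) ** 2) / (2 * meas_noise ** 2))
               for p in particles]
    # Resample: draw particles in proportion to their weights
    return random.choices(particles, weights=weights, k=len(particles))

particles = [random.uniform(0, 20) for _ in range(500)]
for control, z in [(1.0, 5.0), (1.0, 6.1), (1.0, 7.0)]:
    particles = particle_filter_step(particles, control, z)
estimate = sum(particles) / len(particles)
print(f"estimated position: {estimate:.2f}")
```

Unlike a Kalman filter, this approach needs no Gaussian assumption about the state distribution, which is why particle filters handle multi-modal situations (e.g., conflicting sensor hypotheses) more gracefully.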

Simulating sensor fusion performance

Using simulation tools such as MATLAB and Simulink, engineers can evaluate different sensor fusion approaches and verify their system design before deployment.

Figure 2. Different options (including simultaneous localization and mapping (SLAM)) bring sensor data to perception algorithms. (Image: MathWorks)

To ease into the evaluation process, several simulation tools suggest starting with just two sensor inputs (see Figure 1). For example, a MathWorks multi-object tracker fuses information from radar and video camera sensors, using Kalman filters to estimate the state of motion of each detected object. Another tracker generates object-level track lists from radar and LiDAR measurements and combines them using a track-level fusion scheme.
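Track-level fusion of the kind described above can be sketched, in its simplest one-dimensional form, as an inverse-variance weighted average of two independent track estimates; the radar and LiDAR numbers below are hypothetical, and production trackers fuse full state vectors with covariance matrices instead.

```python
def fuse_tracks(x1, p1, x2, p2):
    """Fuse two independent 1-D track estimates by
    inverse-variance weighting."""
    w1, w2 = 1 / p1, 1 / p2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)  # weighted mean
    p = 1 / (w1 + w2)                    # fused variance shrinks
    return x, p

# Hypothetical radar and LiDAR tracks of the same object's range
x, p = fuse_tracks(10.3, 2.0, 10.0, 0.5)
print(f"fused track: {x:.2f} m, variance {p:.2f}")
```

The fused variance is always smaller than either input's, which is the quantitative payoff of combining independent sensors.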

Baidu, Inc., a Chinese multinational technology company, offers an open platform called Apollo that provides developers with simulations, an autonomous perception system, and high-definition (HD) mapping data to minimize costs and improve the precision of their designs.

For all the design activities required for sensor fusion in AV applications, engineers have choices, including those mentioned above, as well as options from other providers.

References

Sensor fusion for multi-sensor lidar data
The 6-Step Roadmap to Learn Sensor Fusion
ROS – Robot Operating System
How is Autoware Core/Universe different from Autoware.AI and Autoware.Auto?
LiDAR and Radar Sensor Fusion using Unscented Kalman Filter
Sensor Fusion With Kalman Filter
Sensor Fusion and Navigation for Autonomous Systems Using MATLAB & Simulink
What Is Sensor Fusion and Tracking Toolbox?
Track-Level Fusion of Radar and Lidar Data
Apollo

Related EE World content

What is sensor fusion?
Sensor fusion: What is it?
Sensor fusion levels and architectures
How does fusion timing impact sensors?
Sensors in the driving seat
What sensors make the latest Waymo Driver smarter?
What is the role of sensor fusion in robotics?

Filed Under: Automotive, Sensor Tips

 

Copyright © 2025 WTWH Media LLC. All Rights Reserved. The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of WTWH Media
