Industrial machines have long relied on a variety of sensors to carry out their intended missions and achieve the desired performance and reliability. Sensor fusion provides a more comprehensive approach to achieving these results and to making appropriate decisions in these and other applications, along with other benefits. A subcategory of data fusion, sensor fusion is also called multi-sensor data fusion or sensor-data fusion.
Sensor fusion continues to advance rapidly, along with artificial-intelligence techniques such as large language models, neural networks, and machine learning, and other enabling technologies such as high-bandwidth memory (HBM). With each advance, the inherent strengths of an individual sensor type can be better exploited and its weaknesses overcome or at least avoided. For example, in machine perception, LiDAR detects objects accurately, but it lacks the range of radar and the affordability of cameras. Cameras are easily blinded by dirt, sun, rain, snow, or darkness. Radar cannot read signs or distinguish different colors of light.
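One common way to exploit these complementary strengths is to weight each sensor's output by how trustworthy it is under the current conditions. The following is a minimal sketch of that idea; the reliability weights, condition names, and confidence values are illustrative assumptions, not calibrated figures.

```python
# Minimal sketch: confidence-weighted fusion of detections from three
# sensor modalities. All numbers below are assumptions for illustration.

# Assumed per-modality reliability under two hypothetical conditions.
RELIABILITY = {
    "clear_day": {"camera": 0.9, "radar": 0.7, "lidar": 0.9},
    "heavy_rain": {"camera": 0.3, "radar": 0.8, "lidar": 0.5},
}

def fuse_confidence(detections: dict[str, float], condition: str) -> float:
    """Weighted average of per-sensor detection confidences (0..1)."""
    weights = RELIABILITY[condition]
    num = sum(weights[s] * c for s, c in detections.items())
    den = sum(weights[s] for s in detections)
    return num / den

# Each sensor reports its own confidence that an object is present.
readings = {"camera": 0.2, "radar": 0.9, "lidar": 0.7}
print(fuse_confidence(readings, "heavy_rain"))  # radar dominates in rain
```

Here the camera's low score in rain is discounted rather than allowed to veto the more rain-tolerant radar, which is the practical payoff of weighting by condition.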
Various design approaches to sensor fusion include (a brief sketch of the fusion-timing distinction follows this list):
- Centralized vs. decentralized
- Different levels of abstraction
- Different fusion levels
- Sensor fusion paradigm (statistical, probabilistic, or knowledge-based)
- Fusion timing (late or early)
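The last item, fusion timing, is the easiest to show in code. Early fusion combines low-level data before any decision is made, while late fusion combines per-sensor decisions. The sketch below assumes hypothetical feature vectors and a stand-in classify() function rather than a real perception model.

```python
import numpy as np

# Minimal sketch of early vs. late fusion with two hypothetical
# feature vectors; classify() is a placeholder, not a real model.

def classify(features: np.ndarray) -> float:
    """Stand-in classifier: squashes the feature mean to a 0..1 score."""
    return 1.0 / (1.0 + np.exp(-features.mean()))

camera_features = np.array([0.4, 1.2, -0.3])
radar_features = np.array([0.9, 0.1])

# Early fusion: concatenate low-level features, then decide once.
early_score = classify(np.concatenate([camera_features, radar_features]))

# Late fusion: decide per sensor, then combine the decisions.
late_score = np.mean([classify(camera_features), classify(radar_features)])

print(f"early fusion: {early_score:.2f}, late fusion: {late_score:.2f}")
```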
Benefits of sensor fusion for autonomous machines
In autonomous machines, sensor fusion provides redundancy, improved accuracy, better resolution, and robustness in dynamic environments. However, adding more sensors means adding more data that must be managed appropriately in real time, which requires the right software to fuse the sensors' outputs.
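At the heart of much of that software is a simple statistical idea: combine overlapping measurements in proportion to how much each can be trusted. The sketch below shows inverse-variance weighting, the core of the Kalman filter's measurement update; the sensor readings and variances are assumed values for illustration.

```python
# Minimal sketch: inverse-variance weighting of two noisy estimates of
# the same quantity. Variances below are assumed, not measured.

def fuse_two(z1: float, var1: float, z2: float, var2: float):
    """Return the minimum-variance combined estimate and its variance."""
    w1 = var2 / (var1 + var2)          # trust sensor 1 more if sensor 2 is noisy
    fused = w1 * z1 + (1.0 - w1) * z2
    fused_var = (var1 * var2) / (var1 + var2)
    return fused, fused_var

# Hypothetical distance readings (meters): LiDAR is precise, radar less so.
lidar_z, lidar_var = 10.2, 0.05
radar_z, radar_var = 10.9, 0.50
est, var = fuse_two(lidar_z, lidar_var, radar_z, radar_var)
print(f"fused estimate: {est:.2f} m (variance {var:.3f})")
# The fused variance is smaller than either sensor's alone.
```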

With the proper design and choice of signal-processing algorithms, the noise, erroneous data, and other uncertainties of individual sensors can be minimized, and fault tolerance of sensor inputs can be achieved. Identifying a sensor failure by observing that one sensor's output is inconsistent with the outputs of the others is a further advantage of, and reason for using, sensor fusion.
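A consistency check of this kind can be as simple as comparing each reading against the group's median. The following sketch flags any sensor that deviates from the median by more than a threshold number of median absolute deviations; the sensor names, readings, and threshold are hypothetical.

```python
import statistics

# Minimal sketch: flag a sensor whose reading disagrees with the group.
# The threshold of 3 MADs is an assumption chosen for illustration.

def find_inconsistent(readings: dict[str, float], threshold: float = 3.0):
    """Return sensors deviating from the median by more than
    `threshold` times the median absolute deviation (MAD)."""
    values = list(readings.values())
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values) or 1e-9
    return [name for name, v in readings.items()
            if abs(v - med) / mad > threshold]

# Hypothetical temperature readings; sensor_c has drifted.
temps = {"sensor_a": 21.1, "sensor_b": 20.8, "sensor_c": 35.4,
         "sensor_d": 21.3, "sensor_e": 20.9}
print(find_inconsistent(temps))  # ['sensor_c']
```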
Real-world use cases
With sensor fusion, several modern use cases either become practical or improve substantially. Examples include autonomous vehicles, robots, drones, and more.
For example, LiDAR, radar, and optical sensing (cameras) are three of the most important sensor technologies for the development of autonomous vehicles, and sensor fusion combines their outputs. The ultimate goal is to recreate the human power of reliable judgment: the ability to make split-second decisions based on a combination of information from the sensors and lessons learned from previous experience.
In Waymo's all-electric Jaguar I-Pace SUV, each visible mounting location carries more than one sensor. Waymo engineers concluded that no single type of sensor could consistently provide sufficient detail in all operating conditions.
Autonomous mobile robots, stationary robots, aerial robots, and marine robots all use sensor fusion. For robotic operations, accurate three-dimensional input allows a robot to adapt dynamically to changing conditions. High-resolution mapping is essential for the safe and efficient movement of robotic systems, especially in cobot situations (collaborative robots operate in close proximity to, and even interact with, humans), to reduce the risk of collisions and optimize workflow efficiency.
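One common way such maps are built is by accumulating evidence from multiple sensors in a shared occupancy grid. The sketch below fuses hypothetical obstacle detections from a depth camera and a 2-D LiDAR; the grid size, hit coordinates, and evidence increment are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: fusing range hits from two sensors into a shared 2-D
# occupancy grid by accumulating log-odds evidence.

GRID = np.zeros((10, 10))   # log-odds of occupancy, 0 = unknown
HIT_EVIDENCE = 0.9          # log-odds added per detection (assumed)

def integrate_hits(grid: np.ndarray, hits: list[tuple[int, int]]) -> None:
    """Add occupancy evidence for each (row, col) cell a sensor hit."""
    for r, c in hits:
        grid[r, c] += HIT_EVIDENCE

# Hypothetical obstacle detections from a depth camera and a 2-D LiDAR.
integrate_hits(GRID, [(4, 5), (4, 6)])          # depth camera
integrate_hits(GRID, [(4, 5), (4, 6), (4, 7)])  # lidar

# Cells seen by both sensors accumulate more evidence -> higher confidence.
occupied = np.argwhere(GRID > 1.0)
print(occupied)  # cells (4, 5) and (4, 6), each with evidence 1.8
```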
While an individual drone can use sensor fusion, the technology is even more impressive when used to create swarms, in which numerous aerial drones function as a coordinated, intelligent collective rather than as individual units.