From traditional industrial robotic systems to today’s latest collaborative robots (cobots), robots rely on sensors that generate increasingly massive volumes of highly varied data. This data can help build better machine learning (ML) and artificial intelligence (AI) models that robots rely on to become “autonomous,” making real-time decisions and navigating in dynamic real-world environments.
Industrial robots are typically placed in “caged” environments; a human entering that environment stops robot movement for safety reasons. But limiting human-robot collaboration prevents the realization of many benefits. Robots with autonomous capabilities would enable the safe and productive co-existence of humans and robots.
Sensing and intelligent perception are important in robotic applications because they affect the performance of robotic systems―particularly ML/AI systems―which depends greatly on the performance of the sensors that provide critical data to these systems. Today’s wide range of increasingly sophisticated and accurate sensors, combined with systems that can fuse sensor data together, is giving robots increasingly good perception and awareness.
The Growth of AI
Robotic automation has been a revolutionary technology in the manufacturing sector for some time, yet the integration of AI into robotics is poised to transform the industry over the next few years. Let’s take a look at some of today’s key trends in robotics and automation, and the most important technologies that will tie AI to the data it needs to be intelligent. I’ll also address how data from different sensors is being used (and fused) in AI systems.
Pushing AI Processing for Robotics to the Edge
There are two main parts of ML: training and inference, which can be executed on completely different processing platforms. Training usually occurs offline on desktops or in the cloud and entails feeding large data sets into a neural network; neither real-time performance nor power consumption is a concern during this phase. The result of the training phase is a trained AI model that, when deployed, can perform a specific task, such as inspecting a bottle on an assembly line, counting and tracking people within a room, or determining whether a bill is counterfeit.
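As a rough illustration of the training side, the sketch below fits a small classifier offline and then exports it so an embedded inference engine can run it later. The network architecture, the synthetic data, and the file name are illustrative assumptions, not part of any specific product or workflow.

```python
import torch
import torch.nn as nn

# Rough sketch of the offline training phase: fit a small classifier on a
# desktop or cloud machine, then export it for deployment at the edge.
# Network, data, and file name are illustrative stand-ins.

model = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

features = torch.randn(1024, 64)       # stand-in for a large labeled data set
labels = torch.randint(0, 2, (1024,))  # e.g., "pass"/"fail" for an inspected bottle

for _ in range(10):                    # training is offline, not latency-critical
    optimizer.zero_grad()
    loss = loss_fn(model(features), labels)
    loss.backward()
    optimizer.step()

# Export the trained network for deployment on the edge device.
torch.onnx.export(model, torch.randn(1, 64), "inspection_model.onnx")
```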
But in order for AI to fulfill its promise in many industries, the fusion of sensor data that happens during inference (the part that executes the trained ML algorithm) must happen in (near) real time. Thus, designers need to put ML and deep-learning models on the edge, deploying the inference into an embedded system.
For example, the cobot shown in Figure 1 is built to work in close collaboration with humans. It relies on data from proximity sensors, as well as vision sensors, to protect nearby humans from harm while supporting them in activities that would otherwise be challenging. All of this data needs to be processed in real time, yet the cloud is not fast enough for the real-time, low-latency response that the cobot needs. To address this bottleneck, today’s advanced AI systems are being pushed to the edge, which in the case of robots means onboard.
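A minimal sketch of what such an onboard inference loop can look like is shown below. The camera, proximity sensor, model, and actuator objects are hypothetical placeholders rather than a specific vendor API, and the 10 ms latency budget is only an example.

```python
import time

# Minimal sketch of an on-board (edge) inference loop. The camera,
# proximity_sensor, model, and actuator objects are hypothetical placeholders,
# not a specific vendor API; the 10 ms budget is only an example.

LATENCY_BUDGET_S = 0.010  # example low-latency response budget

def control_loop(camera, proximity_sensor, model, actuator):
    while True:
        t0 = time.monotonic()

        frame = camera.read_frame()           # vision data
        distance_m = proximity_sensor.read()  # proximity data

        # Run the trained model locally; no round trip to the cloud.
        action = model.predict(frame, distance_m)
        actuator.apply(action)

        latency = time.monotonic() - t0
        if latency > LATENCY_BUDGET_S:
            # Log overruns; a real system might degrade gracefully or stop.
            print(f"Latency budget exceeded: {latency * 1000:.1f} ms")
```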
This decentralized AI model relies on highly integrated processors that have:
- A rich peripheral set for interfacing to various sensors.
- High-performance processing capability to run machine-vision algorithms.
- A way to accelerate deep learning inference.
All of these capabilities also have to work efficiently and with a relatively low-power and small-size footprint in order to exist at the edge.
Power- and size-optimized “inference engines” are increasingly available as ML grows in popularity. These engines are specialized hardware offerings aimed specifically at performing ML inference.
An integrated system on chip (SoC) is often a good choice in the embedded space, because in addition to housing various processing elements capable of running deep learning inference, an SoC also integrates many components necessary to cover the entire embedded application.
Let’s look at some of the top robotic trends today.
Cobots
Humans generally cannot go near traditional industrial robots while they are operating without putting themselves in peril. Cobots, in contrast, are designed to operate safely alongside humans, moving slowly and gracefully.
As defined in ISO/TS 15066, a collaborative robot is a robot intended for use in a collaborative operation, in which the robot and human operators work concurrently within a defined workspace for production tasks (this excludes robot-plus-robot systems, or humans and robots that share a space but operate at different times). Deploying cobots therefore means foreseeing potential collisions between operators and the physical parts of the robot (or virtual extensions such as a laser), which makes sensors that determine the exact position and velocity of the operator all the more important.
Cobot makers must implement a high level of environmental sensing and redundancy into robotic systems to quickly detect and prevent possible collisions. Integrated sensors connected to a control unit will sense an impending collision between a robot arm and a human or other object, and the control unit will turn the robot off immediately. If any sensor or its electronic circuit fails, the robot also turns off.
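The sketch below illustrates, in highly simplified form, the kind of speed-and-separation logic this implies: the cobot slows or stops based on the closest detected distance, and a failed sensor is treated the same as a detected obstacle. The distance thresholds, sensor objects, and controller interface are all assumptions for illustration, not values or APIs taken from ISO/TS 15066.

```python
# Highly simplified speed-and-separation logic in the spirit of ISO/TS 15066.
# Thresholds, sensor objects, and the controller interface are illustrative
# assumptions, not values or APIs taken from the standard.

STOP_DISTANCE_M = 0.5   # assumed protective-stop distance
SLOW_DISTANCE_M = 1.0   # assumed reduced-speed zone

def safety_check(proximity_sensors, controller):
    readings = [s.distance_m() for s in proximity_sensors]

    # Redundancy: a missing or failed reading is treated like a detected
    # obstacle, so the robot stops rather than trusting incomplete data.
    if not readings or any(r is None for r in readings) \
            or min(readings) < STOP_DISTANCE_M:
        controller.protective_stop()
    elif min(readings) < SLOW_DISTANCE_M:
        controller.set_speed_fraction(0.25)  # slow down near the operator
    else:
        controller.set_speed_fraction(1.0)
```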
Logistics Robots
Logistics robots (Figure 2) are mobile units that operate in environments where people may or may not be present, such as warehouses, distribution centers, ports, or campuses. Logistics robots fetch goods and bring them to a packing station, or transport goods from one building of a company site to another; some are capable of picking and packing goods as well. These robots typically move within a particular environment and need sensors for localization, mapping, and to prevent collisions (especially with humans).
Until recently, most logistics robots used pre-defined routes; they are now capable of adjusting their navigation based on the location of other robots, humans, and packages. Ultrasonic, infrared, and lidar sensing are all enabling technologies. Because of the robot’s mobility, the control unit is located onboard, often with wireless communication to a central remote controller. Logistics robots are now adopting advanced technologies such as machine learning, human-machine collaboration, and environmental analysis.
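As a rough sketch of how these sensing modalities can be combined for collision avoidance, the example below takes the most conservative (closest) obstacle distance across ultrasonic and lidar readings before letting a planner commit to a path. The sensor, robot, and planner objects and the 0.3 m margin are hypothetical, not tied to any particular navigation stack.

```python
# Rough sketch of fusing ultrasonic and lidar readings for collision avoidance:
# take the most conservative (closest) obstacle distance before the planner
# commits to a path. Sensor, robot, and planner objects are hypothetical, and
# the 0.3 m margin is only an example.

def nearest_obstacle_m(ultrasonic_sensors, lidar):
    ultrasonic = [s.range_m() for s in ultrasonic_sensors]  # a few wide beams
    lidar_scan = lidar.scan_ranges_m()                       # dense angular sweep
    return min(min(ultrasonic, default=float("inf")),
               min(lidar_scan, default=float("inf")))

def navigation_step(robot, planner, ultrasonic_sensors, lidar):
    if nearest_obstacle_m(ultrasonic_sensors, lidar) < 0.3:
        robot.stop()
        planner.replan()  # route around the obstacle
    else:
        robot.follow(planner.current_path())
```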
Rising labor costs and stringent government regulations are contributing to the higher adoption of logistics robots. Their popularity is also rising because of drops in the cost of equipment and components like sensors, as well as in the cost of (and time required for) integration.
Last-Mile Delivery Robots
In a product’s journey from warehouse shelf to customer doorstep, the “last mile” of delivery is the final step of the process: the point at which the package finally arrives at the buyer’s door. In addition to being key to customer satisfaction, last-mile delivery is expensive and time-consuming.
Last-mile delivery costs are a substantial percentage of the total shipping cost. As such, making the last mile of delivery more efficient has become a focal point for developing and implementing new robotics technologies that can drive process improvements and increase efficiency.
Sensor Technology for AI in Robotics
As robotic technologies advance, so do complementary sensor technologies. Much as a human relies on five senses working together, combining different sensing technologies offers the best results when deploying robotic systems into changing and uncontrolled environments. Even the simplest tasks that a robot performs will depend on 3D machine vision to feed data into AI technology. Grasping an object, for example, without pre-determined locations and motions would be impossible without machine vision capable of reconstructing a 3D image, and AI to translate this visual information into a successful action on the part of the robot.
Today’s most prevalent and relevant sensor technologies for supporting AI in robotics include:
- Time-of-flight (ToF) optical sensors: These sensors rely on the ToF principle and use a photodiode (a single sensor element or an array) along with active illumination to measure distance. The light reflected from obstacles is compared with the transmitted wave to measure the delay, which in turn represents distance; this data then helps create a 3D map of the object (a back-of-the-envelope distance calculation appears after this list).
- Temperature and humidity sensors: Many robots need to measure the temperature and sometimes the humidity of both their environment and their components―including motors and main AI motherboards―to ensure they are operating in safe ranges.
- Ultrasonic sensors: Vision sensors may not work if the robot is blinded by a bright light or finds itself in a very dark environment. By transmitting ultrasonic waves and listening for echoes that reflect back from objects (similar to how bats maneuver), ultrasonic sensors perform excellently in dark or bright conditions, overcoming the limitations of optical sensors.
- Vibration sensors: Industrial vibration sensing is a crucial part of the condition monitoring necessary for predictive maintenance. Integrated electronics piezoelectric (IEPE) sensors are the most common vibration sensors used in industrial environments.
- Millimeter-wave sensors: These sensors use radio waves and their echoes to determine the direction and distance of a moving object by measuring three components: velocity, angle, and range. This enables robots to take more predictive actions based on how fast objects are approaching the sensor. Radar sensors deliver excellent performance in the dark and can sense through materials like drywall, plastic, and glass.
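As a back-of-the-envelope illustration of the ToF and ultrasonic ranging principles above, the sketch below converts a measured round-trip delay into a distance. The delay values are made-up examples, not measurements from a specific device.

```python
# Back-of-the-envelope distance calculations for ToF optical and ultrasonic
# sensors from a measured round-trip delay. The delay values are made-up
# examples, not measurements from a specific device.

SPEED_OF_LIGHT_M_S = 3.0e8
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 °C

def tof_distance_m(round_trip_s):
    # The signal travels out and back, so halve the total path.
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

def ultrasonic_distance_m(round_trip_s):
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

print(tof_distance_m(13.3e-9))         # ~2 m for a 13.3 ns optical round trip
print(ultrasonic_distance_m(11.7e-3))  # ~2 m for an 11.7 ms ultrasonic echo
```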
Although humans still perform a majority of tasks on factory floors, robots will be adapting to humans in order to increase automation, not the other way around. To accomplish this goal, they need to become equipped with more AI capability to recognize and adapt to a wider variety of situations in real time, which is only possible with AI at the edge.