
Right now, your eyes are moving in short bursts and quick stops to read this line of text. Your lungs are expanding and contracting, and your muscles are subtly twitching—you’re in motion. Human movement is so constant and ubiquitous, if we think of it at all, we tend to think of it as simple. But it isn’t. Designing a robotic system that replicates even the most basic human movement is still only an emerging technology.
The Central Advanced Research and Engineering Institute at Hyundai Motor Company develops future mobility technologies. As our society ages, there is a greater need for systems that can aid mobility. That’s why the institute began developing wearable exoskeleton robots with NI embedded controllers. The problem was developing a system that could handle the complex control algorithms needed to capture data from sensors while performing real-time control of multiple actuators.
The solution for the institute’s team was the LabVIEW RIO platform with a CompactRIO embedded system and a real-time controller. The FPGA control architecture provided by the Single-Board RIO collected data from sensors, controlled peripheral units, performed real-time analysis, and reduced development time.
In the field of wearable robotics, the physical interface between the human body and a robot raises engineering issues in mechanical design, control architecture construction, and actuation algorithm design. The space and weight available for electrical devices are extremely limited because a wearable robot is worn like a suit. Additionally, the overall control sampling rate of the robot must be fast enough that it does not impede motion and can react properly to external forces. Many open questions also remain about how control algorithms for wearable robots should augment and assist human movement. Therefore, the institute focused on the following requirements when selecting a main controller for its wearable robots:
- high-speed processing of data obtained from various types of sensors
- compact size and low weight
- real-time data visualization for developing control algorithms
- connectivity to other smart devices to offer more convenient functions
The real-time and FPGA hardware environment provided I/O compatible with a wide range of robotic control devices. For instance, while building the wearable robots, the overall control architecture changed drastically several times because sensors were replaced or the control communication method changed. However, the onboard combination of the real-time controller and the FPGA let the institute’s team handle these changes promptly, which shortened the development period.
In addition, adopting the compact sbRIO-9651 System on Module (SOM) device let the design team reduce the robot’s weight to less than 10 kg while maximizing battery efficiency through a low-power base system configuration.
As the number of sensors and actuators increases for more complex tasks, the complexity of the control algorithms grows exponentially. Simultaneously processing data from multiple sensors while sending instructions to multiple actuators therefore becomes one of the most important challenges to address in robotics. LabVIEW supports concurrent visualization of the signals from the sensors installed on the robots, which makes signal processing intuitive and aids control algorithm design during the experimental stages.
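The pattern described above, many sensors sampled concurrently while a control loop issues actuator commands, can be sketched in plain Python. Everything here (the sensor and actuator names, the gains, the simple proportional law) is an illustrative assumption, not Hyundai’s actual algorithm, which runs in LabVIEW on the CompactRIO/FPGA hardware.

```python
import queue
import threading
import time

class SensorHub:
    """Toy stand-in for the FPGA acquisition layer: each sensor is
    sampled on its own thread and readings are funneled into one queue.
    Purely illustrative -- the real system does this in LabVIEW on the
    sbRIO FPGA, not in Python."""

    def __init__(self, sensor_funcs, rate_hz=100):
        self.sensor_funcs = sensor_funcs      # name -> read() callable
        self.period = 1.0 / rate_hz
        self.readings = queue.Queue()         # holds (name, value, timestamp)
        self._stop = threading.Event()

    def _sample(self, name, read):
        while not self._stop.is_set():
            self.readings.put((name, read(), time.monotonic()))
            time.sleep(self.period)

    def start(self):
        for name, read in self.sensor_funcs.items():
            threading.Thread(target=self._sample, args=(name, read),
                             daemon=True).start()

    def stop(self):
        self._stop.set()

def control_step(state, gains, pairs):
    """Toy proportional law: each actuator command is a gain times the
    latest reading of its paired sensor."""
    return {act: gains[act] * state.get(sensor, 0.0) for act, sensor in pairs}

# Usage: two constant "sensors" standing in for hip/knee joint encoders.
hub = SensorHub({"hip_angle": lambda: 0.1, "knee_angle": lambda: -0.2},
                rate_hz=200)
hub.start()
state = {}
while not {"hip_angle", "knee_angle"} <= state.keys():  # one reading each
    name, value, _ts = hub.readings.get(timeout=1.0)
    state[name] = value
hub.stop()
commands = control_step(state, {"hip_motor": 2.0, "knee_motor": 2.0},
                        [("hip_motor", "hip_angle"),
                         ("knee_motor", "knee_angle")])
print(commands)  # {'hip_motor': 0.2, 'knee_motor': -0.4}
```

On the actual hardware, the producer side of this pattern maps onto the FPGA (deterministic high-rate sampling) and the consumer side onto the real-time controller, which is the division of labor the LabVIEW RIO architecture is built around.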

Life-Caring Exoskeleton: a modular robot that combines hip and knee modules to provide walking assistance to the elderly and to people with limited mobility in the lower half of their bodies.
When someone wears this robot, it can identify the wearer’s intention and walking status by collecting data from sensors placed between the sole of the foot and the ground. Technology that transmits this data over wireless ZigBee communication is already in place, and it can now be extended with Internet of Things (IoT) technology. In other words, wirelessly acquired information can be sent to the robot so that it assists the wearer’s movements. In addition, gathering the relevant data can help identify a user’s personal range of activities and conditions by location, and that information can be fed back into the robot to provide a more comprehensive service. If a patient wears the robot for rehabilitation, doctors can monitor the patient’s and the robot’s condition during rehabilitation and deliver real-time training adjustments, improving the efficiency and effectiveness of treatment. This is a good example of data-driven technology in practice.
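As a hedged sketch of how walking status might be inferred from the foot-sole data described above, the following Python snippet thresholds total plantar pressure to separate stance from swing and marks heel-strike and toe-off events. The threshold value, units, and event names are assumptions for illustration; the article does not describe Hyundai’s actual detection algorithm.

```python
def gait_events(pressure_samples, threshold=20.0):
    """Label heel-strike / toe-off events from total plantar pressure.
    A sample above `threshold` means the foot is loaded (stance phase);
    below it means swing. Threshold and units are illustrative
    assumptions, not the algorithm used on the actual robot."""
    events = []
    in_stance = pressure_samples[0] > threshold
    for i, p in enumerate(pressure_samples[1:], start=1):
        loaded = p > threshold
        if loaded and not in_stance:
            events.append(("heel_strike", i))   # foot just touched down
        elif not loaded and in_stance:
            events.append(("toe_off", i))       # foot just lifted
        in_stance = loaded
    return events

# Usage: two simulated steps of summed foot-sole sensor values.
pressure = [0, 5, 30, 80, 60, 10, 0, 40, 90, 15]
print(gait_events(pressure))
# [('heel_strike', 2), ('toe_off', 5), ('heel_strike', 7), ('toe_off', 9)]
```

A sequence of such events is enough to estimate cadence and stance duration, the kind of walking-status information that could be forwarded over ZigBee or an IoT link to a monitoring doctor.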
Hyundai
worldwide.hyundai.com
Filed Under: TECHNOLOGIES + PRODUCTS, The Robot Report