In Part 1, Chris Yahnker, group supervisor for Extreme Environments Robotics, spoke at MD&M West about the details of putting a robot on Mars and the painstaking process of bringing that to fruition. Now, he homes in on what happens while a bot navigates unknown terrain and what's in store for the future of these robots.
Once a robot has been built, JPL still has to plan its movements and determine how it will execute them. The robot needs to navigate effectively even when an object impedes its path. Using 3D data from cameras and LIDAR, JPL can build a map: if a drone flies over an area collecting data, for instance, JPL can construct a map of the terrain from what it gathers.
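The article doesn't describe JPL's actual mapping pipeline, but the idea of turning raw 3D sensor returns into a map is often illustrated with an occupancy grid. The sketch below is a hypothetical minimal version: it bins LIDAR-style (x, y, z) points into 2D grid cells, counting returns above an assumed ground-height threshold. Cell size, grid dimensions, and the threshold are illustrative assumptions, not JPL parameters.

```python
import numpy as np

def build_occupancy_grid(points, cell_size=0.5, grid_dim=100):
    """Accumulate 3D sensor returns (e.g., LIDAR hits) into a 2D occupancy grid.

    Hypothetical sketch: each (x, y, z) point above a ground threshold
    increments the count of the grid cell it falls into. The robot (or
    drone) is assumed to sit at the grid's center.
    """
    grid = np.zeros((grid_dim, grid_dim), dtype=np.uint16)
    for x, y, z in points:
        col = int(x / cell_size) + grid_dim // 2
        row = int(y / cell_size) + grid_dim // 2
        # Keep only returns above an assumed 0.2 m ground plane,
        # and ignore points that fall outside the grid.
        if 0 <= row < grid_dim and 0 <= col < grid_dim and z > 0.2:
            grid[row, col] += 1
    return grid
```

A planner could then treat cells with high counts as obstacles when choosing a path.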
“We also look for optimizing resource utilization,” he says. “Rovers are low powered and powered by either the sun or batteries. So, how do we make sure we’re picking a path that is safe and energy efficient? We do a lot of research on machine learning for situational awareness.”
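Yahnker doesn't detail the planning algorithms JPL uses, but picking a "safe and energy efficient" path is classically framed as a shortest-path search over a terrain cost map. Below is a minimal Dijkstra sketch under that assumption: each grid cell carries a hypothetical energy cost to traverse, and the search returns the cheapest route.

```python
import heapq

def min_energy_path(cost, start, goal):
    """Dijkstra search over a grid of per-cell traversal energy costs.

    Illustrative sketch only: `cost[r][c]` is an assumed energy cost to
    enter cell (r, c). Returns (total energy, list of cells on the path).
    """
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            path = [node]
            while node in prev:          # walk back to the start
                node = prev[node]
                path.append(node)
            return d, path[::-1]
        if d > dist.get(node, float("inf")):
            continue                      # stale queue entry
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf"), []
```

With a high cost assigned to steep or hazardous cells, the same search naturally trades distance against energy and safety.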
One of the biggest trends in these projects is implementing more autonomy. Yahnker says the communication delay between Earth and a robot makes direct teleoperation unrealistic; the lag is far too long to joystick an operation.
“There’s no way we can joystick a drone flying on Mars,” says Yahnker. “We have to look at how we can build the sensing and autonomy within that robot. We look at what we’re giving our operators here on Earth to enable those controls.”
For instance, if JPL wants to position a robot's drill a certain way, or plan a specific path for a rover to navigate, Yahnker says they run hardware-in-the-loop testing in real time. With an operator interface, real-time perception of the robot's orientation, and an identified target, the operator can plan a path for the robot, press okay, and execute the task. The robot then notifies the operator once the task has been completed successfully.
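The plan-approve-execute-notify workflow described above can be sketched as a small task state machine. This is a hypothetical illustration, not JPL's operator software; the class and state names are invented for the example.

```python
from enum import Enum, auto

class TaskState(Enum):
    PLANNED = auto()
    APPROVED = auto()
    EXECUTING = auto()
    DONE = auto()

class RobotTask:
    """Hypothetical sketch of an operator-approved task lifecycle."""

    def __init__(self, waypoints):
        self.waypoints = list(waypoints)
        self.state = TaskState.PLANNED
        self.log = []

    def approve(self):
        # The operator reviews the plan and presses "okay".
        assert self.state is TaskState.PLANNED
        self.state = TaskState.APPROVED

    def execute(self):
        # The robot only runs a plan the operator has approved.
        assert self.state is TaskState.APPROVED
        self.state = TaskState.EXECUTING
        for wp in self.waypoints:
            self.log.append(("reached", wp))
        self.state = TaskState.DONE
        return "task complete"   # notification sent back to the operator
```

The explicit APPROVED gate mirrors the article's point: with long communication delays, the operator approves whole plans rather than joysticking each motion.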
Meanwhile, scientists are looking at ways to do much more with robots while integrating better systems, and much of this work relies on advanced manufacturing techniques, he says.
Yahnker says they're currently looking at how to 3D print components in space.
“If we send humans to Mars…if they break something how do they fix that in space?” he asked. “We’re doing a lot of work to understand the materials and resin to fabricate these things. It would be really cool if we could have a robot that could do this when they’re on Mars. Also, could you fully print your robot while in space?”
Although not part of an astronaut's current agenda, Yahnker thinks astronauts will someday 3D print items in space with a handy robot, perhaps one that was 3D printed itself.
The team is also looking at innervated robots: robots with multiple sensing channels and inputs. This includes sensor fusion, combining camera and LIDAR data.
“Can we make smarter robots that can do better science and sensing? For example, can we integrate sensors into the wheels so we can know what it’s touching and what it’s feeling?” Yahnker asked.
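One common way to combine readings from multiple sensing channels, such as the camera and LIDAR data mentioned above, is inverse-variance weighting: each sensor's estimate is weighted by how certain it is. The helper below is a generic textbook sketch, not a description of JPL's fusion pipeline.

```python
def fuse_estimates(estimates):
    """Inverse-variance weighted fusion of independent sensor readings.

    `estimates` is a list of (value, variance) pairs, e.g. a distance
    estimated by both a camera and a LIDAR. Returns the fused value and
    its variance; more certain sensors (lower variance) count for more.
    """
    num = sum(value / var for value, var in estimates)
    den = sum(1.0 / var for _, var in estimates)
    return num / den, 1.0 / den
```

Note that the fused variance is smaller than any individual sensor's variance, which is the payoff of fusing multiple channels instead of trusting one.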
Although there are many unknowns, JPL has tackled many of these questions in their innovative designs.
A few of JPL’s current and future projects include the following:
The Mars 2020 rover will investigate the environment on Mars that has been favorable for microbial life and search for evidence of past life. Two instruments mounted on the rover’s robotic arm will be used to search for and collect samples by analyzing the chemical, mineral, physical, and organic characteristics of rocks on Mars. Additionally, two other instruments will provide high-res imaging and three types of spectroscopy for characterizing rocks and soil from a distance. Yahnker says the rover will launch with a total of 23 cameras.
JPL created the RoboSimian and Surrogate robots for the 2015 DARPA Robotics Challenge. The RoboSimian cam hand was developed specifically for this challenge.
“It had to climb into a car, open the door, grab a drill. All of those things required different ways to grasp,” says Yahnker. “This cam hand gave us the dexterity with a robust system that we could adapt.”
The Gecko Gripper was inspired by the specialized foot hairs that let geckos stick to vertical surfaces without falling.
“Researchers said, ‘How can we use that?’” says Yahnker. “Can we grab onto space debris that’s floating around and clean that up?”
The Gecko Gripper investigation is testing whether the adhesive gripping device can stick to surfaces despite the harsh environment of space. JPL foresees robotic crawlers that could walk along spacecraft exteriors and grippers that could use a touch-to-stick method to catch and release objects.
No matter what, JPL will continue to delve into the unknowns of extreme environments and prepare their robots for the best and worst.