As robots gain more autonomy for navigation and manipulation, the challenge of processing the data they generate and gather in real time has led to advances in cloud-based robotics. Developers have not had many tools for applying machine learning and coordinating robots, but AWS RoboMaker promises to accelerate the development of cloud services for robots.
One of the keynote speakers at next month’s Robotics Summit & Expo is Roger Barga, general manager of AWS Robotics and Autonomous Services at Amazon Web Services Inc. He will address “The Role of the Cloud in the Future of Robotics,” and he gave The Robot Report a preview of what he’ll discuss.
Barga has a Ph.D. in computer science from the Oregon Graduate Institute of Science and Technology and is an affiliate professor at the University of Washington. Barga holds more than 30 patents, has published over 100 peer-reviewed technical papers and book chapters, and has written a book on predictive analytics.
Tell us a little about your experience at AWS.
I’ve had the good fortune to work at AWS for five years. I started with the Amazon Kinesis data streaming services, moving data quickly from sources such as iPhones, or even plates of sushi in conveyor-belt restaurants, to the cloud for real-time monitoring of inventory and food consumption. It gave me a good sense of how the cloud can help businesses.
I saw a trend of devices becoming more capable, which raised a question: How will processing shift to the edge, and how will those devices coordinate with the cloud? That question vexed us.
With 200,000 autonomous robots in Amazon’s fulfillment centers, we know how hard it is to program devices at the edge. That led us to build AWS RoboMaker, a tool that works with ROS [Robot Operating System], machine learning, monitoring, and tracking.
Where are we now with robotics and the cloud?
There are two ends of the spectrum. At one end, we’re focused on small startups. It’s difficult for them to do the undifferentiated heavy lifting, to find an expert in ROS, and to get all the developers in a room together. Many startups can’t afford the hardware or the team members, and they waste time by not doing simulation or testing first. They struggle with over-the-air deployment and with scaling up to thousands of robots.
This could bust out big in the next few years, and we’re getting good feedback on what works and what doesn’t. Developers can use laptops, batch processing, or highly interactive simulations. We’ve brought a lot of people on board, and over 2,000 students have signed up [for RoboMaker training].
The second group includes very mature companies building fleet-management services. In Germany, for example, Bosch recently announced that it’s using RoboMaker. It already had its own fleet-management software, which integrates seamlessly with RoboMaker, and its developers can see the ROS source code and tweak it.
Individual companies have their own use cases and have built their own services. A big focus for us this year is helping them get data off of their robots and build their own fleets.
Simulating robotics functions could save developers and users time, but is it underrated?
Simulation is under-appreciated. It’s important for debugging applications, seeing robots run the right way, and building models.
For one of the more recent robots we built within Amazon Robotics, simulation reduced the time to build by 25%. Before any robots were built, we simulated the robots’ behavior and congestion patterns.
Simulation really is a developer’s best friend. We are supporting Gazebo 7 and 9 and a new version called Ignition, as well as ROS1 Melodic. We’re also looking at other physics engines.
AWS can also perform batch simulations in parallel. It’s serverless — we take care of running the simulations, and customers can see the logs. They can also get alerts when Simulation 21 collides or Simulation 14 runs out of power.
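As a rough illustration of what submitting a serverless batch of simulations looks like, the sketch below builds a request for the RoboMaker `start_simulation_job_batch` API via boto3. The ARNs, bucket-free launch configuration, package name, and track names are all placeholders invented for this example, not real resources.

```python
# Sketch: submitting many RoboMaker simulations as one serverless batch.
# All ARNs and names below are illustrative placeholders.
import json

def build_batch_request(app_arn, tracks, iam_role_arn):
    """Build one simulation-job request per track for start_simulation_job_batch."""
    requests = []
    for track in tracks:
        requests.append({
            "maxJobDurationInSeconds": 3600,
            "iamRole": iam_role_arn,
            "failureBehavior": "Fail",
            "simulationApplications": [{
                "application": app_arn,
                "launchConfig": {
                    "packageName": "my_robot_sim",   # hypothetical ROS package
                    "launchFile": "race.launch",
                    "environmentVariables": {"TRACK_NAME": track},
                },
            }],
        })
    return {"batchPolicy": {"timeoutInSeconds": 7200, "maxConcurrency": 10},
            "createSimulationJobRequests": requests}

batch = build_batch_request(
    "arn:aws:robomaker:us-east-1:123456789012:simulation-application/example",
    ["oval", "figure-eight", "monza"],
    "arn:aws:iam::123456789012:role/RoboMakerSimRole",
)
print(json.dumps(batch["batchPolicy"]))
# Submitting would then be a single call (requires AWS credentials):
#   import boto3
#   boto3.client("robomaker").start_simulation_job_batch(**batch)
```

Because the batch is serverless, the one call above fans out into parallel jobs whose logs and alerts AWS surfaces per simulation.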
What can machine learning in the cloud do for robotics?
I’ll talk at the Robotics Summit about DeepRacer. We integrate RoboMaker with AWS SageMaker for reinforcement learning. We run dozens of simulations of toy cars on different tracks and integrate the lessons from those experiments into one model. We then use fleet management to deploy that model to cars on the physical track.
Imagine an individual car hitting a pothole, and then imagine 100,000 vehicles hitting it. As each one does, you can run machine learning and push improved algorithms out to the fleet. With a group of robots, you can monitor them, make inferences, and make improvements.
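One way to picture that fleet-learning loop is the toy sketch below (plain Python, not an AWS API): each robot reports a locally tuned parameter, the cloud aggregates the reports into one update, and the update is pushed back to every robot. The `Robot` class and `braking_distance` parameter are hypothetical.

```python
# Toy fleet-learning loop (illustrative only, not an AWS API):
# each robot reports a local correction, the cloud averages them,
# and the improved parameter is pushed back to the whole fleet.

def aggregate(reports):
    """Average the locally learned corrections from every robot."""
    return sum(reports) / len(reports)

class Robot:
    def __init__(self, name, braking_distance):
        self.name = name
        self.braking_distance = braking_distance  # locally tuned parameter

    def report(self):
        return self.braking_distance

    def apply_update(self, value):
        self.braking_distance = value

fleet = [Robot("car-1", 2.0), Robot("car-2", 2.6), Robot("car-3", 2.4)]
update = aggregate([r.report() for r in fleet])   # cloud-side step
for robot in fleet:
    robot.apply_update(update)                    # push to every vehicle
print(update)
```

A production system would aggregate model weights or gradients rather than a single scalar, but the monitor-aggregate-push cycle is the same.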
If a robotic vacuum cleaner gets stuck somewhere, it can generalize the experience and update the model. An Amazon team is training a TurtleBot to recognize its environment. It can recognize objects and build a library for reinforcement learning.
At re:MARS, we’ll demonstrate how cars using NVIDIA’s low-priced Jetson chip can be trained to recognize dinosaurs around the track. Imagine coordinating robots so they can learn from individual mistakes.
How does the cloud enable customer-service and other robot capabilities?
A robot is something that senses, computes, and takes action. The form factors that robots will take on will be amazing in the next few years. We have case studies coming out on service robots and cloud services.
One example is a robotic concierge that works in stores and courthouses. People can talk to it, and it can provide directions. With Polly [the AWS text-to-speech service] and Lex [for building conversational interfaces], not only can the robot provide directions, but it can also sense if someone is frustrated. If a customer is paying for premium support, the robot can even show a person’s face as it responds. It does all of this with no drain on the robot’s battery or onboard processing.
Over 60% of power typically goes to compute, and even small processors consume large amounts of power and generate heat. In applying cloud services to robots, we have to understand each case. Where can we be latency-tolerant? There’s no single answer.
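That latency question can be framed as a simple offloading rule. The sketch below is a toy model with made-up numbers, not AWS guidance: a task goes to the cloud only when its latency budget covers the network round trip, which saves battery; otherwise it stays on the robot's processor.

```python
# Toy edge-vs-cloud offloading rule (illustrative numbers, not AWS guidance):
# offload a task only when its latency budget allows the cloud round trip.

CLOUD_ROUND_TRIP_MS = 80.0   # assumed network round-trip time

def place_task(name, latency_budget_ms, cloud_compute_ms, edge_compute_ms):
    cloud_total = CLOUD_ROUND_TRIP_MS + cloud_compute_ms
    if cloud_total <= latency_budget_ms:
        return (name, "cloud")    # latency-tolerant: offload, save battery
    if edge_compute_ms <= latency_budget_ms:
        return (name, "edge")     # latency-critical: run locally
    return (name, "infeasible")

plan = [
    place_task("speech response", 1000.0, cloud_compute_ms=200.0, edge_compute_ms=900.0),
    place_task("obstacle avoidance", 50.0, cloud_compute_ms=20.0, edge_compute_ms=10.0),
]
print(plan)
```

The point of the sketch is that the answer differs per task: a one-second speech budget tolerates the cloud round trip, while a 50 ms obstacle check does not.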
In my keynote, I’ll talk about how developers are often focused on the individual robot, but the user is trying to optimize for a business metric, such as fulfillment throughput or customer service. No one robot can do that, so they need an oracle in the cloud to tell them how the robots are working together and how they can be choreographed to meet that business metric.
With Amazon Robotics, individual robots are running autonomously, but there’s another oracle-like service monitoring where each one is. If a robot goes into a quadrant and builds a SLAM [simultaneous localization and mapping] map, another robot can call for that map and use it.
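The map-sharing pattern he describes can be sketched as a small cloud-side registry: the first robot to map a quadrant publishes its result, and later robots fetch it instead of re-running SLAM. This is a toy illustration, not an Amazon Robotics interface; the class and robot names are invented.

```python
# Toy cloud-side map registry (illustrative, not an AWS service):
# the first robot to map a quadrant publishes its result, and later
# robots reuse it instead of re-running SLAM there.

class MapRegistry:
    def __init__(self):
        self._maps = {}  # quadrant -> (robot_name, occupancy grid)

    def publish(self, quadrant, robot_name, grid):
        self._maps[quadrant] = (robot_name, grid)

    def fetch(self, quadrant):
        """Return (owner, grid), or None if no robot has mapped it yet."""
        return self._maps.get(quadrant)

registry = MapRegistry()
registry.publish("NE", "robot-7", [[0, 1], [0, 0]])  # robot-7 maps the NE quadrant

hit = registry.fetch("NE")    # robot-12 reuses robot-7's map
miss = registry.fetch("SW")   # nobody has mapped SW yet
print(hit is not None, miss is None)
```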
How important is open source to AWS’s efforts?
I love seeing open-source software and its contributions. There’d be no cloud today without Linux. We’ve open-sourced a lot of sample applications to accelerate development.
Licensing a proprietary operating system can be expensive and slow. I’m glad to see Bosch, Microsoft, and Toyota Research committing to make ROS2 industrial-grade. I’m very pleased to see my team at Amazon leading security and other performance improvements.
What about emerging technologies such as the Internet of Things (IoT) and 5G?
We think that’s going to be an enabler for new scenarios, with more ubiquitous, reliable connectivity to the cloud. However, RoboMaker works just fine without connectivity; it can run on a customer’s PC, though that precludes fleet management or monitoring with cloud services.
Our customers with robots largely have fairly good connectivity. We’re not betting our current strategy on improvements, but 5G will be a game-changer.
One requirement that we have is that when you install AWS RoboMaker, you install the AWS IoT Greengrass core and security resources. It’s a secure environment and creates a digital twin in the cloud. It supports the idea of quorums — a chat group for robots, if you will.
The real utility of robots shines when they can coordinate. Robots in a quorum have greater functionality because they can call on one another for image processing and other services.
What’s next for cloud robotics — and Amazon?
Our journey this year is building services to get more utility out of robots, which are data-gathering machines. If one vacuum cleaner weren’t working properly, another one could go over and take care of its area.
With data streaming, you can take a little bit of data and make sense of it, enabling robots to take on small tasks as needed. That’s an area of increasing interest for utility companies.
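A minimal sketch of that idea, in plain Python rather than a Kinesis client: consume a stream of sensor readings, keep a short sliding window, and flag readings that spike well above the recent average so a robot can be dispatched to that spot. The window size, threshold, and dust-level readings are invented for illustration.

```python
from collections import deque

# Minimal streaming sketch (plain Python, not the Kinesis API):
# keep a short sliding window over sensor readings and flag
# readings far above the recent average.

def stream_alerts(readings, window=3, threshold=2.0):
    recent = deque(maxlen=window)
    alerts = []
    for value in readings:
        if recent and value > threshold * (sum(recent) / len(recent)):
            alerts.append(value)   # dispatch a robot to this spot
        recent.append(value)
    return alerts

# Hypothetical dust-level readings from a vacuum's sensor; 9.0 is the spike.
print(stream_alerts([1.0, 1.2, 0.9, 9.0, 1.1]))
```

The same window-and-threshold shape scales up naturally to a managed stream, where each shard of readings is processed as it arrives.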
In the DARPA swarm robotics challenge, the question was, “How do you program swarms?” It’s still early days for defining cooperative models, and it’s an open area rich with possibilities for developers, which is what attracted us to it.
Of course, the robots in Amazon’s fulfillment centers can benefit from better monitoring, control, and fleet management. At AWS, we love building services that both internal and external customers can use.
Note: Amazon Web Services will be hosting a free AWS RoboMaker Immersion Day in Boston on June 4.
The Robotics Summit & Expo 2019 will be on June 5-6 at Boston’s Seaport World Trade Center. Barga’s keynote on “The Role of the Cloud in the Future of Robotics” will be on Wed., June 5, at 9:45 a.m. Register now to attend.