A picture may truly be worth a thousand words to Soldiers who lose GPS while on patrol.
The Army Materiel Command’s Communications-Electronics Research, Development and Engineering Center, or CERDEC, is using miniature cameras to create vision-aided navigation capabilities in GPS-denied situations.
“Vision-aided navigation works by using cameras with rapid frame rates to take pictures of objects in view and then comparing the object’s features in each frame to determine how far, and in what direction, the camera has moved in relation to the object,” said Eric Bickford, an engineer in CERDEC’s Command, Power and Integration Directorate’s Positioning, Navigation and Timing Division, or CP&ID PNTD.
The camera catches even the slightest movement through feature detection, which allows users to leverage the camera’s data to track a person’s relative position and movement over a given trajectory or path, Bickford said.
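The frame-to-frame comparison described above can be sketched in a few lines. This is a hypothetical, simplified illustration (the feature coordinates and the averaging step are invented for the example, not taken from any fielded CERDEC system): matched feature positions in two consecutive frames yield an average pixel displacement, which indicates how the camera moved relative to the scene.

```python
def mean_displacement(features_prev, features_curr):
    """Average (dx, dy) pixel shift of matched features between two frames."""
    n = len(features_prev)
    dx = sum(c[0] - p[0] for p, c in zip(features_prev, features_curr)) / n
    dy = sum(c[1] - p[1] for p, c in zip(features_prev, features_curr)) / n
    return dx, dy

# Every tracked feature shifted 3 px right and 1 px down between frames,
# suggesting the camera moved in the opposite direction relative to the scene.
prev = [(100, 50), (200, 80), (150, 120)]
curr = [(103, 51), (203, 81), (153, 121)]
print(mean_displacement(prev, curr))  # (3.0, 1.0)
```

Real systems match hundreds of features per frame at high frame rates and reject outliers, but the core idea is the same differencing of tracked positions.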
Vision-aided navigation is part of the Army’s overarching goal of providing uninterrupted PNT capabilities to Soldiers. While still in the early development phase, CERDEC plans to transition vision-aided navigation solutions to the Army’s Direct Reporting Program Manager Positioning, Navigation and Timing, which was chartered in 2015 to address PNT capabilities across Army portfolios.
“The availability of GPS on the battlefield has significantly enhanced Soldiers’ navigational capabilities, but it is susceptible to interference,” said Christopher Manning, acting director for CERDEC CP&ID.
“As the Army’s R&D lead for Soldier and ground platform PNT needs, we’re using our science and technology investments to support PM PNT by investigating and developing alternate navigation solutions that will address the PNT challenges our Soldiers face in various tactical environments.”
A monocular, or single, camera acts as the foundation of the vision-aided navigation system. It captures rotation and translation but not depth; in other words, it shows how much a person rotated and moved along a path, but not how far away he or she is. The "aided" component incorporates inertial measurement units, or IMUs, which are composed of sensors such as gyroscopes and accelerometers; when properly combined with the camera, they provide motion and direction information simultaneously.
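The scale gap described above can be sketched concretely. In this hedged illustration (all sensor values and the simple integration scheme are assumed for the example), the camera supplies only a unit direction of travel, while double-integrating the IMU's accelerometer supplies the missing distance:

```python
def imu_distance(accels, dt):
    """Distance traveled from evenly spaced accelerometer samples,
    starting at rest (simple Euler double integration)."""
    v = 0.0  # velocity, m/s
    d = 0.0  # distance, m
    for a in accels:
        v += a * dt
        d += v * dt
    return d

direction = (0.6, 0.8)          # unit direction of travel from camera tracking
accels = [1.0, 1.0, 0.0, 0.0]   # accelerometer samples (m/s^2) at dt = 0.5 s
scale = imu_distance(accels, 0.5)
# Camera direction * IMU distance = full displacement vector
displacement = (direction[0] * scale, direction[1] * scale)
print(scale, displacement)
```

In practice the fusion is done jointly inside a navigation filter rather than in two separate steps, but the division of labor is the same: the camera constrains direction, the IMU constrains scale.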
“IMUs allow us to determine approximately how the camera is moving; thus, the motion of the camera can be mathematically compared with the motion detected from features tracked visibly by the camera,” CERDEC engineer Gary Katulka said.
“With calibrated cameras, quality IMUs, a well-tuned navigation algorithm and other supporting components, a person equipped with a vision-aiding navigation system can achieve GPS-like navigational performance.”
Over time, errors from the IMUs will accumulate and cause some sensor “drift,” but data from the camera serves to limit these errors for a more accurate combined-sensors navigation solution known as sensor fusion, Katulka said.
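A minimal complementary-filter sketch shows how camera data limits IMU drift. The gain, drift rate, and motion values here are invented for illustration; real sensor fusion typically uses a Kalman-style filter with modeled error statistics:

```python
def fuse(imu_estimate, camera_measurement, gain=0.8):
    """Pull a drifting IMU estimate toward a camera-derived measurement."""
    return imu_estimate + gain * (camera_measurement - imu_estimate)

position = 0.0
true_position = 0.0
imu_drift = 0.05  # meters of error the IMU accumulates per step

for step in range(5):
    true_position += 1.0             # platform actually moves 1 m per step
    position += 1.0 + imu_drift      # IMU alone overestimates each step
    position = fuse(position, true_position)  # camera correction bounds the error

print(abs(position - true_position))  # residual error stays under 0.05 m
```

Without the `fuse` step the error would grow linearly (0.25 m after five steps here); with it, the error stays bounded, which is the essence of the combined-sensors solution Katulka describes.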
The first iteration of vision-aided navigation will likely be vehicle-mounted. CERDEC tested this concept by inserting a system into a standard vehicle and driving along a major highway. The camera’s feature detection capability accurately captured everything in its path — other cars, exit signs, and trees — even at high speeds.
“The system understood that the cars ahead of us were going nearly the same speed as we were because those cars never appeared to change in size,” Bickford said.
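The size-constancy observation follows from the pinhole-camera model: an object of real width W at range Z projects to roughly f * W / Z pixels, so an unchanging apparent size means an unchanging range and hence a matched speed. The focal length and dimensions below are assumed values for illustration only:

```python
def apparent_width_px(focal_px, real_width_m, range_m):
    """Projected image width of an object under the pinhole-camera model."""
    return focal_px * real_width_m / range_m

f = 800.0  # focal length in pixels (assumed)
W = 1.8    # car width in meters (assumed)

# Lead car holding the same speed: range stays 30 m, apparent size constant.
same_speed = [apparent_width_px(f, W, 30.0) for _ in range(3)]
# Closing on the car: range shrinks, apparent size grows.
closing = [apparent_width_px(f, W, z) for z in (30.0, 25.0, 20.0)]

print(same_speed)  # [48.0, 48.0, 48.0]
print(closing)     # [48.0, 57.6, 72.0]
```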
Two-camera vision-aided navigation, also known as “stereo,” could be a viable option as long as size, weight, power and cost, or SWAP-C, are not limiting factors. With two cameras, objects appear to shift between the left and right images, much as they do when a person alternately opens one eye and closes the other.
“That shift tells you how far away things are: an object will appear to shift more if it is closer to you, but if you look into the distance, the shift yields very little movement,” Bickford said. “This also gives us the required distance, or depth, information.”
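The relationship Bickford describes is the standard stereo-geometry formula: for two parallel cameras separated by a baseline B, depth is Z = f * B / disparity, so a large shift (disparity) means a nearby object. The camera parameters below are assumed for illustration, not taken from the CERDEC prototype:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth of a point seen by two parallel cameras: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

f = 700.0  # focal length in pixels (assumed)
B = 0.12   # 12 cm baseline between the two cameras (assumed)

near = depth_from_disparity(f, B, 42.0)  # large shift -> close object
far = depth_from_disparity(f, B, 2.0)    # small shift -> distant object
print(near, far)  # 2.0 42.0 (meters)
```

This depth measurement is exactly what a single camera cannot provide on its own, which is why stereo trades SWAP-C for direct range information.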
The Army’s science and technology community is investigating approaches for multi-purpose cameras, including vision-aided navigation. With that in mind, CERDEC CP&ID is teaming with additional organizations, including its sister organization, the Night Vision and Electronic Sensors Directorate, to leverage existing technologies such as its thermal imaging camera, which will allow vision-aided navigation in less optimal and low-light situations. In the future, vision-aided navigation systems could be integrated with wearable devices.
During tests in urban environments, CERDEC’s Soldier-mounted prototype allowed the user to stay nearly on the exact trajectory of a path generated from GPS and other sources.
“What is working in our favor is that we are leveraging commercial technologies in today’s cellphones and cameras, which are getting smaller, cheaper and are nearing the visual acuity of the human eye,” Katulka said. “These developments have made vision-aided navigation technology for military applications very viable and quite attractive.”
Ultimately, the goal for any navigation system is to allow commanders and Soldiers to share PNT information up and down the command chain. The overarching goal is to aggregate emerging PNT systems in a cost-effective manner to provide a comprehensive capability that will mitigate the impact of GPS-denied situations.
“A battalion could have several PNT capabilities sprinkled throughout the battlefield or flying above that, when extrapolated, assist the entire group on the battlefield,” Katulka said. “Assured PNT makes expeditionary mission command much more efficient and thus a possible game changer.”
By combining multiple inertial components with basic camera technologies, CERDEC is elevating an elementary motion-detection concept into a cutting-edge solution, Katulka said.
“When we were kids, we made stop-motion flip books to create homemade animation,” Katulka said. “I would never have guessed then that the same concept could be used today to help our Soldiers navigate.”