Researchers at UC Santa Barbara professor Yasamin Mostofi’s lab have given the first demonstration of three-dimensional imaging of objects through walls using ordinary wireless signals. The technique, which involves two drones working in tandem, could have a variety of applications, such as emergency search-and-rescue, archaeological discovery and structural monitoring.
“Our proposed approach has enabled unmanned aerial vehicles to image details through walls in 3D with only WiFi signals,” said Mostofi, a professor of electrical and computer engineering at UCSB. “This approach utilizes only WiFi RSSI measurements, does not require any prior measurements in the area of interest and does not need objects to move to be imaged.”
The proposed methodology and experimental results were presented at the Association for Computing Machinery/Institute of Electrical and Electronics Engineers (ACM/IEEE) International Conference on Information Processing in Sensor Networks (IPSN) in April 2017.
In their experiment, two autonomous octocopters take off and fly outside an enclosed, four-sided brick house whose interior is unknown to the drones. While in flight, one copter continuously transmits a WiFi signal, the received power of which is measured by the other copter for the purpose of 3D imaging. After traversing a few proposed routes, the copters utilize the imaging methodology developed by the researchers to reveal the area behind the walls and generate 3D high-resolution images of the objects inside. The 3D image closely matches the actual area.
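For a concrete picture of the data involved, the short Python sketch below is purely illustrative (not the team’s software) and shows one way a single WiFi measurement could be recorded during flight: the transmitting copter’s position, the receiving copter’s position, and the received signal strength (RSSI).

```python
# Illustrative only: one possible record format for a single WiFi RSSI
# measurement taken while both copters are in flight.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class RssiSample:
    tx_pos: Tuple[float, float, float]  # transmitter copter position (meters)
    rx_pos: Tuple[float, float, float]  # receiver copter position (meters)
    rssi_dbm: float                     # received WiFi power in dBm

# Example: one sample collected as the copters fly along the outside of the walls.
sample = RssiSample(tx_pos=(0.0, 2.5, 1.8), rx_pos=(6.0, 2.5, 1.8), rssi_dbm=-52.3)
```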
“High-resolution 3D imaging through walls, such as brick walls or concrete walls, is very challenging, and the main motivation for the proposed approach,” said Chitra R. Karanam, the lead Ph.D. student on this project.
This development builds on previous work in the Mostofi Lab, which has pioneered sensing and imaging with everyday radio frequency signals such as WiFi. The lab published the first experimental demonstration of imaging with only WiFi in 2010, followed by several other works on this subject.
“However, enabling 3D through-wall imaging of real areas is considerably more challenging due to the considerable increase in the number of unknowns,” said Mostofi. While their previous 2D method utilized ground-based robots working in tandem, the success of the 3D experiments is due to the copters’ ability to approach the area from several angles, as well as to the new proposed methodology developed by her lab.
The researchers’ approach to enabling 3D through-wall imaging utilizes four tightly integrated key components. First, they proposed robotic paths that can capture the spatial variations in all three dimensions as much as possible, while maintaining the efficiency of the operation.
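As a rough illustration of that first component, and only as a sketch under assumed geometry rather than the paper’s actual routes, perimeter flight paths repeated at several heights would let the transmitter-receiver links cut through the enclosed volume from many directions:

```python
# A minimal sketch (assumed geometry, not the paper's routes): waypoints along
# the four outer walls of a rectangular footprint, repeated at several heights,
# so that WiFi links between the two copters cross the volume from many angles.
import numpy as np

def perimeter_waypoints(x_len, y_len, heights, spacing):
    """Waypoints around an x_len-by-y_len footprint at each flight height."""
    waypoints = []
    for z in heights:
        for x in np.arange(0.0, x_len, spacing):       # south wall
            waypoints.append((x, 0.0, z))
        for y in np.arange(0.0, y_len, spacing):       # east wall
            waypoints.append((x_len, y, z))
        for x in np.arange(x_len, 0.0, -spacing):      # north wall
            waypoints.append((x, y_len, z))
        for y in np.arange(y_len, 0.0, -spacing):      # west wall
            waypoints.append((0.0, y, z))
    return waypoints

# Example: a 6 m x 5 m footprint scanned at three heights, 0.5 m between waypoints.
route = perimeter_waypoints(6.0, 5.0, heights=[1.0, 1.5, 2.0], spacing=0.5)
```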
Second, they modeled the 3D unknown area of interest as a Markov Random Field to capture the spatial dependencies, and utilized a graph-based belief propagation approach to update the imaging decision of each voxel (the smallest unit of a 3D image) based on the decisions of the neighboring voxels.
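The simplified sketch below conveys the flavor of that neighborhood-based update. It uses a mean-field style smoothing of voxel occupancy beliefs on a 3D grid rather than the authors’ exact belief-propagation machinery, and the coupling strength, grid size and iteration count are assumptions.

```python
# A simplified stand-in (not the authors' implementation): each voxel's
# occupancy belief is repeatedly combined with the beliefs of its six
# neighbors under an Ising-like spatial prior.
import numpy as np

def smooth_beliefs(prior, coupling=0.5, iters=20):
    """prior: 3D array of per-voxel occupancy probabilities in (0, 1) from the
    wave-model inversion. Returns spatially smoothed occupancy probabilities."""
    eps = 1e-6
    unary = np.log(prior + eps) - np.log(1.0 - prior + eps)  # data term in log-odds
    q = prior.copy()
    for _ in range(iters):
        padded = np.pad(q, 1)  # voxels outside the grid are treated as empty
        neigh = (padded[2:, 1:-1, 1:-1] + padded[:-2, 1:-1, 1:-1] +
                 padded[1:-1, 2:, 1:-1] + padded[1:-1, :-2, 1:-1] +
                 padded[1:-1, 1:-1, 2:] + padded[1:-1, 1:-1, :-2])
        # Each neighbor effectively votes +1 for "occupied" and -1 for "empty".
        logit = unary + coupling * (2.0 * neigh - 6.0)
        q = 1.0 / (1.0 + np.exp(-logit))
    return q

# Example: smooth a noisy 20 x 20 x 10 prior occupancy map and threshold it.
rng = np.random.default_rng(0)
prior = np.clip(rng.random((20, 20, 10)), 0.05, 0.95)
occupancy = smooth_beliefs(prior) > 0.5
```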
Third, in order to approximate the interaction of the transmitted wave with the area of interest, they used a linear wave model.
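One common way to build such a linearization, shown below purely as an assumed illustration rather than the paper’s exact model, is to treat the attenuation on each transmitter-receiver link as a weighted sum of the occupancies of the voxels that the line of sight crosses, yielding one row of a measurement matrix per WiFi measurement.

```python
# An assumed LOS-based linearization (illustrative only): sample points along
# the straight line between the two copters and accumulate the approximate
# path length spent in each voxel.
import numpy as np

def measurement_row(tx, rx, grid_shape, voxel_size, n_samples=200):
    """Approximate one row of the linear model for a single TX-RX link."""
    row = np.zeros(grid_shape)
    tx, rx = np.asarray(tx, dtype=float), np.asarray(rx, dtype=float)
    seg_len = np.linalg.norm(rx - tx) / n_samples
    for t in np.linspace(0.0, 1.0, n_samples):
        p = tx + t * (rx - tx)
        idx = tuple((p // voxel_size).astype(int))
        if all(0 <= i < s for i, s in zip(idx, grid_shape)):
            row[idx] += seg_len  # approximate path length inside this voxel
    return row.ravel()

# Stacking one such row per WiFi measurement gives y ≈ A x, where x collects
# the unknown voxel occupancies and y the measured signal attenuations.
```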
Finally, they took advantage of the compressibility of the information content to image the area with a very small number of WiFi measurements (less than 4 percent). It is noteworthy that their setup consists solely of off-the-shelf units such as copters, WiFi transceivers and Tango tablets.
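In the same spirit, a standard sparse-recovery solver illustrates how an image can be reconstructed from far fewer measurements than unknowns. The iterative soft-thresholding (ISTA) routine below is a generic stand-in rather than the authors’ solver, and the random matrix and 4 percent measurement ratio are only meant to mimic the setting.

```python
# Generic sparse recovery (not the authors' solver): ISTA for the
# l1-regularized least-squares problem min_x 0.5*||A x - y||^2 + lam*||x||_1.
import numpy as np

def ista(A, y, lam=0.05, iters=500):
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - step * (A.T @ (A @ x - y))                        # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return x

# Toy data standing in for the WiFi measurements: about 4 percent as many
# measurements as unknowns, and a sparse ground-truth occupancy vector.
rng = np.random.default_rng(1)
n_voxels, n_meas = 2000, 80
A = rng.standard_normal((n_meas, n_voxels))
x_true = np.zeros(n_voxels)
x_true[rng.choice(n_voxels, 20, replace=False)] = 1.0
y = A @ x_true
x_hat = ista(A, y)
```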