When you go to the doctor with a sore throat, she’ll usually swab the inside of your mouth and send the sample to a lab for analysis. That process generally takes a few days, by which time you’ll probably know whether you’re sick anyway. Imagine that, instead, the doctor simply pulls out her smartphone, takes a couple of pictures, and gives you an answer in under a minute.
Professors Brian T. Cunningham, Steven S. Lumetta, and John Michael Dallesasse are leading a team that’s transforming this scenario into reality, and doing so with image-processing equipment and the camera already in smartphones. Their research has been documented in two papers in Analytical Chemistry and Optics Express, and the team has recently won a $300,000 National Science Foundation grant to take their research further.
“A lot of these tests require machines that cost hundreds of thousands of dollars,” Cunningham said. “We’re able to get the same results with a $200 smartphone, which could really help in a number of different situations.”
Cunningham sees this technology as applicable not just at the doctor’s office, but around the world. Imagine being at a restaurant and not knowing whether allergens are in your food, so you just open an app on your phone and find out. Imagine doctors in remote parts of the world being able to test patients without needing expensive lab equipment because all of the devices they need are in the palms of their hands.
At the moment, the smartphone works with an additional 3D-printed cradle that shines a green laser pointer into the sample; the phone’s camera then captures the light that emerges. Cunningham plans to use his new grant to develop a way to use the phone independently of the cradle, so that all of the app’s capabilities are available without any additional hardware.
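The papers themselves detail the optics and processing, but the core idea of camera-based spectral measurement can be sketched in a few lines. The sketch below is illustrative only: it assumes (hypothetically) that the light emerging from the sample is spread into a band along one axis of the camera frame, so that collapsing the image into a 1-D profile and tracking where its peak sits gives a simple readout. The function names and the synthetic frames are invented for this example.

```python
import numpy as np

def spectrum_from_frame(frame):
    """Collapse a camera frame into a 1-D intensity profile.

    Assumes (for illustration) that the measured light is spread
    along the horizontal axis of the image, so summing each pixel
    column estimates the intensity at one spectral position.
    """
    return frame.sum(axis=0)

def peak_shift(reference, sample):
    """How far the brightest spectral bin moved between a reference
    measurement and a sample measurement (in pixel columns)."""
    return int(np.argmax(sample)) - int(np.argmax(reference))

# Synthetic frames standing in for real camera captures: a bright
# vertical band whose position shifts when the sample changes.
height, width = 8, 100
ref_frame = np.zeros((height, width))
ref_frame[:, 40] = 1.0          # reference peak at column 40
sample_frame = np.zeros((height, width))
sample_frame[:, 43] = 1.0       # sample peak at column 43

shift = peak_shift(spectrum_from_frame(ref_frame),
                   spectrum_from_frame(sample_frame))
print(shift)  # a nonzero shift indicates a change in the sample
```

A real pipeline would also need calibration from pixel columns to wavelengths and noise handling, but the shift-detection idea is the same.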
“We are very interested in a simple, portable, and inexpensive instrument which can be used for early detection of cancer, HIV, and other pathogens,” said graduate student Hojeong Yu. “Current laboratory instruments are too expensive and unwieldy to be used for home-healthcare applications and point-of-care diagnostics, and we expect the smartphone biodetection instrument to be a much more portable solution for biodetection needs in the field, functioning like a medical tricorder in the Star Trek universe.”
Cunningham got the idea after seeing how diverse and capable mobile apps had become, and realizing how powerful a smartphone app could be.
“I remember a few years back I was having lunch with an ECE alumnus who showed me an app his company had developed,” Cunningham said. “It was a game called ‘Augmented Reality Missile,’ where you looked out at the world through the iPhone camera and shot virtual missiles at targets on the screen, creating explosions overlaid on the real world.”
The app was deceptively simple, but it opened Cunningham’s eyes to the possibility of using smartphone cameras for far more than taking pictures. Already working in biodetection, he immediately saw a way to turn a phone’s camera into surprisingly effective biodetection equipment.
Though phones running Lumetta’s software match the accuracy of lab machines, and in some cases exceed it, Cunningham doesn’t necessarily see the technology as supplanting machines in the lab.
“The technology could be better on a phone, but I don’t see the lab machines going away anytime soon,” Cunningham said. “Lab machines have way better throughput; they can go through hundreds of samples a day. The smartphone technology is designed to help people in different kinds of situations, when they want a single or a few samples measured in the field.”
Apart from publishing papers in multiple respected science journals, Cunningham has been giving talks at conferences across the nation and has been contacted by numerous scientists and other attendees interested in his work. The feedback is varied, ranging from researchers interested in pursuing his findings independently to concerned parents wondering whether they could use the technology to scan their kids’ food for allergens at restaurants, but it has been overwhelmingly positive, according to Cunningham.
“People are becoming aware of the potential new uses for mobile devices,” he said. “I am sure that we will see an explosion of health, food, and environmental measurement applications become available commercially.”