In a recent video with Paul Heney, we touched on touchless gesture controls — haptic technology that lets machine builders and other engineers incorporate mid-air controls enhanced by tactile feedback. The hardware and software can be embedded into machine builds and consumer designs large and small. Our sister publication EE World covered the technology late last year (with a detailed description of the technology from Vicky Messer, electrical and system engineer and technical spokesperson for Ultrahaptics).
In short, haptic feedback is the sense of touch that devices send back to a user, often as vibrations or taps, for example. Haptic feedback is common in tablets and smartphones … But these ubiquitous forms of haptics all rely on direct physical contact from the user. Now there’s a new breed of touchless haptic feedback that sends signals through the air for the user to feel. One iteration currently leading the emerging industry is based on technology developed by University of Bristol researchers Tom Carter and Ben Long of the startup Ultrahaptics. Their technology uses ultrasound to stimulate mechanoreceptors in the human body.
How touchless gesture controls work
Ultrahaptics haptic feedback uses acoustic radiation force, which is force generated when ultrasound reflects off the skin. When the ultrasound is focused onto the surface of the skin, it induces a shear wave in skin tissue. Displacement from the wave triggers mechanoreceptors within the skin — and that induces a haptic sensation.
It’s very safe, as most of the ultrasound energy reflects off the skin (because human skin is so much denser than air). Ultrahaptics technology uses an array of ultrasonic transducers and signal-processing algorithms to create focal points of ultrasonic energy on the user’s palms or fingers.
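The focusing described above works the way any phased array does: each transducer fires with a slight time offset so that all the wavefronts arrive at the chosen focal point at the same instant. The sketch below is a minimal illustration of that idea, not Ultrahaptics’ actual algorithm; the grid size, pitch, and function names are assumptions chosen for the example.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature


def transducer_grid(n, pitch):
    """Positions (x, y, 0) of an n x n transducer grid centered at the origin.

    Grid size and pitch are illustrative assumptions, not vendor specs.
    """
    offset = (n - 1) * pitch / 2.0
    return [(ix * pitch - offset, iy * pitch - offset, 0.0)
            for ix in range(n) for iy in range(n)]


def focus_delays(positions, focal_point):
    """Per-transducer firing delays (seconds) to focus at focal_point.

    Elements farther from the focal point must fire earlier, so the
    farthest element gets zero delay and nearer ones are held back
    until every wavefront arrives at the focal point simultaneously.
    """
    dists = [math.dist(p, focal_point) for p in positions]
    d_max = max(dists)
    return [(d_max - d) / SPEED_OF_SOUND for d in dists]


# Focus 20 cm above the center of a hypothetical 16 x 16 array
elements = transducer_grid(16, 0.0103)
delays = focus_delays(elements, (0.0, 0.0, 0.20))
```

Steering the focal point is then just a matter of recomputing the delay set for new coordinates, which is what lets the sensation track a moving hand.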
[bctt tweet=”#touchless haptic controls — How they work • @ultrahaptics” username=”DW_LisaEitel”]
The carrier frequency is about 40 kHz, because transducers that output this high frequency are readily available. The carrier is then modulated at a much lower frequency, somewhere between 1 and 500 Hz (though usually between 100 and 300 Hz for most applications), which is the range at which tactile receptors in the human hand have peak sensitivity.
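The relationship between the inaudible 40 kHz carrier and the feelable low-frequency envelope can be sketched with simple amplitude modulation. This is an illustrative signal model only (the sample rate and 200 Hz choice are assumptions), not the firm’s actual drive waveform.

```python
import math

CARRIER_HZ = 40_000.0    # ultrasonic carrier, per the article
MODULATION_HZ = 200.0    # inside the 100 to 300 Hz band of peak tactile sensitivity
SAMPLE_RATE = 1_000_000  # 1 MHz sampling, chosen for illustration


def modulated_sample(t):
    """Amplitude-modulate the 40 kHz carrier at the tactile frequency.

    The skin's mechanoreceptors respond to the slow 200 Hz envelope,
    not to the 40 kHz carrier itself.
    """
    envelope = 0.5 * (1.0 + math.sin(2 * math.pi * MODULATION_HZ * t))
    return envelope * math.sin(2 * math.pi * CARRIER_HZ * t)


# 10 ms of the modulated signal
signal = [modulated_sample(n / SAMPLE_RATE) for n in range(SAMPLE_RATE // 100)]
```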
The system is designed for hand-based haptics because the human palm and fingers are most sensitive to the modulated acoustic fields. Modulation frequency is one of the parameters adjustable through the Ultrahaptics application programming interface (API) to create different sensations. Another important parameter is the placement of the focal points in space, which is specified as 3D coordinates in the API as well.
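To make those two parameters concrete, here is a small stand-in for the kind of object such an API might expose: a focal point placed by 3D coordinates plus a modulation frequency. The class and function names are hypothetical and do not reflect the actual Ultrahaptics API; only the parameter ranges come from the article.

```python
from dataclasses import dataclass


@dataclass
class ControlPoint:
    """Hypothetical stand-in for an API control point (names are
    illustrative, not the real Ultrahaptics API)."""
    x: float              # meters, relative to the array center
    y: float
    z: float              # height above the array
    modulation_hz: float  # 1 to 500 Hz; 100 to 300 Hz feels strongest


def clamp_modulation(point):
    """Keep the modulation frequency inside the usable 1 to 500 Hz range."""
    point.modulation_hz = max(1.0, min(500.0, point.modulation_hz))
    return point


# A focal point 20 cm above the array center, pulsing at 200 Hz
pt = clamp_modulation(ControlPoint(0.0, 0.0, 0.20, modulation_hz=200.0))
```

Sweeping `modulation_hz` or moving the `(x, y, z)` coordinates over time is how different textures and shapes would be built from the same primitive.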
Touchless-haptic setups require a transducer array and logic processor board … as well as firmware installed on the logic board, a camera sensor for hand-position recognition, a software license, and a host processor running the Ultrahaptics and camera-sensor API.
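The pieces listed above cooperate in a simple loop: the camera sensor tracks the hand, and the host processor refocuses the transducer array onto it every frame. A minimal sketch of one such frame, with both callables as hypothetical stand-ins for the camera-sensor and haptics APIs:

```python
def haptics_frame(read_hand_position, emit_focal_point):
    """One frame of a touchless-haptics loop.

    read_hand_position: returns an (x, y, z) tuple from camera-based
    hand tracking, or None when no hand is in view.
    emit_focal_point: drives the transducer array to focus at a point.
    Both are illustrative placeholders, not real API calls.
    """
    hand = read_hand_position()
    if hand is not None:
        emit_focal_point(hand)  # refocus the array onto the palm
        return True
    return False
```

In a real system this loop would run at the camera’s frame rate so the sensation stays locked to the moving hand.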
Industrial and consumer applications abound. Anyplace that could be germy or messy or difficult for working hands to touch (industrial HMIs, stovetops, or ATMs, for example) is a good candidate application. Consider that at the recent Consumer Electronics Show, Ultrahaptics’ touchless haptics were on display at the Bosch infotainment-system booth to showcase new HMIs for automotive interiors. Here touchless haptics can boost driver safety … so for example, a driver can quickly make a wave or other motion to control in-car audio, navigation, or communications, then get that second hand right back on the steering wheel without taking eyes off the road.