Wednesday, 2 April 2025

Feeling the future: New wearable tech simulates realistic touch

When it comes to haptic feedback, most technologies are limited to simple vibrations. But our skin is loaded with tiny sensors that detect pressure, vibration, stretching and more.

Now, Northwestern University engineers have unveiled a new technology that creates precise movements to mimic these complex sensations.

The study was published on March 28 in the journal Science.

Sitting directly on the skin, the compact, lightweight, wireless device applies force in any direction to generate a variety of sensations, including vibrations, stretching, pressure, sliding and twisting. It can also combine sensations and operate quickly or slowly to simulate a more nuanced, realistic sense of touch.
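
The article does not give implementation details, but one way to picture "combining sensations" and operating "quickly or slowly" is as a time-varying 3D force command: a slow pressing component into the skin layered with a faster sideways vibration and a gentle twist. The sketch below is a minimal illustration in Python; the function name, axes, frequencies and amplitudes are all assumptions, not parameters of the published device.

```python
import numpy as np

def compose_force_waveform(duration_s=1.0, sample_rate_hz=1000):
    """Illustrative only: layer a slow normal 'press' with a faster tangential
    'vibration' and a slow 'twist'. All names and values are hypothetical,
    not taken from the Northwestern device."""
    t = np.linspace(0.0, duration_s, int(duration_s * sample_rate_hz), endpoint=False)

    # Slow press-and-release into the skin (normal, z-axis), arbitrary units
    press_z = 0.5 * (1.0 - np.cos(2 * np.pi * 1.0 * t))   # 1 Hz

    # Faster lateral vibration stretching the skin sideways (shear, x-axis)
    vibrate_x = 0.2 * np.sin(2 * np.pi * 80.0 * t)        # 80 Hz

    # Gentle back-and-forth shear along the y-axis, standing in for a twist
    twist_y = 0.1 * np.sin(2 * np.pi * 2.0 * t)           # 2 Hz

    # Each row is one (fx, fy, fz) force sample to stream to an actuator
    return np.stack([vibrate_x, twist_y, press_z], axis=1)

forces = compose_force_waveform()
print(forces.shape)  # (1000, 3): one 3D force vector per millisecond
```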

Powered by a small rechargeable battery, the device connects wirelessly over Bluetooth to virtual reality headsets and smartphones. It is also small and efficient enough to be placed anywhere on the body, combined with other actuators in arrays, or integrated into existing wearable electronics.
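
The article names Bluetooth as the wireless link but says nothing about the data format. Purely to illustrate how little data a battery-powered wearable would need per update, the sketch below packs one force vector per actuator in an array into a compact binary payload using Python's standard struct module; the layout, scaling and actuator count are hypothetical, not the device's actual protocol.

```python
import struct

def pack_array_command(forces_mN, scale=10):
    """Hypothetical payload: encode (fx, fy, fz) per actuator, in millinewtons,
    as scaled signed 16-bit integers. The format is an assumption for
    illustration only."""
    payload = bytearray()
    for fx, fy, fz in forces_mN:
        payload += struct.pack("<hhh", int(fx * scale), int(fy * scale), int(fz * scale))
    return bytes(payload)

# Example: a 2x2 patch of actuators, each given its own force vector
command = pack_array_command([(0.0, 0.0, 12.5), (3.2, -1.1, 8.0),
                              (0.0, 4.0, 0.0), (-2.5, 0.0, 5.0)])
print(len(command))  # 24 bytes for four actuators
```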

The researchers envision that their device could eventually enhance virtual experiences, help individuals with visual impairments navigate their surroundings, reproduce the feel of different textures on flat screens for online shopping, provide tactile feedback during remote health care visits and even enable people with hearing impairments to "feel" music.

"Almost all haptic actuators really just poke at the skin," said Northwestern's John A. Rogers, who led the device design. "But skin is receptive to much more sophisticated senses of touch. We wanted to create a device that could apply forces in any direction -- not just poking but pushing, twisting and sliding. We built a tiny actuator that can push the skin in any direction and in any combination of directions. With it, we can finely control the complex sensation of touch in a fully programmable way."

A pioneer in bioelectronics, Rogers is the Louis A. Simpson and Kimberly Querrey Professor of Materials Science and Engineering, Biomedical Engineering, and Neurological Surgery, with appointments in the McCormick School of Engineering and Northwestern University Feinberg School of Medicine. He also directs the Querrey Simpson Institute for Bioelectronics. Rogers co-led the work with Northwestern's Yonggang Huang, the Jan and Marcia Achenbach Professor in Mechanical Engineering and professor of civil and environmental engineering at McCormick. Northwestern's Kyoung-Ho Ha, Jaeyoung Yoo and Shupeng Li are the study's co-first authors.

The study builds on previous work from Rogers' and Huang's labs, in which they designed a programmable array of miniature vibrating actuators to convey a sense of touch.

The haptic hang-up

In recent years, visual and auditory technologies have experienced explosive growth, delivering unprecedented immersion through devices like high-fidelity surround-sound speakers and fully immersive virtual-reality goggles. Haptic technologies, however, have mostly plateaued. Even state-of-the-art systems offer only buzzing patterns of vibration.

This developmental gap stems largely from the extraordinary complexity of human touch. The sense of touch involves different types of mechanoreceptors (or sensors) -- each with its own sensitivity and response characteristics -- located at varying depths within the skin. When these mechanoreceptors are stimulated, they send signals to the brain, which are translated as touch.

Replicating that sophistication and nuance requires precise control over the type, magnitude and timing of stimuli delivered to the skin. This presents a massive challenge, which current technologies have struggled -- and failed -- to overcome.

"Part of the reason haptic technology lags video and audio in its richness and realism is that the mechanics of skin deformation are complicated," said Northwestern's J. Edward Colgate, a haptics pioneer and study co-author. "Skin can be poked in or stretched sideways. Skin stretching can happen slowly or quickly, and it can happen in complex patterns across a full surface, such as the full palm of the hand."

Actuator unleashed

To simulate that complexity, the Northwestern team developed the first actuator with full freedom of motion (FOM). This means the actuator is not constrained to a single type of movement or limited set of movements. Instead, it can move and apply forces in all directions along the skin. These dynamic forces engage all mechanoreceptors in the skin, both individually and in combination with one another.
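
One way to make "full freedom of motion" concrete: any force vector applied at the skin surface can be split into a normal component (poking or pressing) and an in-plane shear component (sliding or stretching), and, per the article, an FOM actuator can produce both in any combination. The decomposition below is generic vector math for illustration, not code from the study.

```python
import numpy as np

def decompose_skin_force(force, skin_normal=(0.0, 0.0, 1.0)):
    """Split a 3D force into a normal component (press/poke) and an in-plane
    shear component (slide/stretch). Generic vector math shown only to
    illustrate the idea of full freedom of motion; not study code."""
    f = np.asarray(force, dtype=float)
    n = np.asarray(skin_normal, dtype=float)
    n = n / np.linalg.norm(n)

    normal_part = np.dot(f, n) * n   # pressing into (or lifting off) the skin
    shear_part = f - normal_part     # stretching or sliding along the skin
    return normal_part, shear_part

normal, shear = decompose_skin_force((1.0, 2.0, 3.0))
print(normal, shear)  # [0. 0. 3.] and [1. 2. 0.]
```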

"It's a big step toward managing the complexity of the sense of touch," said Colgate, Walter P. Murphy Professor of Mechanical Engineering at McCormick. "The FOM actuator is the first small, compact haptic device that can poke or stretch skin, operate slow or fast, and be used in arrays. As a result, it can be used to produce a remarkable range of tactile sensations."

Source: ScienceDaily
