Tactile Echoes: Multisensory Augmented Reality for the Hand

Abstract

Touch interactions are central to many human activities, but there are few technologies for computationally augmenting free-hand interactions with real environments. Here, we describe Tactile Echoes, a finger-wearable system for augmenting touch interactions with physical objects. The system captures touch-elicited vibrations in real time and processes them through a parametric signal processing network to generate responsive tactile and auditory feedback that enlivens tactile experiences. Just as acoustic echoes are produced through the delayed replication and modification of sounds, so Tactile Echoes are produced through transformations of vibrotactile inputs in the skin. The echoes also reflect the contact interactions and touched objects involved: a transient tap produces discrete echoes, while a continuous slide yields sustained feedback. We also demonstrate computational and spatial tracking methods that allow these effects to be selectively assigned to different objects or actions. A large variety of distinct multisensory effects can be designed via ten processing parameters. We investigated how Tactile Echoes are perceived in several perceptual experiments using multidimensional scaling methods, which allowed us to deduce low-dimensional, semantically grounded perceptual descriptions. We present several virtual and augmented reality applications of Tactile Echoes. In a user study, we found that these effects made interactions more responsive and engaging. Our findings show how to endow a large variety of touch interactions with expressive multisensory effects.
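
The core idea can be sketched as a delay-line process: each captured vibration is replicated, delayed, attenuated, and filtered to form a decaying train of "echoes" that is rendered back to the fingertip. The Python sketch below is only illustrative and is not the paper's ten-parameter processing network; the function names, default parameter values, and the one-pole low-pass filter are assumptions chosen to show the delayed-replication idea.

    import numpy as np

    def one_pole_lowpass(x, fs, fc):
        # Simple one-pole low-pass filter; makes later echoes progressively duller.
        a = np.exp(-2.0 * np.pi * fc / fs)
        y = np.empty_like(x)
        acc = 0.0
        for i, v in enumerate(x):
            acc = (1.0 - a) * v + a * acc
            y[i] = acc
        return y

    def tactile_echo(x, fs, delay_s=0.08, feedback=0.6, lowpass_hz=400.0, n_echoes=4):
        # Return the input vibration plus a train of delayed, attenuated, filtered copies.
        d = int(round(delay_s * fs))            # echo delay in samples
        y = np.zeros(len(x) + n_echoes * d)
        y[:len(x)] += x                         # direct (unprocessed) signal
        echo = np.asarray(x, dtype=float)
        for k in range(1, n_echoes + 1):
            echo = one_pole_lowpass(echo, fs, lowpass_hz) * feedback
            y[k * d : k * d + len(x)] += echo   # each copy arrives one delay later
        return y

    # Example: a synthetic 10 ms "tap" transient at an assumed 8 kHz vibrotactile rate.
    fs = 8000
    t = np.arange(int(0.01 * fs)) / fs
    tap = np.random.randn(len(t)) * np.exp(-t / 0.002)
    out = tactile_echo(tap, fs)                 # waveform to drive a vibrotactile actuator

In this sketch, a short tap transient yields a burst followed by progressively softer, duller repetitions, while a continuous sliding signal produces overlapping echoes that blend into sustained feedback, consistent with the behavior described above.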

Publication
IEEE Transactions on Haptics


Press
https://spectrum.ieee.org/tech-talk/consumer-electronics/gadgets/new-ar-system-alters-sight-sound-and-touch
