Text entry and editing have long been cumbersome for visually impaired users. Current solutions for eyes-free text entry place high cognitive and motor demands on users. Mainstream touchscreen-based solutions usually involve both hands, making them difficult to use for visually impaired users who hold a cane in one hand.
Moception is a wrist-worn wearable that supports gesture recognition and provides audio/haptic feedback.
Even with our eyes closed, we can still tell the positions of our body parts confidently. Taking advantage of this proprioception, Moception maps the spatial layout of a line of text onto the spherical space under the palm, which improves visually impaired users' awareness and perception of the text layout. By rotating their wrist, users can intuitively navigate through a line of text.
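The core of this interaction can be pictured as a mapping from wrist rotation angle to a cursor position in the text. The sketch below is purely illustrative, not Moception's actual implementation: the function name, the assumed ±60° comfortable rotation range, and the linear mapping are all assumptions.

```python
# Illustrative sketch: linearly map a wrist rotation angle to a character
# index in a line of text. The +/-60 degree range is an assumed comfortable
# wrist rotation span, not a value reported by the Moception system.
def angle_to_cursor(angle_deg: float, text: str,
                    min_angle: float = -60.0, max_angle: float = 60.0) -> int:
    # Clamp the angle so over-rotation pins the cursor to either end.
    clamped = max(min_angle, min(max_angle, angle_deg))
    # Normalize to [0, 1], then scale to a valid character index.
    fraction = (clamped - min_angle) / (max_angle - min_angle)
    return min(len(text) - 1, int(fraction * len(text)))

line = "hello world"
print(angle_to_cursor(-60.0, line))  # fully left  -> index 0
print(angle_to_cursor(60.0, line))   # fully right -> index 10 (last char)
```

With such a mapping, each wrist posture corresponds to a stable position in the line, so audio/haptic feedback at the current index can exploit the user's sense of where their hand is.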
Gestures for discrete commands:
Low-key and intuitive gestures, such as swiping and drawing a line, were elicited from 10 visually impaired participants in a gesture study, following the participatory design paradigm proposed by Wobbrock et al. Each gesture was mapped to a basic command of the text entry system.
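Once elicited, such a gesture set reduces to a simple dispatch table from recognized gestures to editing commands. The sketch below is hypothetical: the gesture names and commands are illustrative placeholders, not the set the participants actually defined.

```python
# Hypothetical gesture-to-command dispatch table; the entries are
# illustrative, not the gesture set elicited in the Moception study.
COMMANDS = {
    "swipe_left": "delete_previous_character",
    "swipe_right": "confirm_candidate",
    "draw_line": "select_word",
}

def dispatch(gesture: str) -> str:
    # Unrecognized gestures fall through to a no-op, so the system can
    # respond with "gesture not recognized" feedback instead of failing.
    return COMMANDS.get(gesture, "no_op")

print(dispatch("draw_line"))  # -> select_word
print(dispatch("wiggle"))     # -> no_op
```

Keeping the mapping in a flat table makes it easy to swap in a different elicited gesture set without touching the recognizer or the command logic.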
The final working prototype was built in Unity with a Leap Motion Controller for gesture recognition and tested with 4 visually impaired participants. The average completion time of the "enter-review-edit" task was reduced by 53.2% compared to the current speech-based input method on an iPhone. Participants also perceived Moception as less effortful and more intuitive to use.