Design Engineering
Showcase 2021

Computer Vision Powered Tactile Interfaces for VR and AR Applications

Tags
Virtual Reality
Computer Vision
Tactile Interfaces

Project Details

Student
Harvey Upton
Course
Design Engineering MEng
Supervisor
Dr A. Freddie Page
Theme
Masters Project
Links
Harvey Upton (linktr)

VR and AR technology is increasingly being used for productivity, education, social and gaming applications. Currently, users interact with these applications through the same input devices: either a set of handheld controllers or a gesture-based hand-tracking system.

This research develops and tests a new system for creating tactile interfaces for VR and AR using a combination of computer vision and passive hardware. The system allows new input form factors and experiences to be created while reducing the cost of input devices, since the hardware itself is passive.
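The exact computer vision method behind the pose input modules is not detailed here, but fiducial markers are a common way to track passive hardware with a single camera. The sketch below is a minimal illustration of that idea, assuming ArUco markers and the legacy OpenCV contrib aruco API (opencv-contrib-python before 4.7); the calibration values and marker size are illustrative assumptions, not figures from the project.

```python
# Hypothetical sketch: estimating the pose of a passive, marker-based input
# module from one webcam frame. Marker dictionary, intrinsics and marker size
# are assumptions; the project's actual detection pipeline is not documented here.
import cv2
import numpy as np

# Camera intrinsics would come from a one-off calibration of the webcam.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)      # assume negligible lens distortion
MARKER_LENGTH_M = 0.03         # assumed physical marker size (3 cm)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def estimate_module_poses(frame):
    """Return {marker_id: (rvec, tvec)} for every module visible in the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    poses = {}
    if ids is not None:
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, MARKER_LENGTH_M, camera_matrix, dist_coeffs)
        for marker_id, rvec, tvec in zip(ids.flatten(), rvecs, tvecs):
            poses[int(marker_id)] = (rvec.reshape(3), tvec.reshape(3))
    return poses
```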

Three studies were carried out to evaluate the new input system.

The first was a performance study of each input module's accuracy and reliability. The second demonstrated how the system can be combined with existing hand-tracking technology. Finally, to evaluate how the system can enable new interaction experiences, it was applied to three interactive VR applications through the creation of new input devices and Unity virtual environments.

Image showing mixed reality capture of the VR input system in use. One section shows the launchpadXR interface in use in the real world with the virtual world overlaid. The other section shows the ActiveXR interface in use with the virtual environment visible behind the user. The image also shows the computer vision data visualisation on top of the raw input image.

ActiveXR

VR is becoming a popular platform for exercise, with games enabling fun new ways for people to work out. ActiveXR explored how the new input system can be used to create tangible interfaces for new exercise experiences. The interface takes the form of a resistive, deformable handheld ring. Two pose input modules, one on each handle, track the handles and the extent to which the ring is deformed as the user compresses or stretches it. The input system allows the user's interactions to be understood and brought into the virtual domain.
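As a rough illustration of how the two tracked handle poses could be turned into an exercise input, the sketch below derives a signed deformation from the distance between the handles and normalises it for the Unity application. The rest length and mapping are assumed values, not figures from the project.

```python
# Hypothetical sketch: deriving the ActiveXR ring's deformation from the two
# tracked handle positions. REST_LENGTH_M is an assumed value.
import numpy as np

REST_LENGTH_M = 0.25   # assumed handle separation of the undeformed ring

def ring_deformation(handle_a_pos, handle_b_pos):
    """Signed deformation: negative when compressed, positive when stretched."""
    separation = np.linalg.norm(np.asarray(handle_a_pos) - np.asarray(handle_b_pos))
    return separation - REST_LENGTH_M

def exercise_input(handle_a_pos, handle_b_pos):
    """Normalise the deformation to a -1..1 range for the virtual application."""
    deformation = ring_deformation(handle_a_pos, handle_b_pos)
    return float(np.clip(deformation / REST_LENGTH_M, -1.0, 1.0))
```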

LaunchpadXR

LaunchpadXR explored how the input system may be used to create new educational experiences. The tactile interface was created as a tool to help teach space exploration and rocket science in a fun way. Using various magnetically attachable, tactile, modular rocket components, users can design and optimise their own exploration rocket and watch it launch in an immersive virtual simulation. The hardware comprised several rocket components, each of which used a pose input module to track its position and its relation to the other components.
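One way the relation between components could be inferred is by comparing the tracked positions of their pose input modules. The sketch below, with assumed component names and an assumed snap distance, treats two modules as attached when they are close enough together; the project's actual assembly logic is not documented here.

```python
# Hypothetical sketch: inferring which LaunchpadXR rocket components are joined
# from their tracked positions. Names and threshold are illustrative.
import numpy as np

SNAP_DISTANCE_M = 0.02   # assumed threshold for treating two modules as attached

def assembled_pairs(component_positions):
    """Return pairs of component ids whose modules are close enough to be joined."""
    ids = list(component_positions)
    pairs = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            gap = np.linalg.norm(np.asarray(component_positions[a]) -
                                 np.asarray(component_positions[b]))
            if gap < SNAP_DISTANCE_M:
                pairs.append((a, b))
    return pairs

# Example with three tracked modules (positions in metres):
stack = assembled_pairs({
    "capsule": (0.00, 0.300, 0.50),
    "fuel_tank": (0.00, 0.285, 0.50),
    "booster": (0.00, 0.100, 0.50),
})
# -> [("capsule", "fuel_tank")]
```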

Oculus Quest hand tracking was integrated into the LaunchpadXR experience to be used in conjunction with the tactile rocket components. This allowed the user to see a representation of their hands in the virtual world along with the virtual versions of the individual components.

Input System Setup

To capture the experience of using the new interfaces and applications, mixed reality recordings were created while the system was in use. These let viewers see the chroma-keyed real-world interface interactions with the virtual environment overlaid both in front of and behind the user.
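Chroma keying of this kind is typically done by masking out the green-screen colour range so the real-world footage can be composited between virtual foreground and background layers. The sketch below shows the idea with OpenCV using assumed HSV thresholds; in the setup described below, this compositing step was handled within OBS rather than custom code.

```python
# Hypothetical sketch: the chroma-key step of a mixed reality recording,
# removing the green screen from the camera frame. Thresholds are assumptions.
import cv2
import numpy as np

GREEN_LOW = np.array([40, 80, 80])     # assumed lower HSV bound for the screen
GREEN_HIGH = np.array([85, 255, 255])  # assumed upper HSV bound

def key_out_green(frame_bgr):
    """Return (foreground, mask): the camera frame with the green screen removed."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    green_mask = cv2.inRange(hsv, GREEN_LOW, GREEN_HIGH)
    subject_mask = cv2.bitwise_not(green_mask)
    foreground = cv2.bitwise_and(frame_bgr, frame_bgr, mask=subject_mask)
    return foreground, subject_mask
```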

The photo below shows the setup used for testing: (a) Oculus VR headset. (b) Custom interface device. (c) Webcam capture for CV processing. (d) iPhone video recording for MR video. (e) Green screen. (out of shot) Lighting array, PC running input system, Unity application and OBS.