A Design Space for Gaze Interaction on Head-Mounted Displays

Teresa Hirzle, Jan Gugenheimer, Florian Geiselhart, Andreas Bulling, Enrico Rukzio

In Proceedings of CHI ’19 (full paper). Paper   Slides   ACM DL

Augmented and virtual reality (AR/VR) head-mounted display (HMD) applications inherently rely on three-dimensional information. In contrast to gaze interaction on a two-dimensional screen, gaze interaction in AR and VR therefore also requires estimating a user’s gaze in 3D (3D Gaze).

While early applications, such as foveated rendering, hint at the compelling potential of combining HMDs and gaze, a systematic analysis of this combination is still missing. To fill this gap, we present the first design space for gaze interaction on HMDs.



Towards a Symbiotic Human-Machine Depth Sensor: Exploring 3D Gaze for Object Reconstruction

Teresa Hirzle, Jan Gugenheimer, Florian Geiselhart, Andreas Bulling, Enrico Rukzio

In Adjunct Proceedings of UIST ’18 (extended abstract). Extended Abstract   Poster   ACM DL

The goal of this project is to explore how much we can learn about the physical objects a user is looking at by observing gaze depth. We envision a symbiotic scenario in which current technology (e.g. depth cameras) is extended with “human sensing data”. Here, a depth camera creates a rough model of a static environment, and gaze depth is merged into the model by leveraging unique properties of human vision.



VRSpinning: Exploring the Design Space of a 1D Rotation Platform to Increase the Perception of Self-Motion in VR

Michael Rietzler, Teresa Hirzle, Jan Gugenheimer, Julian Frommel, Thomas Dreja, Enrico Rukzio

In Proceedings of DIS ’18 (full paper). Paper   Video   ACM DL

Current approaches for locomotion in virtual reality either create a visual-vestibular conflict, which is assumed to cause simulator sickness, or use metaphors such as teleportation to travel longer distances, which lack the perception of self-motion. We propose VRSpinning, a seated locomotion approach based on stimulating the user’s vestibular system with a rotational impulse to induce the perception of linear self-motion. In a first study we explored oscillating the chair at different frequencies during visual forward motion and collected user preferences on these feedback types. In a second user study we used short bursts of rotational acceleration to match the visual forward acceleration. We found that this rotational stimulus significantly reduced simulator sickness and increased the perception of self-motion compared to no physical motion.



Rethinking Redirected Walking: On the Use of Curvature Gains Beyond Perceptual Limitations and Revisiting Bending Gains

Michael Rietzler, Jan Gugenheimer, Teresa Hirzle, Martin Deubzer, Eike Langbehn, Enrico Rukzio

In Proceedings of ISMAR ’18 (full paper). Paper   IEEE Xplore DL

Redirected walking (RDW) allows virtual reality (VR) users to walk infinitely while staying inside a finite physical space through subtle shifts (gains) of the scene that redirect them back inside the volume. All prior approaches measure the feasibility of RDW techniques based on whether the user perceives the manipulation, leading to rather small applicable gains. We instead treat RDW as an interaction technique and therefore apply visually perceivable gains rather than limiting gains to the threshold of perception. We revisited prior experiments with a focus on applied gains and additionally tested higher gains for applicability in a user study. We found that users accept curvature gains of up to 20°/m, which reduces the necessary physical volume to approximately 6 × 6 m for walking infinitely straight ahead in VR. Our findings strive to rethink the usage of redirection from being unperceived to being applicable and natural.
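The relation between curvature gain and required space admits a quick sanity check. The following is a back-of-envelope sketch (our own, not from the paper), assuming a gain of 20°/m means the scene is redirected by 20° per meter walked, so the user's real-world path is a circle:

```python
import math

def walking_circle_diameter(curvature_gain_deg_per_m: float) -> float:
    """Diameter (in meters) of the real-world circle a user walks on
    when a given curvature gain (degrees per meter) is applied."""
    # Completing one full real-world circle accumulates 360 degrees of
    # redirection, so the circumference is 360 / gain meters.
    circumference = 360.0 / curvature_gain_deg_per_m
    return circumference / math.pi

# A 20 degrees/m gain yields a circle just under 6 m across,
# consistent with the ~6 x 6 m tracking space reported above.
print(round(walking_circle_diameter(20.0), 2))
```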



WatchVR: Exploring the Usage of a Smartwatch for Interaction in Mobile Virtual Reality

Teresa Hirzle, Jan Ole Rixen, Jan Gugenheimer, Enrico Rukzio

In Adjunct Proceedings of CHI ’18 (extended abstract). Extended Abstract   ACM DL

Mobile virtual reality (VR) head-mounted displays (HMDs) are steadily becoming part of people’s everyday lives. Most current interaction approaches either rely on additional hardware (e.g. the Daydream controller) or offer only a limited interaction concept (e.g. Google Cardboard). We explore a solution in which a conventional smartwatch, a device users already carry with them, enables both short interactions and longer, more complex interactions with mobile VR. To explore the possibilities of a smartwatch for interaction, we conducted a user study comparing two variables with regard to user performance: interaction method (touchscreen vs. inertial sensors) and wearing method (hand-held vs. wrist-worn). We found that selection time and error rate were lowest when holding the smartwatch in one hand and using its inertial sensors for interaction (hand-held).