“Walk a Robot Dog in VR!” by Rewkowski and Lin

  • ©Nicholas Rewkowski and Ming C. Lin


Entry Number: 07


Title:

    Walk a Robot Dog in VR!

Program Title:

    Mixing XR and Reality



Description:

    Realistic locomotion in a virtual environment (VE) can help maximize immersion and reduce simulator sickness. Redirected walking (RDW) allows a user to physically walk through a large VE by rotating the VE as a function of head rotation, so that the user's physical path curves into an arc that fits within the tracking area. This technique requires significant head rotation, so a "distractor" is often needed to induce that rotation in commercial-scale tracking spaces. Previous implementations suddenly spawned a distractor (e.g., a butterfly) when the user walked near the safety boundary, with limitations such as the user triggering redirection accidentally by looking around, the distractor going unnoticed, or the user getting "stuck" in a corner. We explore a persistent robot distractor tethered to the user that provides two-way haptic feedback and natural motion constraints. We design a dynamic robot AI that adapts to randomness in the user's behavior, as well as to trajectory changes caused by tugging on its leash. The robot tries to imperceptibly keep the user safe by replicating a real dog's behaviors, such as barking or sniffing at something. We hypothesize that the naturalness of the dog's behavior, its responses to the user, and the haptic tethering will work together to allow the user to explore the entire virtual city, ideally without noticing that the dog is a robot.
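The core RDW idea above (rotating the VE as a function of head rotation so the physical path bends into an arc) can be sketched as a per-frame rotation gain. This is a minimal illustration, not the authors' implementation; the gain value and function name are assumptions for the example.

```python
import math

def rdw_scene_rotation(prev_yaw, curr_yaw, rotation_gain=1.3):
    """Return the extra rotation (radians) to apply to the VE this frame,
    given the change in the user's head yaw.

    With a gain > 1, each physical head turn maps to a slightly larger
    virtual turn; the user unconsciously compensates, so their physical
    trajectory curves into an arc that stays inside the tracking area.
    The gain of 1.3 is an illustrative assumption, not a value from
    this work.
    """
    head_delta = curr_yaw - prev_yaw
    # The VE receives the surplus rotation beyond the real head motion.
    return (rotation_gain - 1.0) * head_delta

# A 10-degree physical turn yields 3 extra degrees of VE rotation.
extra = rdw_scene_rotation(0.0, math.radians(10.0))
```

A distractor such as the robot dog matters because this gain only accumulates while the user is rotating their head; prompting natural head turns (e.g., the dog barking off to one side) gives the system rotation to redirect.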

Additional Images:

