“Dual Robot Avatar: Real-time Multispace Experience using Telepresence Robots and Walk Sensation Feedback including Viewpoint Sharing for Immersive Virtual Tours” by Kikuchi, Ojima, Kato, Unno, Yem, et al. …
Conference:
- SIGGRAPH 2022
Entry Number: 03
Title:
- Dual Robot Avatar: Real-time Multispace Experience using Telepresence Robots and Walk Sensation Feedback including Viewpoint Sharing for Immersive Virtual Tours
Presenter(s):
- Kikuchi, Ojima, Kato, Unno, Yem, et al.
Description:
Traveling to multiple places at the same time is a dream for many people, but our physical limits make this aspiration difficult to realize. Virtual reality technologies can help alleviate these limits. To the best of the authors’ knowledge, no previous study has attempted to operate multiple telepresence robots in remote places simultaneously while presenting walking-sensation feedback to the operator for an immersive multispace experience. In this study, we used two autonomous mobile robots, a dog-type and a wheel-type, whose direction of movement can be controlled by an operator (Fig. 1). The operator can choose, and at any time re-choose, the space (robot) to attend to, and can move the viewpoint using a head-mounted display (HMD) controller. Live video at 4K resolution is transmitted to the HMD over a Web Real-Time Communication (WebRTC) network from a 360° camera mounted on top of each robot. The operator perceives viewpoint-movement feedback as a visual cue, and as a vestibular sensation via waist motion and proprioception in the legs. Our system also supports viewpoint sharing, in which fifty users can enjoy omnidirectional viewing of the remote environments through HMDs, without the walking-sensation feedback.
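To make the stream-switching idea concrete, below is a minimal operator-side sketch using the standard browser WebRTC API. It assumes one peer connection per robot and keeps both connections live so the operator can re-choose instantly; the robot IDs and the signaling helper `exchangeWithRobot` are hypothetical stand-ins, since the entry does not describe the actual transport or control protocol.

```typescript
// Sketch: receive each robot's 360° camera stream over WebRTC and route the
// chosen one to the HMD view. Not the authors' implementation.

type RobotId = "dog" | "wheel";

const connections = new Map<RobotId, RTCPeerConnection>();
const streams = new Map<RobotId, MediaStream>();

// Hypothetical signaling round-trip with a robot's WebRTC endpoint.
declare function exchangeWithRobot(
  id: RobotId,
  offer: RTCSessionDescription
): Promise<RTCSessionDescriptionInit>;

async function connectRobot(id: RobotId): Promise<void> {
  const pc = new RTCPeerConnection();
  pc.addTransceiver("video", { direction: "recvonly" }); // receive-only video
  pc.ontrack = (ev) => streams.set(id, ev.streams[0]);   // remember the stream
  await pc.setLocalDescription(await pc.createOffer());
  const answer = await exchangeWithRobot(id, pc.localDescription!);
  await pc.setRemoteDescription(answer);
  connections.set(id, pc);
}

// Attend to one space: show that robot's stream; the other connection stays
// open so switching back is immediate.
function attendTo(id: RobotId, hmdView: HTMLVideoElement): void {
  const stream = streams.get(id);
  if (stream) hmdView.srcObject = stream;
}
```

In a setup like this, keeping both peer connections open trades a little bandwidth for instant switching between the two spaces, which matches the "choose/re-choose" interaction the entry describes.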
References:
- Lucas Bruck, Bruce Haycock, and Ali Emadi. 2021. A Review of Driving Simulation Technology and Applications. IEEE Open Journal of Vehicular Technology 2 (2021), 1–16. https://doi.org/10.1109/OJVT.2020.3036582
- Markku Suomalainen, Basak Sakcak, Adhi Widagdo, Juho Kalliokoski, Katherine J. Mimnaugh, Alexis P. Chambers, Timo Ojala, and Steven M. LaValle. 2022. Unwinding Rotations Improves User Comfort with Immersive Telepresence Robots. CoRR abs/2201.02392 (2022). arXiv:2201.02392 https://arxiv.org/abs/2201.02392
- Susumu Tachi. 2016. Telexistence: Enabling Humans to Be Virtually Ubiquitous. IEEE Computer Graphics and Applications 36, 1 (2016), 8–14. https://doi.org/10.1109/MCG.2016.6
- Minori Unno, Ken Yamaoka, Vibol Yem, Tomohiro Amemiya, Michiteru Kitazaki, and Yasushi Ikei. 2021. Novel Motion Display for Virtual Walking. 482–492. https://doi.org/10.1007/978-3-030-78361-7_37
- Vibol Yem, Reon Nashiki, Tsubasa Morita, Fumiya Miyashita, Tomohiro Amemiya, and Yasushi Ikei. 2019. TwinCam Go: Proposal of Vehicle-Ride Sensation Sharing with Stereoscopic 3D Visual Perception and Vibro-Vestibular Feedback for Immersive Remote Collaboration. In SIGGRAPH Asia 2019 Emerging Technologies (Brisbane, QLD, Australia) (SA ’19). Association for Computing Machinery, New York, NY, USA, 53–54. https://doi.org/10.1145/3355049.3360540