“Remote Control Experiment with DisplayBowl and 360-Degree Video” by Miyafuji, Toyohara, Sato and Koike

  • ©Shio Miyafuji, Soichiro Toyohara, Toshiki Sato, and Hideki Koike


Entry Number: 62

Title:

    Remote Control Experiment with DisplayBowl and 360-Degree Video

Presenter(s)/Author(s):

    Shio Miyafuji, Soichiro Toyohara, Toshiki Sato, and Hideki Koike
Abstract:


    DisplayBowl is a bowl-shaped hemispherical display that shows omnidirectional images together with direction data. It offers users a novel way of observing 360-degree video streams and improves awareness of the surroundings when operating a remote-controlled vehicle, compared with conventional flat displays and head-mounted displays (HMDs). In this paper, we present a user study in which participants controlled a remote drone using omnidirectional video streaming, comparing the characteristics and advantages of three displays: a flat panel display, a head-mounted display, and DisplayBowl.

