“Above Your Hand: direct and natural interaction with aerial robot” by Miyoshi, Konomura and Hori

  • ©Kensho Miyoshi, Ryo Konomura, and Koichi Hori

Title:


    Above Your Hand: direct and natural interaction with aerial robot

Presenter(s):


Entry Number: 03


Description:


    “Above Your Hand” is a new application of interactive aerial robots. We explore direct and natural interaction with autonomous aerial robots and their applications. The uniqueness of our approach is that it uses no external equipment such as controllers, motion tracking systems, or wireless control systems.

    Our original palm-sized quadcopter (Figure 1, right) flies above your hand while you wear a glove of a particular color. It is capable of following your hand using two onboard cameras (Figure 1, middle): one camera is attached to the quadcopter horizontally and the other vertically. When it does not detect a hand, it keeps hovering at its current location by processing the feature points of images of its environment, without requiring particular landmarks [Konomura et al. 2013, 2014]. Since all of this processing is executed on the onboard Linux-based microcontroller, the robot requires no external computational control at all. To our knowledge, this is the first system to achieve both this small size and full autonomy on the onboard computer. The small size enables closer and more active interaction in indoor environments; however, our robot does not yet perform well in outdoor or windy environments.

    Owing to the simple design of the interaction, it is easy for multiple people to be involved at the same time. For example, you can “pass” the flying robot from hand to hand to another person (Figure 1, left).

    There has been previous work on natural interaction with aerial robots. Ng developed methods to interact with Parrot’s AR.Drone using Microsoft’s Kinect [Ng et al. 2011]. Sanna presented a NUI framework for quadcopter control [Sanna et al. 2013]. Lementec used multiple orientation sensors to classify arm gestures [Lementec et al. 2004]. Since all of these methods require stationary equipment, they naturally limit the area in which the aerial robots can operate. Our approach has much potential to remove this limitation.

References:


    KONOMURA, R., ET AL., Designing Hardware and Software Systems Toward Very Compact and Fully Autonomous Quadrotors, IEEE/ASME International Conference on Advanced Intelligent Mechatronics, 2013.

    KONOMURA, R., ET AL., Visual 3D Self Localization with 8 Gram Circuit Board for Very Compact and Fully Autonomous Unmanned Aerial Vehicles, IEEE International Conference on Robotics and Automation, May 31–June 5, 2014 (to appear).

    NG, W., ET AL., Collocated Interaction with Flying Robots, IEEE RO-MAN, 2011.

    SANNA, A., ET AL., A Kinect-Based Natural Interface for Quadrotor Control, Intelligent Technologies for Interactive Entertainment, Springer Berlin Heidelberg, 2012.

    LEMENTEC, J.-C., ET AL., Recognition of Arm Gestures Using Multiple Orientation Sensors: Gesture Classification, The 7th International IEEE Conference on Intelligent Transportation Systems, 2004.


Additional Images:

