“Wearable Haptics and Hand Tracking via an RGB-D Camera for Immersive Tactile Experiences” by Meli, Scheggi, Pacchierotti and Prattichizzo

  • ©Leonardo Meli, Stefano Scheggi, Claudio Pacchierotti, and Domenico Prattichizzo

Entry Number: 109

Title:

    Wearable Haptics and Hand Tracking via an RGB-D Camera for Immersive Tactile Experiences

Presenter(s)/Author(s):

    Leonardo Meli, Stefano Scheggi, Claudio Pacchierotti, and Domenico Prattichizzo

Abstract:


    In 1997 Sony revolutionized the gaming industry by introducing simple but effective vibrotactile feedback in its DualShock controller for the PlayStation. By 2013, more than 400 million units had been sold. Nowadays, the Wii Remote motion controller provides a similar feature, but wirelessly, and can be considered the most popular portable haptic interface, with over 100 million units sold. However, its force feedback is still limited to vibrations, which precludes simulating any rich contact interaction with virtual and remote environments. Toward a more realistic feeling of interacting with virtual objects, researchers have focused on glove-type haptic displays such as the Rutgers Master II and the CyberGrasp, which provide force sensations to all the fingers of the hand simultaneously. However, although they provide compelling force feedback, these displays are complex and very expensive; the CyberGrasp, for instance, costs more than 60,000 US dollars.
    Thus, it becomes crucial to find a trade-off between a realistic feeling of touch and the cost and portability of the system. In this regard, we find tactile technologies very promising. Tactile devices are haptic interfaces able to provide tactile force feedback only (they do not provide any kind of kinesthetic force). This property makes it possible to dramatically simplify their form factor while still providing a compelling and realistic feeling of touching virtual objects [Pacchierotti et al. 2014].
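    The cutaneous-only rendering idea above can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it assumes a simple spring contact model, an illustrative stiffness value, and a hypothetical fingertip position obtained from hand tracking. Because only a local skin-contact force is computed (no grounded kinesthetic component), the actuator can be small enough to wear on the fingertip.

    ```python
    import numpy as np

    # Illustrative assumptions (not from the paper):
    K_STIFFNESS = 300.0  # virtual surface stiffness in N/m
    MAX_FORCE = 5.0      # wearable actuator saturation in N

    def cutaneous_force(fingertip_pos, surface_height=0.0):
        """Spring-model normal force delivered to the skin only.

        fingertip_pos: 3D position from hand tracking (meters); z is up.
        Penetration below the virtual surface yields a contact force;
        no kinesthetic (grounded) force is ever commanded.
        """
        penetration = surface_height - fingertip_pos[2]
        if penetration <= 0.0:
            return 0.0  # finger above the surface: no contact cue
        return min(K_STIFFNESS * penetration, MAX_FORCE)

    # Example: fingertip 5 mm below the virtual surface
    f = cutaneous_force(np.array([0.1, 0.0, -0.005]))
    print(f"commanded normal force: {f:.2f} N")  # 300 * 0.005 = 1.5 N
    ```

    A real system would replace the hard-coded surface with the tracked virtual scene and map the computed force to the device's actuator commands.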

References:


    1. Oikonomidis, I., Kyriazis, N., and Argyros, A. 2011. Efficient model-based 3d tracking of hand articulations using kinect. In Proc. 22nd BMVC.
    2. Pacchierotti, C., Tirmizi, A., and Prattichizzo, D. 2014. Improving transparency in teleoperation by means of cutaneous tactile force feedback. ACM Trans. Appl. Percept. 11, 1, 4:1–4:16.
    3. Šarić, M. 2011. Libhand: A library for hand articulation. Version 0.9.

Acknowledgements:


    The research has received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under project “WEARHAP” (grant agreement 601165).

