“REVEL: tactile feedback technology for augmented reality” by Bau and Poupyrev
Title:
- REVEL: tactile feedback technology for augmented reality
Presenter(s)/Author(s):
- Olivier Bau, Ivan Poupyrev
Abstract:
REVEL is an augmented reality (AR) tactile technology that allows the tactile feel of real objects to be changed by augmenting them with virtual tactile textures using a device worn by the user. Unlike previous attempts to enhance AR environments with haptics, we neither physically actuate objects nor use force- or tactile-feedback devices, nor do we require users to wear tactile gloves or other apparatus on their hands. Instead, we employ the principle of reverse electrovibration: we inject a weak electrical signal anywhere on the user's body, creating an oscillating electrical field around the user's fingers. When sliding his or her fingers over the surface of an object, the user perceives highly distinctive tactile textures that augment the physical object. By tracking the objects and the location of touch, we associate dynamic tactile sensations with the interaction context. REVEL builds upon our previous work on designing electrovibration-based tactile feedback for touch surfaces [Bau et al. 2010]. In this paper we expand tactile interfaces based on electrovibration beyond touch surfaces and bring them into the real world. We demonstrate a broad range of application scenarios in which our technology can be used to enhance AR interaction with dynamic and unobtrusive tactile feedback.
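To make the interaction loop concrete, the sketch below shows one way a REVEL-style system might map tracked touch context to an electrovibration drive signal. It is only an illustration under stated assumptions, not the authors' implementation: the texture model (a sinusoidal virtual grating whose temporal frequency is sliding speed divided by spatial period), the TEXTURE_MAP lookup, and all names and parameter values are hypothetical. The paper itself only states that a weak oscillating signal is injected into the user's body and that tactile sensations are associated with the tracked object and touch location.

```python
"""Minimal sketch (assumptions, not the authors' implementation) of mapping
tracked touch context to an electrovibration drive signal in a REVEL-style
AR system."""

import math
from dataclasses import dataclass


@dataclass
class VirtualTexture:
    """Hypothetical virtual texture: a spatial grating laid over the object surface."""
    spatial_period_m: float  # distance between ridges of the virtual grating, in meters
    amplitude: float         # normalized drive amplitude, 0..1


# Hypothetical mapping from tracked object IDs to the textures that augment them.
TEXTURE_MAP = {
    "teapot": VirtualTexture(spatial_period_m=0.002, amplitude=0.8),
    "wall":   VirtualTexture(spatial_period_m=0.010, amplitude=0.5),
}


def drive_frequency_hz(texture: VirtualTexture, finger_speed_m_s: float) -> float:
    """Temporal frequency of the oscillating signal: sliding speed divided by the
    spatial period of the virtual grating (an assumed, standard texture model)."""
    return finger_speed_m_s / texture.spatial_period_m


def drive_sample(texture: VirtualTexture, finger_speed_m_s: float, t: float) -> float:
    """One sample of the normalized drive waveform at time t (seconds)."""
    f = drive_frequency_hz(texture, finger_speed_m_s)
    return texture.amplitude * math.sin(2.0 * math.pi * f * t)


if __name__ == "__main__":
    # Example: finger sliding at 5 cm/s over the tracked "teapot" object.
    tex = TEXTURE_MAP["teapot"]
    print(f"drive frequency: {drive_frequency_hz(tex, 0.05):.1f} Hz")
    print(f"sample at t=1 ms: {drive_sample(tex, 0.05, 0.001):+.3f}")
```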
References:
1. Amberg, M., Giraud, F., Semail, B., Olivo, P., Casiez, G. and Roussel, N. 2011. STIMTAC: a tactile input device with programmable friction. In Proc. of UIST'11, ACM, 7–8.
2. Azuma, R. 1997. A Survey of Augmented Reality. Presence: Teleoperators and Virtual Environments 6, 355–385.
3. Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S. and Macintyre, B. 2001. Recent Advances in Augmented Reality. IEEE Comput. Graph. Appl. 21, 34–47.
4. Bau, O., Petrevski, U. and Mackay, W. 2009. BubbleWrap: a textile-based electromagnetic haptic display. In Proc. of CHI EA'09, ACM, 3607–3612.
5. Bau, O., Poupyrev, I., Israr, A. and Harrison, C. 2010. TeslaTouch: electrovibration for touch surfaces. In Proc. of UIST'10, ACM, 283–292.
6. Benko, H., Wilson, A., Balakrishnan, R. and Chen, B. 2008. Sphere: multi-touch interactions on a spherical display. In Proc. of UIST'08, ACM, 77–86.
7. Bianchi, G., Knoerlein, B., Szekely, G. and Harders, M. 2006. High precision augmented reality haptics. In Proc. of EuroHaptics'06, 169–178.
8. Burdea, G. C. 1996. Force and Touch Feedback for Virtual Reality. John Wiley & Sons.
9. Carlin, A., Hoffman, H. and Weghorst, S. 1997. Virtual reality and tactile augmentation in the treatment of spider phobia: a case report. Behaviour Research and Therapy 35, 153–159.
10. Fitzmaurice, G., Ishii, H. and Buxton, W. 1995. Bricks: laying the foundations for graspable user interfaces. In Proc. of CHI'95, ACM, 442–449.
11. Grimnes, S. 1983. Dielectric breakdown of human skin in vivo. Medical and Biological Engineering and Computing 21, 379–381.
12. Grimnes, S. 1983. Electrovibration, cutaneous sensation of microampere current. Acta Physiologica Scandinavica 118, 19–25.
13. Harrison, C., Benko, H. and Wilson, A. 2011. OmniTouch: Wearable Multitouch Interaction Everywhere. In Proc. of UIST'11, ACM, 441–450.
14. Huang, K., Starner, T., Do, E., Weinberg, G., Kohlsdorf, D., Ahlrichs, C. and Leibrandt, R. 2010. Mobile music touch: mobile tactile stimulation for passive learning. In Proc. of CHI'10, ACM, 791–800.
15. Israr, A. and Poupyrev, I. 2011. Tactile brush: Drawing on skin with a tactile grid display. In Proc. of CHI'11, ACM, 2019–2028.
16. Iwata, H., Yano, H., Nakaizumi, F. and Kawamura, R. 2001. Project FEELEX: adding haptic surface to graphics. In Proc. of SIGGRAPH'01, ACM, 469–476.
17. Jeon, S. and Choi, S. 2009. Haptic Augmented Reality: Taxonomy and an Example of Stiffness Modulation. Presence: Teleoperators and Virtual Environments 18, 387–408.
18. Kaczmarek, K., Nammi, K., Agarwal, A., Tyler, M., Haase, S. and Beebe, D. 2006. Polarity effect in electrovibration for tactile display. IEEE Transactions on Biomedical Engineering 53, 2047–2054.
19. Kajimoto, H. 2010. Electro-tactile display with real-time impedance feedback. In Proc. of Haptics Symposium, Springer-Verlag, 285–291.
20. Kato, H., Billinghurst, M., Poupyrev, I., Imamoto, K. and Tachibana, K. 2000. Virtual Object Manipulation on a Table-Top AR Environment. In Proc. of International Symposium on Augmented Reality, ACM, 111–119.
21. Knoerlein, B., Szekely, G. and Harders, M. 2007. Visuo-haptic collaborative augmented reality ping-pong. In Proc. of ACET'07, ACM, 91–94.
22. Kron, A. and Schmidt, G. 2003. Multi-Fingered Tactile Feedback from Virtual and Remote Environments. In Proc. of HAPTICS'03, IEEE, 16.
23. Kruijff, E., Schmalstieg, D. and Beckhaus, S. 2006. Using Neuromuscular Electrical Stimulation for Pseudo-Haptic Feedback. In Proc. of VRST'06, ACM, 316–319.
24. Mallinckrodt, E., Hughes, A. and Sleator, W. 1953. Perception by the Skin of Electrically Induced Vibrations. Science 118, 277–278.
25. Matsushita, N. and Rekimoto, J. 1997. HoloWall: designing a finger, hand, body, and object sensitive wall. In Proc. of UIST'97, ACM, 209–210.
26. Microsoft. 2010. Microsoft Surface 2.0.
27. Minsky, M., Ouh-Young, M., Steele, O., Brooks, F. P., Jr. and Behensky, M. 1990. Feeling and seeing: issues in force display. In Proc. of SIGGRAPH'90, 235–241.
28. Niwa, M., Nozaki, T., Maeda, T. and Ando, H. 2010. Fingernail-Mounted Display of Attraction Force and Texture. In Proc. of EuroHaptics'10, Springer-Verlag, 3–8.
29. Nojima, T., Sekiguchi, D., Inami, M. and Tachi, S. 2002. The SmartTool: A system for Augmented Reality of Haptics. In Proc. of VR'02, IEEE, 67–72.
30. Poupyrev, I., Tan, D., et al. 2002. Developing a generic augmented-reality interface. IEEE Computer 35, 44–49.
31. Poupyrev, I. and Maruyama, S. 2003. Tactile interfaces for small touch screens. In Proc. of UIST'03, ACM, 217–220.
32. Poupyrev, I., Nashida, T. and Okabe, M. 2007. Actuation and Tangible User Interfaces: the Vaucanson Duck, Robots, and Shape Displays. In Proc. of TEI'07, ACM, 205–212.
33. Rekimoto, J. 2009. SenseableRays: Opto-Haptic Substitution for Touch-Enhanced Interactive Spaces. In Proc. of CHI EA'09, ACM, 2519–2528.
34. Rekimoto, J. and Saitoh, M. 1999. Augmented surfaces: a spatially continuous work space for hybrid computing environments. In Proc. of CHI'99, ACM, 378–385.
35. Ryu, J. and Kim, G. 2004. Using a Vibro-tactile Display for Enhanced Collision Perception and Presence. In Proc. of VRST'04, ACM, 89–96.
36. Schmalstieg, D., Fuhrmann, A. and Hesina, G. 2000. Bridging multiple user interface dimensions with augmented reality. In Proc. of ISAR'00, IEEE, 20–29.
37. Strong, R. M. and Troxel, D. E. 1970. An electrotactile display. IEEE Transactions on Man-Machine Systems 11, 72–79.
38. Takeuchi, Y. 2010. Gilded gait: reshaping the urban experience with augmented footsteps. In Proc. of UIST'10, ACM, 185–188.
39. Tamaki, E., Miyaki, T. and Rekimoto, J. 2011. PossessedHand: techniques for controlling human hands using electrical muscles stimuli. In Proc. of CHI'11, ACM, 543–552.
40. Tan, H. and Pentland, A. 1997. Tactual displays for wearable computing. In Proc. of ISWC'97, IEEE, 84–89.
41. Tang, H. and Beebe, D. 1998. A microfabricated electrostatic haptic display for persons with visual impairments. IEEE Transactions on Rehabilitation Engineering 6, 241–248.
42. Tsetserukou, D., Sato, K. and Tachi, S. 2010. ExoInterfaces: Novel Exoskeleton Haptic Interfaces for Virtual Reality, Augmented Sport and Rehabilitation. In Proc. of Augmented Human'10, ACM, 1–6.
43. Ullmer, B. and Ishii, H. 1997. The metaDESK: models and prototypes for tangible user interfaces. In Proc. of UIST'97, ACM, 223–232.
44. Vallino, J. and Brown, C. 1999. Haptics in augmented reality. In Proc. of Multimedia Computing and Systems'99, IEEE, 195–200.
45. Webster, J. 1998. Medical Instrumentation: Application and Design. Wiley, 173.
46. Willis, K. D. D., Poupyrev, I., Hudson, S. E. and Mahler, M. 2011. SideBySide: ad-hoc multi-user interaction with handheld projectors. In Proc. of UIST'11, ACM, 431–440.
47. Wilson, A. D. 2010. Using a depth camera as a touch sensor. In Proc. of ITS'10, ACM, 69–72.
48. Wilson, A. D. and Benko, H. 2010. Combining multiple depth cameras and projectors for interactions on, above and between surfaces. In Proc. of UIST'10, ACM, 273–282.
49. Woodward, C., Honkamaa, P., Jäppinen, J. and Pyökkimies, E.-P. 2004. CamBall – augmented virtual table tennis with real rackets. In Proc. of ACET'04, ACM, 275–276.