
“Haptylus: Haptic Stylus for Interaction with Virtual Objects Behind a Touch Screen”


Conference:


Experience Type(s):


Title:


    Haptylus: Haptic Stylus for Interaction with Virtual Objects Behind a Touch Screen

Description:


    Tablet PCs and smartphones have rapidly become popular. Users can touch objects shown on the touch-panel display of a tablet PC or smartphone, but they feel only the sensation of touching the surface of the display. Recently, systems that make a stylus appear to be inserted into the display by using a retractable stylus have been proposed.

    Beyond [Lee and Ishii 2010] is one such system. It consists of a retractable stylus, a tabletop display, an infrared marker, and a camera mounted in the environment. When the retractable stylus is pressed against the tabletop display, a virtual tip of the stylus is rendered. The user's head position is detected with the infrared marker and the camera, and the virtual objects and the stylus tip are rendered correctly for that viewpoint, so the user can interact with a virtual object under the table. However, the stylus does not shrink or extend automatically because it has no actuator such as a motor, so the user cannot feel any haptic sensation from the virtual object. Perceiving force from the virtual object is necessary for more realistic interaction with it. Another limitation is that the system is stationary.

    ImpAct [Withana et al. 2010] is another interaction system that pairs a smartphone with a retractable stylus. Force feedback is represented simply by stopping the shrinkage of the stylus. However, the system provides only rigid force feedback with no tactile sensation from the virtual objects, and it does not take the user's viewpoint into account.
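    The core idea described above can be summarized in a minimal sketch: the length by which the physical stylus has retracted is mapped to a virtual tip rendered behind the screen, and rigid force feedback is produced by refusing further retraction once the virtual tip reaches a virtual object, as the description attributes to ImpAct-style systems. The sketch below is illustrative only, not the authors' implementation; all names (StylusState, virtual_tip_position, clamp_depth_for_contact) are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class StylusState:
        insertion_depth_mm: float  # how far the physical tip has retracted into the body
        contact_x_mm: float        # stylus contact point on the screen plane (x)
        contact_y_mm: float        # stylus contact point on the screen plane (y)

    def virtual_tip_position(state: StylusState):
        """The virtual tip extends 'behind' the screen by the retracted length,
        so its depth below the screen plane equals the insertion depth."""
        return (state.contact_x_mm, state.contact_y_mm, -state.insertion_depth_mm)

    def clamp_depth_for_contact(desired_depth_mm: float, object_top_depth_mm: float) -> float:
        """Rigid force feedback by stopping shrinkage: the stylus may not retract
        past the surface of the virtual object, so the user feels the object
        resisting further insertion."""
        return min(desired_depth_mm, object_top_depth_mm)

    # Example: a virtual cube whose top face lies 12 mm below the screen plane.
    state = StylusState(insertion_depth_mm=15.0, contact_x_mm=40.0, contact_y_mm=25.0)
    state.insertion_depth_mm = clamp_depth_for_contact(state.insertion_depth_mm, object_top_depth_mm=12.0)
    print(virtual_tip_position(state))  # tip stops at the object surface: (40.0, 25.0, -12.0)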

References:


    [1]
    Lee, J., and Ishii, H. 2010. Beyond: Collapsible tools and gestures for computational design. In Proceedings of the 28th International Conference on Human Factors in Computing Systems, Extended Abstracts (CHI EA '10), 3931–3936.

    [2]
    Owaki, T., Nakabo, Y., Namiki, A., Ishii, I., and Ishikawa, M. 1999. Real-time system for virtually touching objects in the real world using modality transformation from images to haptic information. Systems and Computers in Japan 30, 9, 17–24.

    [3]
    Withana, A., Kondo, M., Makino, Y., Kakehi, G., Sugimoto, M., and Inami, M. 2010. ImpAct: Immersive haptic stylus to enable direct touch and manipulation for surface computing. Comput. Entertain. 8, 2 (Dec.), 9:1–9:16.


ACM Digital Library Publication:


Overview Page:



Submit a story:

If you would like to submit a story about this experience or presentation, please contact us: historyarchives@siggraph.org