“Twinkle: Interface for Using Handheld Projectors to Interact with Physical Surfaces” by Yoshida, Nii, Kawakami and Tachi

  • ©Takumi Yoshida, Hideaki Nii, Naoki Kawakami, and Susumu Tachi

Title:


    Twinkle: Interface for Using Handheld Projectors to Interact with Physical Surfaces

Presenter(s):


Entry Number: 23


Description:


    Recently, many small pocket-size projectors have been developed, and it is expected that such projectors will soon be installed in portable devices. Meanwhile, intuitive interfaces that operate according to the user’s motion have become popular. Consequently, interfaces that use handheld projectors to access information have been increasingly studied [Forlines et al. 2005][Cao et al. 2007]. However, these interfaces suffer from a number of problems. Some systems require motion-tracking equipment to measure the position of the projector. Further, the surfaces onto which images can be projected are limited to plain screens such as a white wall.

    The purpose of our study is to propose a novel interface, which we call “Twinkle”, for interacting with an arbitrary physical surface; here, we define interaction as the utilization of a physical surface to perform certain tasks. We define a physical surface as a surface that exists in the physical environment and is not plain. Examples of such surfaces are a poster on a wall, figures or characters on a whiteboard, and a desk on which objects are placed. When a user shines light from a handheld projector onto a physical surface, as if using a flashlight, pictures are displayed and sounds are emitted according to the objects present on the surface and the user’s motion. Figure 1 (left) shows the concept behind Twinkle.

    Our method enables various applications; a few examples are mentioned below. First, we propose an interface for music composition and musical performance. The pitch of a sound is determined by the size of the object illuminated by the projector. The color of the object and the user’s motion determine the tone and the volume, respectively, of the sound. The user can create melodies and rhythms by laying out objects on a surface. This interface enables users to compose and play music intuitively, i.e., they can compose and play music even without knowledge of musical notation. Next, we propose an AR annotation system: the system recognizes figures or characters on a surface, and information is presented near those objects. Additionally, the proposed interface can be used in shooting games or action games, in which real objects on a surface are regarded as obstacles.
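    The size-to-pitch, color-to-tone, and motion-to-volume mapping described above could be sketched as follows. This is a minimal illustrative sketch, not the authors’ implementation: the function names, the MIDI note range, the color palette, and all thresholds are assumptions made for the example.

    ```python
    # Hypothetical sketch of the music-performance mapping: object size -> pitch,
    # object color -> tone (instrument), projector motion speed -> volume.
    # All names, ranges, and mappings below are illustrative assumptions.

    def size_to_pitch(area_px: float,
                      min_area: float = 100.0,
                      max_area: float = 10000.0) -> int:
        """Map the illuminated object's area to a MIDI note.

        Larger objects produce lower pitches; the range spans two
        octaves (MIDI notes 48..72), chosen arbitrarily for this sketch.
        """
        area = max(min_area, min(max_area, area_px))   # clamp to valid range
        t = (area - min_area) / (max_area - min_area)  # normalize to [0, 1]
        return round(72 - t * 24)

    # Assumed palette: dominant color class -> instrument tone.
    COLOR_TO_TONE = {
        "red": "brass",
        "green": "strings",
        "blue": "piano",
    }

    def motion_to_volume(speed_px_per_s: float, max_speed: float = 500.0) -> float:
        """Faster projector motion yields louder sound, normalized to [0, 1]."""
        return min(speed_px_per_s, max_speed) / max_speed
    ```

    With these mappings, a small red object swept quickly would, for instance, yield a high brass note at high volume; laying out several objects in a row then defines a melody played as the projector beam passes over them.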


Other Information:


    References

    CAO, X., FORLINES, C., AND BALAKRISHNAN, R. 2007. Multi-user interaction using handheld projectors. In Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology, 43–52.

    FORLINES, C., BALAKRISHNAN, R., BEARDSLEY, P., VAN BAAR, J., AND RASKAR, R. 2005. Zoom-and-pick: Facilitating visual zooming and precision pointing with interactive handheld projectors. In Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology, 73–82.


Additional Images:


PDF:



Conference:


Overview Page:


Type:


Keyword(s):