“Mobile Augmented Reality Systems” by Feiner
E-Tech Type(s):
- Augmented Reality
Entry Number: 16
Title:
- Mobile Augmented Reality Systems
Description:
Augmented reality refers to using computers to overlay virtual information on the real world. Mobile Augmented Reality Systems (MARS) combines see-through head-worn displays and backpack-based computers developed by Columbia University and the Naval Research Laboratory, tracking technology developed by InterSense, and an infrared transmitter-based ubiquitous information infrastructure from eyeled GmbH. Our system creates a pervasive 3D information space that documents Emerging Technologies. It demonstrates some of the user interface techniques that we are developing to present information for MARS, including interfaces that adapt as the user moves among regions with high-precision six-degree-of-freedom tracking, regions with orientation tracking plus coarse position tracking, and regions with orientation tracking alone.
As attendees wearing our systems walk within and near our installation, they are tracked by a six-degree-of-freedom tracker, and the information they view is situated relative to the 3D coordinate system of the installation area. For example, an installation may be surrounded by virtual representations of associated material. In other parts of the installation area, tracking is accomplished through a combination of inertial head-and-body orientation trackers and a coarse position tracker based on a constellation of infrared transmitters. In those areas, information is situated relative to the 3D coordinate system of the user’s body but is sensitive to the user’s coarse position. As users move between areas served by different tracking technologies, the user interface adapts to use the best one available. Our infrared transmitters will also allow attendees to explore parts of the same information space with their own hand-held devices.
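This adaptation between tracking regimes can be summarized with a small sketch. The Python below is an illustrative, hypothetical outline (the class and function names are ours, not from the actual MARS implementation): it picks the highest-fidelity tracking mode that the current sensor report supports and the coordinate frame in which information would then be situated.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional, Tuple

class TrackingMode(Enum):
    SIX_DOF = auto()              # high-precision position and orientation
    ORIENT_PLUS_COARSE = auto()   # inertial orientation plus infrared cell position
    ORIENT_ONLY = auto()          # inertial orientation alone

@dataclass
class TrackerReport:
    precise_pose: Optional[Tuple[float, ...]]           # 6-DOF pose, if the precise tracker is in range
    orientation: Optional[Tuple[float, float, float]]   # head/body orientation from inertial sensors
    ir_cell_id: Optional[int]                            # id of the nearest infrared transmitter, if any

def best_tracking_mode(report: TrackerReport) -> TrackingMode:
    """Choose the highest-fidelity tracking mode currently available."""
    if report.precise_pose is not None:
        return TrackingMode.SIX_DOF
    if report.orientation is not None and report.ir_cell_id is not None:
        return TrackingMode.ORIENT_PLUS_COARSE
    return TrackingMode.ORIENT_ONLY

def reference_frame(mode: TrackingMode) -> str:
    """Pick the coordinate frame in which virtual information is situated."""
    if mode is TrackingMode.SIX_DOF:
        return "world"   # register content with the installation's 3D coordinates
    if mode is TrackingMode.ORIENT_PLUS_COARSE:
        return "body"    # body-stabilized content, selected by coarse (cell) position
    return "screen"      # orientation only: body- or screen-stabilized content
```

The design point this sketch tries to capture is graceful degradation: the interface always falls back to the best frame of reference the currently available sensors can support, rather than requiring precise registration everywhere.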
The MARS user interfaces embody three techniques that we are exploring for effective augmented reality: information filtering, user interface component design, and view management. Information filtering helps select the most relevant information to present, based on data about the user, the tasks being performed, and the surrounding environment, including the user’s location. User interface component design determines the format in which this information should be conveyed, based on the available display resources and tracking accuracy. For example, the absence of high-accuracy position tracking would favor body- or screen-stabilized components over world-stabilized ones that would need to be registered with the physical objects to which they refer. View management attempts to ensure that the virtual objects selected for display are arranged appropriately with regard to their projections on the view plane. For example, virtual objects that are not constrained to occupy a specific position in the 3D world should be arranged so they do not obstruct the view of other physical or virtual objects in the scene that are more important.
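As an illustration of how the three techniques might chain together, the sketch below (hypothetical names and scoring, not the published MARS pipeline) filters candidate items by relevance, chooses a stabilization for each item based on whether precise position tracking is available, and nudges unconstrained items apart on the view plane so they do not obstruct more important ones.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class InfoItem:
    name: str
    relevance: float                    # score derived from user, task, and location context
    world_anchored: bool                # does the item refer to a specific physical object?
    importance: float = 0.5             # used by view management to resolve conflicts
    screen_pos: Tuple[float, float] = (0.0, 0.0)  # projection on the view plane (normalized)

def information_filter(items: List[InfoItem], threshold: float = 0.5, limit: int = 8) -> List[InfoItem]:
    """Information filtering: keep only the most relevant items for the current context."""
    relevant = [i for i in items if i.relevance >= threshold]
    return sorted(relevant, key=lambda i: i.relevance, reverse=True)[:limit]

def choose_component(item: InfoItem, have_precise_position: bool) -> str:
    """UI component design: use world-stabilized components only when registration is possible."""
    if item.world_anchored and have_precise_position:
        return "world-stabilized"
    return "body-stabilized" if item.world_anchored else "screen-stabilized"

def manage_view(items: List[InfoItem], min_separation: float = 0.1) -> List[InfoItem]:
    """View management: arrange projections so unconstrained items do not obstruct others."""
    occupied: List[Tuple[float, float]] = []
    # Place items in order of importance; world-anchored items keep their projections.
    for item in sorted(items, key=lambda i: i.importance, reverse=True):
        x, y = item.screen_pos
        if not item.world_anchored:
            while any(abs(x - ox) < min_separation and abs(y - oy) < min_separation
                      for ox, oy in occupied):
                y += min_separation  # naive vertical nudge; a real system solves a layout problem
        item.screen_pos = (x, y)
        occupied.append((x, y))
    return items
```

In this reading, filtering runs first to bound what is shown, component design runs per item whenever the tracking mode changes, and view management runs each frame over whatever items remain.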
We believe that user interface techniques of this sort will play a key role in the MARS devices that people will begin to use on an everyday basis over the coming decade.
Acknowledgements:
MARS research at Columbia University is funded in part by ONR, NSF, and gifts from IBM, Intel, Microsoft, and Mitsubishi. MARS research at NRL is funded in part by ONR.