“A haptic rendering for hybrid environments”

  • ©Jong-Phil Kim, Beom-Chan Lee, and Jeha Ryu

Conference:


Type:


Title:

    A haptic rendering for hybrid environments

Presenter(s)/Author(s):

    Jong-Phil Kim, Beom-Chan Lee, and Jeha Ryu

Abstract:


    Haptic rendering is the process of providing the user with tactual sensory information about virtual or augmented environments. Hybrid environments may contain diverse object data types, such as camera-captured 2D or 2.5D images and 3D CAD models represented by triangular meshes, implicit surfaces, or voxels. Previous haptic algorithms, however, depend on a specific object data representation and therefore cannot provide unified haptic rendering for these hybrid environments. For example, providing haptic feedback for medical instruments modeled by polygons together with internal organs captured by MRI requires additional effort, such as iso-surface extraction from the volumetric data. This paper proposes a haptic rendering method for hybrid environments. It makes it possible to use various object data types without any additional preprocessing, and thus allows haptics to be applied easily to diverse hybrid environments.
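    To illustrate the idea of treating heterogeneous object representations uniformly, the sketch below (in C++, a common choice for haptic loops, though the abstract does not specify an implementation language) shows one possible way different representations could expose a shared distance/normal query so that a single penalty-based force computation works over all of them. This is only an assumed illustration: the interface, class, and function names (HapticObject, ImplicitSphere, computeForce) are hypothetical and do not reflect the authors' actual algorithm.

```cpp
// Minimal sketch (not the authors' method): a shared query interface that a
// single haptic force loop can use regardless of the underlying representation.
#include <array>
#include <cmath>
#include <iostream>

using Vec3 = std::array<double, 3>;

static double dot(const Vec3& a, const Vec3& b) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }
static double norm(const Vec3& v) { return std::sqrt(dot(v, v)); }

// Hypothetical common interface: signed distance to the surface and an
// outward normal direction at a query point.
struct HapticObject {
    virtual ~HapticObject() = default;
    virtual double signedDistance(const Vec3& p) const = 0;
    virtual Vec3   normal(const Vec3& p) const = 0;
};

// Example implementation for an analytic implicit surface (a sphere).
// A triangle mesh could implement the same interface via closest-point
// queries, and voxel data via sampling a stored distance field.
struct ImplicitSphere : HapticObject {
    Vec3 center; double radius;
    ImplicitSphere(Vec3 c, double r) : center(c), radius(r) {}
    double signedDistance(const Vec3& p) const override {
        Vec3 d{p[0]-center[0], p[1]-center[1], p[2]-center[2]};
        return norm(d) - radius;
    }
    Vec3 normal(const Vec3& p) const override {
        Vec3 d{p[0]-center[0], p[1]-center[1], p[2]-center[2]};
        double n = norm(d);
        return n > 0.0 ? Vec3{d[0]/n, d[1]/n, d[2]/n} : Vec3{0.0, 0.0, 1.0};
    }
};

// One penalty-based force computation that works for any HapticObject.
Vec3 computeForce(const HapticObject& obj, const Vec3& probe, double stiffness) {
    double d = obj.signedDistance(probe);
    if (d >= 0.0) return {0.0, 0.0, 0.0};      // probe outside: no contact force
    Vec3 n = obj.normal(probe);                 // push back along the surface normal
    double depth = -d;                          // penetration depth
    return {stiffness * depth * n[0], stiffness * depth * n[1], stiffness * depth * n[2]};
}

int main() {
    ImplicitSphere sphere({0.0, 0.0, 0.0}, 1.0);
    Vec3 probe{0.0, 0.0, 0.9};                  // probe slightly inside the sphere
    Vec3 f = computeForce(sphere, probe, 500.0);
    std::cout << "force = (" << f[0] << ", " << f[1] << ", " << f[2] << ")\n";
    return 0;
}
```

    In this sketch, adding a new data type (e.g., a 2.5D depth image) would only require implementing the same two queries, which is the kind of representation-independence the abstract describes; the actual method in the paper may differ.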


Additional Images:

©Jong-Phil Kim, Beom-Chan Lee, and Jeha Ryu

ACM Digital Library Publication:



Overview Page: