“Touch at a Distance: Simple Perception Aid Device with User’s Explorer Action”
Title:
- Touch at a Distance: Simple Perception Aid Device with User's Explorer Action
Description:
Although we obtain much of the information about our environment through the visual modality, we also obtain rich information through non-visual modalities. In perceiving our environment, we use not only raw sensor information but also “how it changes according to how we act.” For example, we obtain haptic information from the haptic sensors on our fingertips, and when we move a finger along the surface of an object, the haptic information changes according to the finger’s motion; by executing this action-and-sensing process, we “perceive” the whole shape of the object. In other words, we have a high ability to “integrate” the relation between our body’s actions and the resulting sensing data, and thereby improve the effective accuracy of the sensors in our body.
Based on this idea, we developed a simple perception aid device, driven by the user’s exploratory action, for perceiving objects at a distance. The device, which we name “FutureBody-Finger,” consists of a linked range sensor and haptic actuator. The distance sensor measures the distance to the object (20–80 cm), and this distance is converted into the angle of a lever attached to a servo motor (0–60 deg). The user holds the device in one hand with the index finger resting on the lever. When the object is far away, the lever leans forward and the user feels nothing; when the object is close, the lever stands vertically and the user feels the presence of the object. Although the device measures only the distance to a single point on the object, as the user “explores” the surroundings, richer distance information about the surrounding objects is accumulated, and hence the user can finally perceive the shape of the whole object.
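The distance-to-angle conversion described above can be sketched as follows. The text specifies only the sensor range (20–80 cm) and the lever range (0–60 deg); the linear, inverted transfer function and the function name are assumptions for illustration:

```python
def distance_to_angle(distance_cm: float) -> float:
    """Map a measured distance (20-80 cm) to a lever angle (0-60 deg).

    Assumes a linear, inverted mapping (not specified in the text):
    a near object raises the lever toward vertical (60 deg), a far
    object lowers it toward the front (0 deg).
    """
    # Clamp the reading to the sensor's working range.
    d = max(20.0, min(80.0, distance_cm))
    # Invert: 80 cm -> 0 deg (lever flat), 20 cm -> 60 deg (lever vertical).
    return (80.0 - d) / (80.0 - 20.0) * 60.0
```

In a device loop, this angle would then be written to the servo each time the range sensor is sampled, so the lever position tracks the object distance as the user sweeps the device around.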
References:
[1] B. B. Blasch, W. R. Wiener, and R. L. Welsh. 1997. Foundations of Orientation and Mobility. AFB Press.
[2] C. Carter and K. A. Ferrell. 1980. The implementation of Sonicguide with visually impaired infants and school children. Sensory Aids Corporation.
[3] K. Ito, Y. Fujimoto, J. Akita, R. Otsuki, A. Masatani, T. Komatsu, M. Okamoto, and T. Ono. 2012. Development of the FutureBody-Finger: A novel travel aid for the blind. In Proceedings of the 2nd International Conference on Ambient Computing, Applications, Services and Technologies, 60–63.
[4] L. A. Johnson and C. M. Higgins. 2006. A navigation aid for the blind using tactile-visual sensory substitution. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 6, 6289–6292.
[5] L. Kay. 1974. A sonar aid to enhance spatial perception of the blind: Engineering design and evaluation. Radio and Electronic Engineer 44, 11, 605–627.
[6] L. Kay. 1984. Acoustic coupling to the ears in binaural sensory aids. Journal of Visual Impairment & Blindness 78, 1, 12–16.
[7] M. Okamoto, T. Komatsu, K. Ito, and T. Ono. 2011. FutureBody: Design of perception using the human body. In Proceedings of the 2nd Augmented Human International Conference (AH 2011), ACM, Article No. a-35.
[8] R. Mizuno, K. Ito, J. Akita, T. Ono, T. Komatsu, and M. Okamoto. 2008. Shape perception using CyARM: Active sensing device. In Proceedings of the 6th International Conference of Cognitive Science (ICCS 2008), 182–185.


