Integration of Visual and Somatosensory Information for Preshaping Hand in Grasping Movements

Part of Advances in Neural Information Processing Systems 5 (NIPS 1992)


Authors

Yoji Uno, Naohiro Fukumura, Ryoji Suzuki, Mitsuo Kawato

Abstract

The primate brain must solve two important problems in grasping movements. The first problem concerns the recognition of grasped objects: specifically, how does the brain integrate visual and motor information on a grasped object? The second problem concerns hand shape planning: specifically, how does the brain design the hand configuration suited to the shape of the object and the manipulation task? A neural network model that solves these problems has been developed. The operations of the network are divided into a learning phase and an optimization phase. In the learning phase, internal representations, which depend on the grasped objects and the task, are acquired by integrating visual and somatosensory information. In the optimization phase, the most suitable hand shape for grasping an object is determined by using a relaxation computation of the network.
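The optimization phase described above can be illustrated with a minimal sketch. The abstract does not specify the network architecture or energy function, so everything below is a hypothetical stand-in: a learned linear mapping `W` from hand configuration to integrated sensory features, and a quadratic energy relaxed by gradient descent to find a hand shape whose predicted features match those of the target object.

```python
import numpy as np

# Hypothetical sketch of a "relaxation computation": W is a stand-in for the
# learned internal representation mapping hand configuration (joint angles)
# to integrated visual/somatosensory features. The linear form, names, and
# energy are illustrative assumptions, not the authors' actual network.

def relax_hand_shape(W, target, x0, lr=0.01, steps=500):
    """Minimize E(x) = 0.5 * ||W @ x - target||^2 by gradient relaxation."""
    x = x0.copy()
    for _ in range(steps):
        grad = W.T @ (W @ x - target)  # dE/dx for the quadratic energy
        x -= lr * grad                 # relax x down the energy gradient
    return x

def energy(W, x, target):
    return 0.5 * np.sum((W @ x - target) ** 2)

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))    # stand-in learned mapping (4 features, 3 joints)
target = rng.normal(size=4)    # sensory features of the object to be grasped
x0 = np.zeros(3)               # initial hand configuration

x_star = relax_hand_shape(W, target, x0)
print(energy(W, x_star, target) < energy(W, x0, target))  # relaxation lowers E
```

The design point is that grasp planning is posed as energy minimization over hand configurations rather than as a one-shot feedforward mapping; the relaxation settles on a configuration consistent with the learned object representation.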

Parallel Distributed Processing Research Dept., Sony Corporation,

6-7-35 Kitashinagawa, Shinagawa-ku, Tokyo 141, Japan