Active Audition for Humanoid

Kazuhiro Nakadai and Tino Lourens, Japan Science and Technology Corporation; Hiroshi G. Okuno, Japan Science and Technology Corporation and Science University of Tokyo; Hiroaki Kitano, Japan Science and Technology Corporation and Sony Computer Science Laboratories, Inc.

In this paper, we present an active audition system for the humanoid robot SIG. Audition for a highly intelligent humanoid requires localizing sound sources and identifying the meaning of each sound in the auditory scene. The active audition reported in this paper focuses on improved sound source tracking by integrating audition, vision, and motor movements. Given multiple sound sources in the auditory scene, SIG actively moves its head to improve localization, aligning its microphones orthogonal to a sound source and capturing possible sound sources by vision. Such active head movement, however, inevitably creates motor noise, which the system must adaptively cancel using motor control signals. Experimental results demonstrate that active audition, by integrating audition, vision, and motor control, enables sound source tracking under a variety of conditions.
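To illustrate the kind of localization the abstract describes, the following is a minimal sketch of interaural-time-difference (ITD) localization with a two-microphone head. All constants (microphone spacing, sample rate) and function names are assumptions for illustration, not values from the paper. Turning the head by the estimated azimuth faces the source, the pose at which the microphone axis is orthogonal to the source direction and the ITD cue is most sensitive to further angular change.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s
MIC_DISTANCE   = 0.18    # m; assumed spacing of the microphone pair
SAMPLE_RATE    = 16000   # Hz; assumed

def estimate_itd(left, right, max_lag):
    """Cross-correlate the two channels over lags in [-max_lag, max_lag];
    the lag with the largest correlation is the ITD in samples."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(left[i] * right[i - lag]
                    for i in range(max(0, lag), min(len(left), len(right) + lag)))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def itd_to_azimuth(lag_samples):
    """Convert an ITD (samples) into a source azimuth in radians,
    0 meaning straight ahead; sign depends on channel convention."""
    delay = lag_samples / SAMPLE_RATE
    s = max(-1.0, min(1.0, delay * SPEED_OF_SOUND / MIC_DISTANCE))
    return math.asin(s)
```

A head controller would then command a rotation by the estimated azimuth, re-estimate, and iterate, integrating the result with visual detections of candidate sources.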
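The motor-noise cancellation mentioned above can be sketched with a standard least-mean-squares (LMS) adaptive filter that uses a motor control signal as the noise reference; this is a generic illustration of the idea, not the paper's actual algorithm, and the parameter values are arbitrary.

```python
def lms_cancel(noisy, reference, mu=0.05, taps=4):
    """Adaptively subtract the component of `noisy` that is correlated
    with `reference` (e.g. a motor control signal). Returns the cleaned
    signal, i.e. the per-sample estimation error of the LMS filter."""
    w = [0.0] * taps
    cleaned = []
    for n in range(len(noisy)):
        # tapped delay line over the reference signal
        x = [reference[n - k] if n - k >= 0 else 0.0 for k in range(taps)]
        y = sum(wk * xk for wk, xk in zip(w, x))   # estimated motor noise
        e = noisy[n] - y                           # cleaned sample
        w = [wk + mu * e * xk for wk, xk in zip(w, x)]
        cleaned.append(e)
    return cleaned
```

When the microphone signal is dominated by noise correlated with the motor command, the filter converges and the residual shrinks toward the motor-noise-free signal.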

Copyright © AAAI. All rights reserved.