Multimodal User Supervised Interface and Intelligent Control (MUSIIC)

Matthew Beitler, Zunaid Kazi, Marcos Salganicoff, Richard Foulds, Shoupu Chen and Daniel Chester

This research presents a method and system that integrates multimodal human-computer interaction with reactive planning to operate a telerobot as an assistive device. The Multimodal User Supervised Interface and Intelligent Control (MUSIIC) strategy is a novel approach to intelligent assistive telerobotics. It is both a step toward enabling individuals with physical disabilities to operate a robot in an unstructured environment and an illustration of general principles for integrating speech and deictic-gesture control with a knowledge-driven reactive planner and a stereo-vision system.
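The core fusion step such a system needs can be illustrated in miniature: a spoken command containing a deictic pronoun ("pick that up") is bound to the world object nearest the pointed-at location reported by the vision system, yielding a concrete task for the planner. The following is a minimal sketch, not the MUSIIC implementation; all class and function names (`GestureEvent`, `WorldObject`, `resolve_deixis`, `interpret`) and the distance threshold are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class GestureEvent:
    # 3-D point indicated by the user's pointing gesture, as would be
    # triangulated by a stereo-vision system (hypothetical interface).
    x: float
    y: float
    z: float

@dataclass
class WorldObject:
    name: str
    x: float
    y: float
    z: float

def resolve_deixis(objects, gesture, max_dist=0.15):
    """Bind a deictic reference ('that one') to the known object nearest
    the pointed-at location, if one lies within max_dist metres."""
    best, best_d2 = None, max_dist ** 2
    for obj in objects:
        d2 = ((obj.x - gesture.x) ** 2 + (obj.y - gesture.y) ** 2
              + (obj.z - gesture.z) ** 2)
        if d2 <= best_d2:
            best, best_d2 = obj, d2
    return best

def interpret(utterance, gesture, objects):
    """Fuse a spoken command containing a deictic pronoun with a
    pointing gesture into a concrete (verb, object) task."""
    verb = utterance.split()[0].lower()
    target = resolve_deixis(objects, gesture)
    if target is None:
        return None  # gesture did not select any known object
    return (verb, target.name)

scene = [WorldObject("cup", 0.40, 0.10, 0.02),
         WorldObject("book", 0.75, -0.20, 0.00)]
task = interpret("Pick that up", GestureEvent(0.42, 0.08, 0.03), scene)
print(task)  # → ('pick', 'cup')
```

A reactive planner would then expand the resolved (verb, object) task into motion primitives, replanning as the vision system reports changes in the unstructured environment.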

This page is copyrighted by AAAI. All rights reserved.