Multimodal Interaction for Distributed Interactive Simulation

Philip R. Cohen, Michael Johnston, David McGee, Sharon Oviatt, Jay Pittman, Ira Smith, Liang Chen, Josh Clow

This paper presents an emerging application of Artificial Intelligence research to distributed interactive simulations, with the goal of reducing exercise generation time and effort while maximizing training effectiveness. We have developed the QuickSet prototype, a pen/voice system running on a hand-held PC that communicates via wireless LAN through an agent architecture to NRaD's LeatherNet system, a distributed interactive training simulator built for the US Marine Corps. The paper describes our novel multimodal integration strategy, which offers mutual compensation among modalities, as well as QuickSet's agent-based infrastructure, and provides an example of multimodal simulation setup. Finally, we discuss our application experience and lessons learned.
