Interactive Multisensory Object Perception for Embodied Agents
Papers from the 2017 AAAI Spring Symposium
Vivian Chu, Jivko Sinapov, Jeannette Bohg, Sonia Chernova, Andrea L. Thomaz, Organizers
Technical Report SS-17-05
This technical report has been published as a section in The 2017 AAAI Spring Symposium Series: Technical Reports.
Contents
Learning of Object Properties, Spatial Relations, and Actions for Embodied Agents from Language and Vision
Muhannad Alomari, Paul Duckworth, David C. Hogg, Anthony G. Cohn
How Much Haptic Surface Data Is Enough?
Alex Burka, Katherine J. Kuchenbecker
Regrasping Using Tactile Perception and Supervised Policy Learning
Yevgen Chebotar, Karol Hausman, Oliver Kroemer, Gaurav Sukhatme, Stefan Schaal
Merging Local and Global 3D Perception using Contact Sensing
Rebecca Cox, Nikolaus Correll
Summary of Experiments in Belief-Space Planning at the Laboratory for Perceptual Robotics
Scott Michael Jordan, Dirk Ruiken, Tiffany Q. Liu, Takeshi Takahashi, Michael W. Lanighan, Roderic A. Grupen
Visual Stability Prediction and Its Application to Manipulation
Wenbin Li, Ales Leonardis, Mario Fritz
Building Kinematic and Dynamic Models of Articulated Objects with Multi-Modal Interactive Perception
Roberto Martín-Martín, Oliver Brock
Reinforcement Learning Based Embodied Agents Modelling Human Users Through Interaction and Multi-Sensory Perception
Kory Wallace Mathewson, Patrick M. Pilarski
A Deep Neural Model for Emotion-Driven Multimodal Attention
German Ignacio Parisi, Pablo Barros, Haiyan Wu, Guochun Yang, Zhenghan Li, Xun Liu, Stefan Wermter