Explanation-Aware Computing: Papers from the AAAI Fall Symposium
Thomas Roth-Berghofer, Chair
November 4-6, 2005, Arlington, Virginia
Technical Report FS-05-04
116 pp., $30.00
ISBN 978-1-57735-250-1
With the introduction of intelligent, adaptive systems and decision automation, the need arises to explain a system's answers to the user in terms of the application's knowledge. Users want to know how reliable a system's answers are, and an obvious way to increase confidence in a result is to deliver an explanation along with it. Trust in a system grows not only with the quality of its output but, more importantly, with evidence of how that output was derived; systems that provide such evidence become more robust and dependable. Users also want some sense of control over the systems they work with, so these systems must justify their decisions and the means by which they reach them.
Explanations as answers to why-questions have been studied in depth in the philosophy of science. Expert systems research operationalized explanation and derived criteria for good explanations. Meeting these criteria requires advanced models, methods, and tools that provide mechanisms for the structured management of explanation-relevant information, effective ways of retrieving it, and the ability to integrate explanation knowledge with application knowledge. Beyond these technical aspects, it is important to understand explanations of IT applications from social and philosophical perspectives as well.