A Minimal Encoding Approach to Feature Discovery

Mark Derthick

This paper discusses unsupervised learning of orthogonal concepts from relational data. Relational predicates, while formally equivalent to the features of the concept-learning literature, are not a good basis for defining concepts. Hence the current task demands searching a much larger space than traditional concept-learning algorithms explore, the sort of space addressed by connectionist algorithms. However, the intended application, using the discovered concepts in the Cyc knowledge base, requires that the concepts be interpretable by a human, an ability not yet realized with connectionist algorithms. Interpretability is aided by including a characterization of simplicity in the evaluation function. For Hinton's Family Relations data, we do find cleaner, more intuitive features. Yet when the solutions are not known in advance, the difficulty of interpreting even features that meet the simplicity criteria calls into question the usefulness of any reformulation algorithm that creates radically new primitives in a knowledge-based setting. At the very least, much more sophisticated explanation tools are needed.

This page is copyrighted by AAAI. All rights reserved.