Assessing Conceptual Complexity and Compressibility Using Information Gain and Mutual Information

F. Mathy

Published 2010 in Unknown venue

ABSTRACT

In this paper, a few basic notions stemming from information theory are presented with the intention of modeling the abstraction of relevant information in categorization tasks. In a categorization task, a single output variable is the basis for performing a dichotomous classification of objects that can be distinguished by a set of input variables which are more or less informative about the category to which the objects belong. At the beginning of the experiment, the target classification is unknown to learners, who must select the most informative variables relative to the class in order to classify the objects efficiently. I first show how the notion of entropy can be used to characterize basic psychological processes in learning. Then, I indicate how a learner might use information gain and mutual information (both based on entropy) to efficiently induce the shortest rule for categorizing a set of objects. Several basic classification tasks are studied in succession with the aim of showing that learning can improve as long as subjects are able to compress information. Referring to recent experimental results, I indicate in the Conclusion that these notions can account for both strategies and performance in subjects trying to simplify a learning process.
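As a concrete illustration of the idea sketched in the abstract, the standard definitions of entropy and information gain can be applied to a toy dichotomous classification. This is a minimal sketch, not code from the paper: the feature names (`shape`, `colour`) and the example data are hypothetical, chosen so that one input variable is perfectly informative about the class and the other carries no information.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a sequence of category labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(values, labels):
    """Reduction in class entropy obtained by splitting on one input variable."""
    n = len(labels)
    groups = {}
    for v, y in zip(values, labels):
        groups.setdefault(v, []).append(y)
    # Expected entropy of the class once the variable's value is known.
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

# Toy dichotomous classification: shape determines the class, colour does not.
shape  = ["square", "square", "circle", "circle"]
colour = ["red", "blue", "red", "blue"]
klass  = ["A", "A", "B", "B"]

print(information_gain(shape, klass))   # → 1.0 (fully informative)
print(information_gain(colour, klass))  # → 0.0 (uninformative)
```

A learner selecting variables by information gain would keep `shape` and discard `colour`, which is the kind of compression of the input description the paper associates with easier learning.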
