Comparison processes in category learning: From theory to behavior

Rubi Hammer, Aharon Bar-Hillel, Tomer Hertz, Daphna Weinshall, Shaul Hochstein

Research output: Contribution to journal › Article › peer-review


Abstract

Recent studies have stressed the importance of comparing exemplars, both for improving the performance of artificial classifiers and for explaining human category-learning strategies. In this report we provide a theoretical analysis of the utility of exemplar comparison for category learning. We distinguish between two types of comparison: comparison of exemplars identified as belonging to the same category vs. comparison of exemplars identified as belonging to two different categories. Our analysis suggests that these two types of comparison differ both qualitatively and quantitatively. In particular, in most everyday-life scenarios, comparison of same-class exemplars will be far more informative than comparison of different-class exemplars. We also present behavioral findings suggesting that these properties of the two types of comparison shape the category-learning strategies that people implement. The predisposition to use one strategy in preference to the other often results in a significant gap between the actual information content provided and the way this information is eventually employed. These findings may further suggest the conditions under which the reported category-learning biases may be overcome.
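The claim that same-class comparisons are typically more informative than different-class comparisons can be illustrated with a simple counting argument. The sketch below is not the paper's own analysis; it assumes a uniform prior over k equally likely categories and measures, in bits, how much each type of constraint narrows down the possible label pair of two exemplars.

```python
import math

def constraint_information(k: int) -> tuple[float, float]:
    """Bits gained about the labels of two exemplars, each drawn
    uniformly from k equally likely categories, when told the pair is
    "same class" vs. "different class" (uniform-prior assumption)."""
    total = k * k                                  # all equally likely label pairs
    same = k                                       # pairs with matching labels
    different = k * (k - 1)                        # pairs with mismatching labels
    same_bits = math.log2(total / same)            # = log2(k)
    different_bits = math.log2(total / different)  # = log2(k / (k - 1))
    return same_bits, different_bits

for k in (2, 5, 20, 100):
    s, d = constraint_information(k)
    print(f"k={k:3d}: same-class = {s:5.2f} bits, different-class = {d:5.2f} bits")
```

Under this simplified model the two constraint types are equally informative only when k = 2; as the number of candidate categories grows, a same-class pair conveys log2(k) bits while a different-class pair conveys only log2(k/(k-1)) bits, which tends to zero. This is consistent with the abstract's claim about most everyday-life scenarios, where many categories are possible.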

Original language: English
Pages (from-to): 102-118
Number of pages: 17
Journal: Brain Research
Volume: 1225
DOIs
State: Published - 15 Aug 2008
Externally published: Yes

Keywords

  • Categorization
  • Category-learning
  • Expectation-maximization
  • Multidimensional scaling
  • Perceived similarity
  • Perceptron

ASJC Scopus subject areas

  • General Neuroscience
  • Molecular Biology
  • Clinical Neurology
  • Developmental Biology
