{"pk":25723,"title":"Representing and Learning a Large System of Number Concepts\nwith Latent Predicate Networks","subtitle":null,"abstract":"Conventional models of exemplar or rule-based concept learning\ntend to focus on the acquisition of one concept at a time.\nThey often underemphasize the fact that we learn many concepts\nas part of large systems rather than as isolated individuals.\nIn such cases, the challenge of learning is not so much in\nproviding stand-alone definitions, but in describing the richly\nstructured relations between concepts. The natural numbers\nare one of the first such abstract conceptual systems children\nlearn, serving as a serious case study in concept representation\nand acquisition (Carey, 2009; Fuson, 1988; Gallistel\n&amp; Gelman, 2005). Even so, models of natural number learning\nfocused on single-concept acquisition have largely ignored\ntwo challenges related to natural number‚Äôs status as a system\nof concepts: 1) there is an unbounded set of exact number\nconcepts, each with distinct semantic content; and 2) people\ncan reason flexibly about any of these concepts (even fictitious\nones like eighteen-gazillion). To succeed, models must instead\nlearn the structure of the entire infinite set of number concepts,\nfocusing on how relationships between numbers support reference\nand generalization. Here, we suggest that the latent predicate\nnetwork (LPN) ‚Äì a probabilistic context-sensitive grammar\nformalism ‚Äì facilitates tractable learning and reasoning\nfor natural number concepts (Dechter, Rule, &amp; Tenenbaum,\n2015). We show how to express several key numerical relationships\nin our framework, and how a Bayesian learning algorithm\nfor LPNs can model key phenomena observed in children\nlearning to count. These results suggest that LPNs might\nserve as a computational mechanism by which children learn\nabstract numerical knowledge from utterances about number","language":"eng","license":{"name":"","short_name":"","text":null,"url":""},"keywords":[{"word":"child development; concept learning; number;\ngeneralization; computational model; grammar induction"}],"section":"Papers","is_remote":true,"remote_url":"https://escholarship.org/uc/item/07p69506","frozenauthors":[{"first_name":"Joshua","middle_name":"","last_name":"Rule","name_suffix":"","institution":"MIT","department":""},{"first_name":"Eyal","middle_name":"","last_name":"Dechter","name_suffix":"","institution":"MIT","department":""},{"first_name":"Joshua","middle_name":"B","last_name":"Tenenbaum","name_suffix":"","institution":"MIT","department":""}],"date_submitted":null,"date_accepted":null,"date_published":"2015-01-01T18:00:00Z","render_galley":null,"galleys":[{"label":"PDF","type":"pdf","path":"https://journalpub.escholarship.org/cognitivesciencesociety/article/25723/galley/15347/download/"}]}