-
Reduplication is common, but analogous reversal processes are rare, even though reversal, which involves nested rather than crossed dependencies, is less complex on the Chomsky hierarchy. We hypothesize that the explanation is that repetitions can be recognized when they match and reactivate a stored trace in short-term memory, but recognizing a reversal requires rearranging the input in working memory before attempting to match it to the stored trace. Repetitions can thus be recognized, and repetition patterns learned, implicitly, whereas reversals require explicit, conscious awareness. To test these hypotheses, participants were trained to recognize either a reduplication or a syllable-reversal pattern, and then asked to state the rule. In two experiments, above-chance classification performance on the Reversal pattern was confined to Correct Staters, whereas above-chance performance on the Reduplication pattern was found with or without correct rule-stating. Final proportion correct was positively correlated with final response time for the Reversal Correct Staters but no other group. These results support the hypothesis that reversal, unlike reduplication, requires conscious, time-consuming computation.
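The two patterns can be illustrated with a minimal sketch (the function names and syllable encoding are my own, not taken from the experiments): a reduplicated form copies the first half of the syllable string in order, while a reversed form mirrors it.

```python
def is_reduplication(syllables):
    """True if the second half copies the first half in order, e.g. ba-ku-ba-ku."""
    n = len(syllables) // 2
    return len(syllables) % 2 == 0 and syllables[:n] == syllables[n:]

def is_reversal(syllables):
    """True if the second half mirrors the first half, e.g. ba-ku-ku-ba."""
    n = len(syllables) // 2
    return len(syllables) % 2 == 0 and syllables[:n] == syllables[n:][::-1]
```

Note that the reduplication check can in principle succeed syllable-by-syllable as input arrives (crossed dependencies), whereas the reversal check requires holding and reordering the first half before any matching can begin (nested dependencies), which is the asymmetry the hypothesis turns on.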
-
An evolutionary model of pattern learning in the MaxEnt OT/HG framework is described in which constraint induction and constraint weighting are consequences of reproduction with variation and differential fitness. The model is shown to fit human data from published experiments on both unsupervised phonotactic (Moreton et al., 2017) and supervised visual (Nosofsky et al., 1994) pattern learning, and to account for the observed reversal in difficulty order of exclusive-or vs. gang-effect patterns between the two experiments. Different parameter settings are shown to yield gradual, parallel, connectionist-like and abrupt, serial, symbolic-like performance.
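For readers unfamiliar with the framework, a MaxEnt (Harmonic Grammar) model assigns each candidate a probability proportional to exp(−Σ_k w_k·v_k), where v_k is the candidate's violation count for constraint k. The sketch below shows only this shared MaxEnt core, not the evolutionary induction and weighting procedure the abstract describes; names and toy numbers are illustrative.

```python
import math

def maxent_probs(candidates, weights):
    """MaxEnt / Harmonic Grammar distribution over candidates.

    candidates: dict mapping candidate name -> violation counts, one per constraint.
    Each candidate's harmony is the weighted sum of its violations, and its
    probability is proportional to exp(-harmony)."""
    harmonies = {c: sum(w * v for w, v in zip(weights, viols))
                 for c, viols in candidates.items()}
    z = sum(math.exp(-h) for h in harmonies.values())
    return {c: math.exp(-h) / z for c, h in harmonies.items()}
```

With weights [2.0, 0.0], a candidate violating only the first (heavily weighted) constraint receives less probability than one violating only the second, which is how weighting encodes a learned pattern.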
-
An evolutionary algorithm for simultaneously inducing and weighting phonological constraints (the Winnow-Maxent Subtree Breeder) is described, analyzed, and illustrated. Implementing weights as sub-population sizes, reproduction with selection executes a new variant of Winnow (Littlestone, 1988), which is shown to converge. A flexible constraint schema, based on the same prosodic and autosegmental trees used in representations, is described, together with algorithms for mutation and recombination (mating). The algorithm is applied to explaining abrupt learning curves, and predicts an empirical connection between abruptness and language-particularity.
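For reference, the classic Winnow algorithm of Littlestone (1988), on which the paper's variant builds, learns a linear threshold function over Boolean features using multiplicative (rather than additive) weight updates on mistakes. A minimal sketch, not the Subtree Breeder itself:

```python
def winnow_train(examples, n, alpha=2.0, epochs=5):
    """Classic Winnow (Littlestone, 1988).

    examples: list of (x, y) with x a 0/1 feature vector of length n, y in {0, 1}.
    Predict 1 when the weighted feature sum reaches the threshold n/2; on a
    mistake, multiply (promote) or divide (demote) the weights of active features."""
    w = [1.0] * n
    theta = n / 2
    for _ in range(epochs):
        for x, y in examples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0
            if pred != y:
                factor = alpha if y == 1 else 1 / alpha
                w = [wi * factor if xi else wi for wi, xi in zip(w, x)]
    return w
```

The multiplicative update is what makes Winnow's mistake bound scale logarithmically in the number of irrelevant features, and it is also a natural fit for the paper's reinterpretation of weights as sub-population sizes, since reproduction scales population sizes multiplicatively.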
-
When does a gradual learning rule translate into gradual learning performance? This paper studies a gradient-ascent Maximum Entropy phonotactic learner, as applied to two-alternative forced-choice performance expressed as log-odds. The main result is that slow initial performance cannot accelerate later if the initial weights are near zero, but can if they are not. Stated another way, abruptness in this learner is an effect of transfer, either from Universal Grammar in the form of an initial weighting, or from previous learning in the form of an acquired weighting.
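A minimal sketch of this type of learner (my own toy encoding, not the paper's implementation): a MaxEnt learner doing gradient ascent on the log-likelihood of correct two-alternative choices, where performance on a pair is the log-odds w·d, with d the violation-count difference favoring the correct choice.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_2afc(diffs, w, eta=0.5, steps=50):
    """Gradient ascent on the log-likelihood of the correct choice in each pair.

    diffs: list of vectors d = (violations of wrong choice) - (violations of
    right choice), so the log-odds of a correct response is w . d.
    Returns the final weights and the log-odds trace for the first pair."""
    history = []
    for _ in range(steps):
        for d in diffs:
            z = sum(wi * di for wi, di in zip(w, d))
            g = 1.0 - sigmoid(z)  # gradient of log sigmoid(z) with respect to z
            w = [wi + eta * g * di for wi, di in zip(w, d)]
        history.append(sum(wi * di for wi, di in zip(w, diffs[0])))
    return w, history
```

Starting from w = 0, the per-step gain eta·(1 − sigmoid(z)) only shrinks as z grows, so the log-odds curve decelerates from the outset; an inherited nonzero starting weight shifts where on that curve learning begins, which is the transfer effect the abstract describes.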