Simple item record

dc.contributor.author: Conway, Christopher
dc.date.accessioned: 2005-07-27
dc.date.available: 2005-07-27
dc.date.issued: 2005-07-27
dc.identifier.citation: Conway, C. M., & Christiansen, M. H. (2005). Modality-constrained statistical learning of tactile, visual, and auditory sequences. Journal of Experimental Psychology: Learning, Memory, and Cognition, 31, 24-39.
dc.identifier.other: bibid: 6476009
dc.identifier.uri: https://hdl.handle.net/1813/2107
dc.description: Morten H. Christiansen, Michael Owren, Elizabeth Regan, Michael Spivey
dc.description.abstract: In order to steer through a world characterized by a complex mixture of variability and structure, organisms rely upon implicit statistical learning, the capability to extract probabilistic patterns occurring in environmental stimuli. Although statistical learning has been found to occur across a myriad of domains, there has been little investigation into the effect that sense modality and other stimulus attributes may have on learning. In a series of experiments, I investigate to what extent implicit statistical learning is constrained and influenced by the nature of the input in which the statistical regularities occur. All experiments have in common the use of artificial grammar learning methodology, in which adult participants are incidentally exposed to statistically governed patterns and then are tested on their ability to apply their acquired knowledge to novel instances. Chapter 2 presents two experiments that compared learning across touch, vision, and audition, producing evidence for modality constraints. Specifically, the auditory modality displayed a quantitative learning advantage compared to vision and touch; additionally, each sense modality was more or less attuned to specific aspects of the input. Chapter 3 describes an experiment that further explored modality constraints by manipulating both the presentation format (temporal, spatial, or spatiotemporal) and presentation rate for visual and auditory material. Consistent with a modality-constrained view of learning, vision and audition were best at encoding spatial and temporal regularities, respectively. Finally, using a novel cross-over design, Chapter 4 presents three experiments that pitted abstract, amodal processing against stimulus-specific learning and found that statistical learning is mediated to a greater extent by stimulus-specific, not abstract, representations.
Taken together, the results from these experiments suggest that statistical learning inherently involves learning mechanisms that are heavily influenced by the perceptual and sensory characteristics of the stimuli. I argue that a full understanding of statistical learning -- and likely other aspects of language and cognition -- will come only by specifying the role played by the senses. I conclude with a proposal for a perceptual, modality-constrained view of implicit statistical learning framed within the context of cognition as a whole.
dc.format.extent: 777672 bytes
dc.format.mimetype: application/pdf
dc.language.iso: en_US
dc.publisher: American Psychological Association
dc.subject: artificial grammar learning
dc.subject: statistical learning
dc.subject: implicit learning
dc.subject: tactile learning
dc.subject: modality effects
dc.title: An odyssey through sight, sound, and touch: Toward a perceptual theory of implicit statistical learning
dc.type: dissertation or thesis

