- Children with autism spectrum disorder (ASD) have been reported to be less influenced by a speaker’s face during speech perception than children with typical development. To more closely examine these reported differences, a novel visual phonemic restoration paradigm was used to assess neural signatures (event-related potentials [ERPs]) of audiovisual processing in typically developing children and in children with ASD. Video of a speaker saying the syllable /ba/ was paired with (1) a synthesized /ba/ or (2) a synthesized syllable derived from /ba/ in which auditory cues for the consonant were substantially weakened, thereby sounding more like /a/. The auditory stimuli are easily discriminable; however, in the context of a visual /ba/, the auditory /a/ is typically perceived as /ba/, producing a visual phonemic restoration. Only children with ASD showed a large /ba/-/a/ discrimination response in the presence of a speaker producing /ba/, suggesting reduced influence of visual speech. © 2021, The Author(s), under exclusive licence to Springer Science+Business Media, LLC part of Springer Nature.
- Audiovisual speech perception involves the simultaneous processing of auditory and visual speech. Deficits in audiovisual speech perception are reported in autistic individuals; however, less is known about audiovisual speech perception within the broader autism phenotype (BAP), which includes individuals with elevated, yet subclinical, levels of autistic traits. We investigate the neural indices of audiovisual speech perception in adults exhibiting a range of autism-like traits using event-related potentials (ERPs) in a phonemic restoration paradigm. In this paradigm, we consider conditions where the speech articulators (mouth and jaw) are visible (AV condition) or obscured by a pixelated mask (PX condition). These two face conditions were included in both passive (simply viewing a speaking face) and active (participants were required to press a button for a specific consonant–vowel stimulus) experiments. The results revealed an N100 ERP component that was present across all listening contexts and conditions; however, it was attenuated in the active AV condition, in which participants could view the speaker’s face, including the mouth and jaw. The P300 ERP component was present in the active experiment only and was significantly greater in the AV condition than in the PX condition, suggesting increased neural effort for detecting deviant stimuli when visible articulation was present, as well as a visual influence on perception. Finally, the P300 response was negatively correlated with autism-like traits, indicating that higher autistic traits were associated with generally smaller P300 responses in the active AV and PX conditions. These findings support the view that atypical audiovisual processing may be characteristic of the BAP in adults.