The Effect of Visual Articulatory Information on the Neural Correlates of Non-native Speech Sound Discrimination
Behavioral studies have shown that the ability to discriminate between non-native speech sounds improves after seeing how the sounds are articulated. This study examined the influence of visual articulatory information on the neural correlates of non-native speech sound discrimination. English speakers' discrimination of the Hindi dental and retroflex sounds was measured using the mismatch negativity (MMN) event-related potential, before and after they completed one of three 8-min training conditions. In an audio-visual speech training condition (n = 14), each sound was presented with its corresponding visual articulation. In one control condition (n = 14), both sounds were presented with the same visual articulation, resulting in one congruent and one incongruent audio-visual pairing. In another control condition (n = 14), both sounds were presented with the same image of a still face. The control conditions aimed to rule out the possibility that the MMN is influenced by non-specific audio-visual pairings, or by general exposure to the dental and retroflex sounds over the course of the study. The results showed that audio-visual speech training reduced the latency of the MMN but did not affect MMN amplitude. No change in MMN amplitude or latency was observed for the two control conditions. The pattern of results suggests that a relatively short audio-visual speech training session (i.e., 8 min) may increase the speed with which the brain processes non-native speech sound contrasts. The absence of a training effect on MMN amplitude suggests that a single session of audio-visual speech training does not lead to the formation of more discrete memory traces for non-native speech sounds. Longer and/or multiple sessions might be needed to influence the MMN amplitude.
Main Authors: James M. A. Plumridge, Michael P. Barham, Denise L. Foley, Anna T. Ware, Gillian M. Clark, Natalia Albein-Urios, Melissa J. Hayden, Jarrad A. G. Lum
Format: Article
Language: English
Published: Frontiers Media S.A., 2020-02-01
Series: Frontiers in Human Neuroscience
ISSN: 1662-5161
DOI: 10.3389/fnhum.2020.00025
Subjects: audio-visual training; speech processing; speech discrimination; mismatch negativity (MMN); event-related potential (ERP); non-native speech sounds
Online Access: https://www.frontiersin.org/article/10.3389/fnhum.2020.00025/full
Collection: DOAJ
Record ID: doaj-e642b5f6b4dc49a28a14bf3df06323b7