Original filename: Poster.pdf
Author: Jessica Vergeer
Generated by Microsoft® PowerPoint® 2016; uploaded to pdf-archive.com on 09/09/2019.
Student number: 205045341
Supervisor: Elena Geangu
I hear your emotions
A Study on Individual Differences in Infants’ ERP Responses to Nonverbal Emotional Vocalizations.
Nonverbal emotional expressions, such as laughter or
crying, carry important emotional information for
interactions. The ability to produce and accurately
interpret nonverbal emotional vocalizations of another
individual is considered to play a crucial role in social
interactions. Infancy is a crucial time period for tuning and
optimizing the brain circuitry for processing stimuli with
socio-emotional relevance and emotional responsivity
(Crespo-Llado, Vanderwert, & Geangu, 2018). Vocal
emotional cues have a more critical role in guiding infants’
behaviour than facial expressions (Zhang et al., 2017).
However, comparatively little research has examined how
infants process and resonate with nonverbal
vocalizations, and this topic remains largely unstudied.
Thirty-five infants were included in the final EEG data analysis: ten 3-month-olds,
thirteen 5-month-olds and twelve 12-month-olds (16 females). All the participants were
recruited from a small urban area in North East England. They did not suffer from any
neurological or other medical conditions and had normal hearing both at birth and at
the age of testing.
A total of eight nonverbal emotional vocalizations, produced by pre-verbal infants,
were gathered from sound library sources: four negative (crying) sounds and four
positive (laughing) sounds. During half of the blocks, the stimuli were presented
with background noise. The presentation of the stimuli is shown in Figure 1. The IBQ-R
(Gartstein & Rothbart, 2003) was used to measure temperament.
An ANT Neuro EEG system with a 64-channel Waveguard net,
in a 10-20 system arrangement, was used to continuously acquire
the data at 500 Hz. Regions of interest are shown in Figure 2.
The ERPs were recorded while the infants sat on their parent's
lap. The stimulus presentation lasted approximately 8 minutes,
but was stopped early when the infant showed signs of distress or fatigue.
Figure 2. The location of the analyzed regions of interest.
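The poster does not describe the ERP analysis pipeline itself, but the core computation behind grand-average waveforms such as those in Figure 4 can be sketched as baseline-corrected epoch averaging. A minimal sketch follows; the 500 Hz sampling rate is from the Methods, while the epoch length, the 0.1 s baseline window, the N100 measurement window and the function names are illustrative assumptions, not the authors' actual code.

```python
import numpy as np

FS = 500  # sampling rate in Hz (from the Methods)

def erp_average(epochs, fs=FS, baseline_s=0.1):
    """Baseline-correct each epoch, then average across trials.

    epochs: array of shape (n_trials, n_samples) for one EEG channel,
            where the first `baseline_s` seconds precede stimulus onset.
    """
    epochs = np.asarray(epochs, dtype=float)
    n_base = int(round(baseline_s * fs))
    # Subtract each trial's mean pre-stimulus voltage (DC offset).
    corrected = epochs - epochs[:, :n_base].mean(axis=1, keepdims=True)
    return corrected.mean(axis=0)

def mean_amplitude(erp, t_start_s, t_end_s, fs=FS, baseline_s=0.1):
    """Mean amplitude in a post-stimulus window, e.g. ~0.08-0.12 s for an N100."""
    i0 = int(round((baseline_s + t_start_s) * fs))
    i1 = int(round((baseline_s + t_end_s) * fs))
    return float(erp[i0:i1].mean())
```

Applied per condition and region of interest, averages like these yield the waveforms that are then compared across age, emotion, background noise and hemisphere.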
The ability to process the emotional information and
nonverbal emotional vocalizations can vary greatly
between individuals. These differences can already be
found in early infancy and have been associated with
temperamental characteristics (Jarcho et al., 2014).
Since sounds are rarely heard in complete isolation,
background noise may influence how the brain reacts to
nonverbal emotional vocalizations.
One of the most difficult communication situations is trying
to understand one talker in the background of other
talkers (Rosen, Souza, Ekelund, & Majeed, 2013).
The N100, P200 and LPC components were expected to be
found. Previous infant research has reported these ERP
components at frontal locations (Crespo-Llado et al.,
2018); increased activity was also expected in the
parietal regions. The 3-, 5- and 12-month-olds were
expected to show increased activity during the
presentation of nonverbal emotional vocalizations in the
ERP components, but the clearest response was
expected within the 12-month-olds. Following previous
infant research on temperament, a significant relation
between infants’ negative emotionality and ERP
responses to emotional nonverbal vocalizations was
expected. Lastly, it was expected that background noise
would not significantly affect the ERP responses to the
nonverbal emotional vocalizations.
Scalp topographies can be found in Figure 3. As seen in Figure 4, the N100 and
P200 were found in the frontal regions and the LPC in both frontal and parietal regions.
Frontal N100. Pairwise comparisons of the age x emotion x background noise x
hemisphere interaction revealed that 3-month-old infants responded with a more
negative N100 average amplitude to the cry sounds than to laughter in the right
hemisphere when presented in silence, whereas 12-month-olds responded with a
more negative N100 to the laughter sounds.
Frontal P200. A more positive P200 was found in the left hemisphere compared to
right, and in silence compared to noise.
Frontal LPC. A more positive LPC was found for crying compared to laughter. Cry
sounds presented in silence elicited a more positive LPC than when presented in noise.
Parietal LPC. The age x emotion x background noise x hemisphere interaction
showed that, for the 3- and 5-month-olds, in the right hemisphere with a silent
background, a more positive LPC was found for the laughter sounds than for the
cry sounds. The 3-month-olds showed a higher LPC in the left than in the right
hemisphere for the crying sounds with a silent background. Background noise
elicited a higher LPC amplitude than silence in the right hemisphere during cry
sounds for the 5- and 12-month-olds, and during laughter sounds for the
5-month-olds as well.
Correlation frontal P200. The only correlation found was a positive correlation
between the frontal P200 and Distress to Limitations.
Figure 1. Stimulus presentation.
Figure 3. Voltage topographies of the ERP components
over the scalp for laugh-noise (first row), cry-noise
(second row), laugh-silence (third row) and cry-silence (fourth row).
Figure 4. Grand average ERPs in frontal (A) and
parietal (B) locations, separately for noise (first
row) and silence (second row), for 3-, 5- and
12-month-olds.
ERP components. The processing of nonverbal emotional
vocalizations was associated with frontal (N100, P200 and
LPC) and parietal locations (LPC).
Temperament. Infants with a higher score on Distress to
Limitations were more likely to show a higher frontal P200
amplitude for laughter than for crying when no background
noise was played.
Background noise. In some conditions, an influence of
background noise was observed. Previous findings are
divided, so more research is needed to investigate this effect.
Altogether, this study suggests that 3-, 5- and 12-month-old
infants process nonverbal emotional vocalizations across
different ERP components, which supports the idea that this
processing consists of multiple steps (Crespo-Llado et
al., 2018). Further research focusing on each of the specific
age groups is recommended. Crying sounds seem to be
preferentially processed in the N100, P200 and frontal LPC,
whereas laughter sounds seem to be preferentially processed
in the parietal LPC.
Temperament seems to influence brain responses to
nonverbal emotional vocalizations, but more research with
a larger sample could help to investigate this relationship further.
Research focussing on the processing of nonverbal
emotional vocalizations is extremely important, as this is
suggested to be related to later social development. This
study adds knowledge to this important area of research.
Crespo-Llado, M.M., Vanderwert, R.E., & Geangu, E. (2018). Individual diﬀerences in infants’ neural responses to their peers’ cry and laughter. Biological Psychology, 135, 117-127.
Jarcho, J. M., Fox, N. A., Pine, D. S., Leibenluft, E., Shechner, T., Degnan, K. A., ... Ernst, M. (2014). Enduring inﬂuence of early temperament on neural mechanisms mediating attention-emotion conﬂict in adults. Depression and Anxiety, 31(1), 53–62.
Rosen, S., Souza, P., Ekelund, C., & Majeed, A.A. (2013). Listening to speech in a background of other talkers: Effects of talker number and noise vocoding. The Journal of the Acoustical Society of America, 133(4), 2431-2443.
Zhang, D., Zhou, Y., Hou, X., Cui, Y., & Zhou, C. (2017). Discrimination of emotional prosodies in human neonates: A pilot fNIRS study. Neuroscience Letters, 658, 62-66.